Science.gov

Sample records for investigators independently coded

  1. Independent peer review of nuclear safety computer codes

    SciTech Connect

    Boyack, B.E.; Jenks, R.P.

    1993-02-01

    A structured process of independent computer code peer review has been developed to assist the US Nuclear Regulatory Commission (NRC) and the US Department of Energy in their nuclear safety missions. This paper focuses on the process that evolved during recent reviews of NRC codes.

  2. The design of relatively machine-independent code generators

    NASA Technical Reports Server (NTRS)

    Noonan, R. E.

    1979-01-01

    Two complementary approaches were investigated. In the first approach software design techniques were used to design the structure of a code generator for Halmat. The major result was the development of an intermediate code form known as 7UP. The second approach viewed the problem as one in providing a tool to the code generator programmer. The major result was the development of a non-procedural, problem oriented language known as CGGL (Code Generator Generator Language).

  3. An Investigation of Different String Coding Methods.

    ERIC Educational Resources Information Center

    Goyal, Pankaj

    1984-01-01

    Investigates techniques for automatic coding of English language strings which involve titles drawn from bibliographic files, but do not require prior knowledge of source. Coding methods (basic, maximum entropy principle), results of test using 6,260 titles from British National Bibliography, and variations in code element ordering are…

  4. Investigation of Near Shannon Limit Coding Schemes

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.; Kim, J.; Mo, Fan

    1999-01-01

    Turbo codes can deliver performance that is very close to the Shannon limit. This report investigates algorithms for convolutional turbo codes and block turbo codes; both schemes can achieve performance near the Shannon limit. The performance of the schemes is obtained using computer simulations. The report has three sections. The first section is an introduction covering fundamental concepts of coding, block coding, and convolutional coding. The second section introduces the basic concepts of convolutional turbo codes and presents simulation results on their performance, especially for high-rate turbo codes. After reviewing the parameters that allow turbo codes to achieve such good performance, it concludes that the output weight distribution should be the main consideration in designing turbo codes. Based on the output weight distribution, performance bounds for turbo codes are given. The relationships between the output weight distribution and factors such as the generator polynomial, the interleaver, and the puncturing pattern are then examined, and a criterion for the best selection of system components is provided. The puncturing-pattern algorithm is discussed in detail, and different puncturing patterns are compared at each high rate. For most high-rate codes, the puncturing pattern has no significant effect on code performance if a pseudo-random interleaver is used in the system. For some special-rate codes with poor performance, an alternative puncturing algorithm is designed which restores their performance to near the Shannon limit. Finally, the third section discusses iterative decoding of block codes: the method of building a trellis for block codes, the structure of the iterative decoding system, and the calculation of extrinsic values.
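
    The puncturing idea the report examines can be sketched minimally: a rate-1/3 turbo encoder emits one systematic and two parity bits per input bit, and a puncturing pattern deletes some parity bits to raise the rate. The pattern below (keep all systematic bits, alternate the two parity streams) is illustrative, not taken from the report.

```python
# Sketch: puncturing a rate-1/3 turbo encoder output down to rate 1/2.
# The pattern keeps every systematic bit and alternates between the two
# parity streams, so 4 input bits yield 8 coded bits.

def puncture(systematic, parity1, parity2):
    """Interleave systematic bits with alternating parity bits (rate 1/2)."""
    out = []
    for i, s in enumerate(systematic):
        out.append(s)
        # Keep parity1 on even positions, parity2 on odd positions.
        out.append(parity1[i] if i % 2 == 0 else parity2[i])
    return out

sys_bits = [1, 0, 1, 1]
p1 = [0, 1, 1, 0]
p2 = [1, 1, 0, 0]
print(puncture(sys_bits, p1, p2))  # 8 bits out for 4 bits in -> rate 1/2
```

    The decoder must know the same pattern so it can insert erasures where parity bits were deleted.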

  5. Color coding information: assessing alternative coding systems using independent brightness and hue dimensions.

    PubMed

    Jameson, K A; Kaiwi, J L; Bamber, D

    2001-06-01

    Can independent dimensions of brightness and hue be used in a combined digital information code? This issue was addressed by developing 2 color-coding systems and testing them on informed and naive participants in signal beam detection and classification experiments for simulated sonar displays. Each coding system's results showed both groups efficiently used encoded information that varied simultaneously along the 2 dimensions of brightness and hue. Findings support the proposed procedures for developing color information codes and the validity of such information codes across different populations. Applied significance of these results is provided by the test of principled methods of color-code construction and the demonstration that extending the information content of user interfaces beyond 1 dimension is feasible in practice.

  6. Independent rate and temporal coding in hippocampal pyramidal cells

    PubMed Central

    Huxter, John; Burgess, Neil; O’Keefe, John

    2009-01-01

    Hippocampal pyramidal cells use temporal [1] as well as rate coding [2] to signal spatial aspects of the animal's environment or behaviour. The temporal code takes the form of a phase relationship to the concurrent cycle of the hippocampal EEG theta rhythm (Figure 1; [1]). These two codes could each represent a different variable [3,4]. However, this requires that rate and phase can vary independently, in contrast to recent suggestions [5,6] that they are tightly coupled, both reflecting the amplitude of the cell's input. Here we show that the time of firing and firing rate are dissociable and can represent two independent variables, viz. the animal's location within the place field and its speed of movement through the field, respectively. Independent encoding of location together with actions and stimuli occurring there may help to explain the dual roles of the hippocampus in spatial and episodic memory [7,8], or a more general role in relational/declarative memory [9,10]. PMID:14574410

  7. Benchmark testing and independent verification of the VS2DT computer code

    SciTech Connect

    McCord, J.T.; Goodrich, M.T.

    1994-11-01

    The finite difference flow and transport simulator VS2DT was benchmark tested against several other codes which solve the same equations (Richards equation for flow and the Advection-Dispersion equation for transport). The benchmark problems investigated transient two-dimensional flow in a heterogeneous soil profile with a localized water source at the ground surface. The VS2DT code performed as well as or better than all other codes when considering mass balance characteristics and computational speed. It was also rated highly relative to the other codes with regard to ease-of-use. Following the benchmark study, the code was verified against two analytical solutions, one for two-dimensional flow and one for two-dimensional transport. These independent verifications show reasonable agreement with the analytical solutions, and complement the one-dimensional verification problems published in the code's original documentation.

  8. Quantum image coding with a reference-frame-independent scheme

    NASA Astrophysics Data System (ADS)

    Chapeau-Blondeau, François; Belin, Etienne

    2016-07-01

    For binary images, or bit planes of non-binary images, we investigate the possibility of a quantum coding decodable by a receiver in the absence of reference frames shared with the emitter. Direct image coding with one qubit per pixel and non-aligned frames leads to decoding errors equivalent to a quantum bit-flip noise increasing with the misalignment. We show the feasibility of frame-invariant coding by using for each pixel a qubit pair prepared in one of two controlled entangled states. With just one common axis shared between the emitter and receiver, exact decoding for each pixel can be obtained by means of two two-outcome projective measurements operating separately on each qubit of the pair. With strictly no alignment information between the emitter and receiver, exact decoding can be obtained by means of a two-outcome projective measurement operating jointly on the qubit pair. In addition, the frame-invariant coding is shown to be much more resistant to quantum bit-flip noise compared to the direct non-invariant coding. For a cost per pixel of two (entangled) qubits instead of one, complete frame-invariant image coding and enhanced noise resistance are thus obtained.
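
    The underlying principle can be illustrated with a standard fact related to (but not identical to) the paper's two controlled entangled states: the two-qubit singlet state is unchanged, up to a global phase, when the same rotation acts on both qubits, which is what makes entangled-pair encodings readable without a shared reference frame. This numerical check is an illustration, not the paper's exact scheme.

```python
# Sketch: the singlet state (|01> - |10>)/sqrt(2) is invariant under the
# same frame rotation applied to both qubits of the pair.
import numpy as np

singlet = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2.0)

theta = 0.7                                       # arbitrary misalignment angle
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation of the local frame
rotated = np.kron(U, U) @ singlet                 # same rotation on both qubits

# Overlap magnitude 1 => physically the same state despite the misalignment.
print(round(abs(np.vdot(rotated, singlet)), 6))  # 1.0
```

    A single qubit, by contrast, would be rotated into a different state, which is the source of the bit-flip-like noise in the direct one-qubit-per-pixel coding.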

  9. Benchmark testing and independent verification of the VS2DT computer code

    NASA Astrophysics Data System (ADS)

    McCord, James T.; Goodrich, Michael T.

    1994-11-01

    The finite difference flow and transport simulator VS2DT was benchmark tested against several other codes which solve the same equations (Richards equation for flow and the Advection-Dispersion equation for transport). The benchmark problems investigated transient two-dimensional flow in a heterogeneous soil profile with a localized water source at the ground surface. The VS2DT code performed as well as or better than all other codes when considering mass balance characteristics and computational speed. It was also rated highly relative to the other codes with regard to ease-of-use. Following the benchmark study, the code was verified against two analytical solutions, one for two-dimensional flow and one for two-dimensional transport. These independent verifications show reasonable agreement with the analytical solutions, and complement the one-dimensional verification problems published in the code's original documentation.

  10. Independent Population Coding of Speech with Sub-Millisecond Precision

    PubMed Central

    Garcia-Lazaro, Jose A.; Belliveau, Lucile A. C.

    2013-01-01

    To understand the strategies used by the brain to analyze complex environments, we must first characterize how the features of sensory stimuli are encoded in the spiking of neuronal populations. Characterizing a population code requires identifying the temporal precision of spiking and the extent to which spiking is correlated, both between cells and over time. In this study, we characterize the population code for speech in the gerbil inferior colliculus (IC), the hub of the auditory system where inputs from parallel brainstem pathways are integrated for transmission to the cortex. We find that IC spike trains can carry information about speech with sub-millisecond precision, and, consequently, that the temporal correlations imposed by refractoriness can play a significant role in shaping spike patterns. We also find that, in contrast to most other brain areas, the noise correlations between IC cells are extremely weak, indicating that spiking in the population is conditionally independent. These results demonstrate that the problem of understanding the population coding of speech can be reduced to the problem of understanding the stimulus-driven spiking of individual cells, suggesting that a comprehensive model of the subcortical processing of speech may be attainable in the near future. PMID:24305831
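
    The noise-correlation measure central to this abstract can be sketched as follows: spike counts from repeated trials are mean-corrected per stimulus, and the residuals of two cells are correlated. The data shapes and Poisson surrogate below are illustrative assumptions, not the study's recordings.

```python
# Sketch: noise correlation between two cells is the trial-to-trial
# correlation of spike counts after the stimulus-driven mean is removed.
import numpy as np

def noise_correlation(counts_a, counts_b):
    """counts_*: (n_stimuli, n_trials) spike-count arrays for two cells."""
    # Subtract each cell's mean response to every stimulus, leaving only
    # trial-to-trial "noise" fluctuations.
    res_a = counts_a - counts_a.mean(axis=1, keepdims=True)
    res_b = counts_b - counts_b.mean(axis=1, keepdims=True)
    return np.corrcoef(res_a.ravel(), res_b.ravel())[0, 1]

# Two conditionally independent Poisson "cells": noise correlation near zero,
# the regime the study reports for IC cell pairs.
rng = np.random.default_rng(0)
a = rng.poisson(5.0, size=(10, 50))
b = rng.poisson(5.0, size=(10, 50))
print(round(noise_correlation(a, b), 3))
```

    Weak noise correlations, as in this surrogate, are what justify modeling the population as conditionally independent given the stimulus.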

  11. Genetic code deviations in the ciliates: evidence for multiple and independent events.

    PubMed Central

    Tourancheau, A B; Tsao, N; Klobutcher, L A; Pearlman, R E; Adoutte, A

    1995-01-01

    In several species of ciliates, the universal stop codons UAA and UAG are translated into glutamine, while in the euplotids, the glutamine codon usage is normal, but UGA appears to be translated as cysteine. Because the emerging position of this monophyletic group in the eukaryotic lineage is relatively late, this deviant genetic code represents a derived state of the universal code. The question is therefore raised as to how these changes arose within the evolutionary pathways of the phylum. Here, we have investigated the presence of stop codons in alpha-tubulin and/or phosphoglycerate kinase gene coding sequences from diverse species of ciliates scattered over the phylogenetic tree constructed from 28S rRNA sequences. In our data set, when deviations occur they correspond to in-frame UAA and UAG coding for glutamine. By combining these new data with those previously reported, we show that (i) utilization of UAA and UAG codons occurs to different extents between, but also within, the different classes of ciliates and (ii) the resulting phylogenetic pattern of deviations from the universal code cannot be accounted for by a scenario involving a single transition to the unusual code. Thus, contrary to expectations, deviations from the universal genetic code have arisen independently several times within the phylum. PMID:7621837
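
    The codon reassignment described here is easy to make concrete: under the deviant ciliate nuclear code, UAA and UAG are read as glutamine (Q) while UGA remains a stop. The translation table below is a deliberately minimal illustration (only the codons used in the example), not a complete genetic code.

```python
# Sketch: translating the same coding sequence under the universal code and
# under the deviant ciliate code, where TAA/TAG (UAA/UAG) encode glutamine.

UNIVERSAL = {"ATG": "M", "AAA": "K", "GGC": "G", "CAA": "Q",
             "TAA": "*", "TAG": "*", "TGA": "*"}          # '*' = stop
CILIATE = dict(UNIVERSAL, TAA="Q", TAG="Q")               # UGA is still a stop

def translate(seq, table):
    protein = []
    for i in range(0, len(seq) - 2, 3):
        aa = table[seq[i:i + 3]]
        if aa == "*":
            break
        protein.append(aa)
    return "".join(protein)

cds = "ATGAAATAAGGCTGA"
print(translate(cds, UNIVERSAL))  # 'MK'   -- TAA terminates translation
print(translate(cds, CILIATE))    # 'MKQG' -- TAA read as Q; TGA still stops
```

    The study's method amounts to looking for exactly such in-frame TAA/TAG codons inside conserved genes across the ciliate phylogeny.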

  12. Species independence of mutual information in coding and noncoding DNA

    NASA Astrophysics Data System (ADS)

    Grosse, Ivo; Herzel, Hanspeter; Buldyrev, Sergey V.; Stanley, H. Eugene

    2000-05-01

    We explore whether there exist universal statistical patterns that are different in coding and noncoding DNA and can be found in all living organisms, regardless of their phylogenetic origin. We find that (i) the mutual information function I has a significantly different functional form in coding and noncoding DNA. We further find that (ii) the probability distributions of the average mutual information I¯ are significantly different in coding and noncoding DNA, while (iii) they are almost the same for organisms of all taxonomic classes. Surprisingly, we find that I¯ is capable of predicting coding regions as accurately as organism-specific coding measures.
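
    The mutual information function used here measures the statistical dependence between nucleotides separated by a distance k. The estimator below follows the standard definition; the "coding-like" toy sequence (third codon position constrained) is an illustrative assumption, not the paper's genomic data.

```python
# Sketch: I(k) = sum over base pairs (a, b) of
#   p_ab(k) * log2( p_ab(k) / (p_a * p_b) ),
# where p_ab(k) is the joint frequency of bases k positions apart.
import math
import random
from collections import Counter

def mutual_information(seq, k):
    pairs = Counter((seq[i], seq[i + k]) for i in range(len(seq) - k))
    n = sum(pairs.values())
    mono = Counter(seq)
    m = len(seq)
    p = {a: c / m for a, c in mono.items()}
    return sum((c / n) * math.log2((c / n) / (p[a] * p[b]))
               for (a, b), c in pairs.items())

# Toy "coding-like" sequence: the third codon position is constrained
# (always G), producing stronger dependence at the codon period k = 3.
random.seed(1)
seq = "".join(random.choice("ACGT") + random.choice("ACGT") + "G"
              for _ in range(2000))
print(mutual_information(seq, 3) > mutual_information(seq, 1))  # True
```

    The period-3 signature of reading frames is the kind of pattern that gives I a different functional form in coding versus noncoding DNA.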

  13. Investigating the Simulink Auto-Coding Process

    NASA Technical Reports Server (NTRS)

    Gualdoni, Matthew J.

    2016-01-01

    …the program; additionally, this is lost time that could be spent testing and analyzing the code. This is one of the more prominent issues with the auto-coding process, and while much information is available with regard to optimizing Simulink designs to produce efficient and reliable C++ code, not much research has been made public on how to reduce the code generation time. It is of interest to develop some insight as to what causes code generation times to be so significant, and to determine whether there are architecture guidelines or a desirable auto-coding configuration set to assist in streamlining this step of the design process for particular applications. To address the issue at hand, the Simulink coder was studied at a foundational level. For each component type made available by the software, the features, auto-code generation time, and format of the generated code were analyzed and documented. Tools were developed and documented to expedite these studies, particularly in the area of automating sequential builds to ensure accurate data were obtained. Next, the Ramses model was examined in an attempt to determine its composition and the types of technologies used in the model. This enabled the development of a model that uses similar technologies but takes a fraction of the time to auto-code, reducing the turnaround time for experimentation. Lastly, the model was used to run a wide array of experiments and collect data about where to search for bottlenecks in the Ramses model. The resulting contributions of the overall effort consist of an experimental model for further investigation into the subject, several automation tools to assist in analyzing the model, and a reference document offering insight into the auto-coding process, including documentation of the tools used in the model analysis, data illustrating some potential problem areas in the auto-coding process, and recommendations on areas or practices in the current…

  14. Implementation of context independent code on a new array processor: The Super-65

    NASA Technical Reports Server (NTRS)

    Colbert, R. O.; Bowhill, S. A.

    1981-01-01

    The feasibility of rewriting standard uniprocessor programs into code which contains no context-dependent branches is explored. Context independent code (CIC) would contain no branches that might require different processing elements to branch different ways. In order to investigate the possibilities and restrictions of CIC, several programs were recoded into CIC and a four-element array processor was built. This processor (the Super-65) consisted of three 6502 microprocessors and the Apple II microcomputer. The results obtained were somewhat dependent upon the specific architecture of the Super-65 but within bounds, the throughput of the array processor was found to increase linearly with the number of processing elements (PEs). The slope of throughput versus PEs is highly dependent on the program and varied from 0.33 to 1.00 for the sample programs.
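
    The recoding idea behind context-independent code can be sketched in a few lines: a data-dependent branch is replaced by straight-line arithmetic on 0/1 predicates, so every processing element executes the identical instruction stream. This is an illustration of the principle, not the Super-65's actual 6502 code.

```python
# Sketch: the same clamp computed with branches and as branch-free
# straight-line code, in the spirit of context-independent code (CIC).

def clamp_branching(x, lo, hi):
    # Data-dependent branches: different processing elements would need
    # to follow different instruction paths for different data.
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x

def clamp_branchless(x, lo, hi):
    # Straight-line arithmetic on 0/1 predicates: every element runs the
    # identical instruction sequence regardless of its data.
    below = x < lo
    above = x > hi
    inside = not (below or above)
    return lo * below + hi * above + x * inside

for x in (-7, 0, 5, 10, 99):
    assert clamp_branching(x, 0, 10) == clamp_branchless(x, 0, 10)
print("branchless recoding agrees")
```

    The cost of such recoding is that every element evaluates all cases, which is one reason the measured throughput slope per processing element varied with the program.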

  15. Two independent transcription initiation codes overlap on vertebrate core promoters

    NASA Astrophysics Data System (ADS)

    Haberle, Vanja; Li, Nan; Hadzhiev, Yavor; Plessy, Charles; Previti, Christopher; Nepal, Chirag; Gehrig, Jochen; Dong, Xianjun; Akalin, Altuna; Suzuki, Ana Maria; van Ijcken, Wilfred F. J.; Armant, Olivier; Ferg, Marco; Strähle, Uwe; Carninci, Piero; Müller, Ferenc; Lenhard, Boris

    2014-03-01

    A core promoter is a stretch of DNA surrounding the transcription start site (TSS) that integrates regulatory inputs and recruits general transcription factors to initiate transcription. The nature and causative relationship of the DNA sequence and chromatin signals that govern the selection of most TSSs by RNA polymerase II remain unresolved. Maternal to zygotic transition represents the most marked change of the transcriptome repertoire in the vertebrate life cycle. Early embryonic development in zebrafish is characterized by a series of transcriptionally silent cell cycles regulated by inherited maternal gene products: zygotic genome activation commences at the tenth cell cycle, marking the mid-blastula transition. This transition provides a unique opportunity to study the rules of TSS selection and the hierarchy of events linking transcription initiation with key chromatin modifications. We analysed TSS usage during zebrafish early embryonic development at high resolution using cap analysis of gene expression, and determined the positions of H3K4me3-marked promoter-associated nucleosomes. Here we show that the transition from the maternal to zygotic transcriptome is characterized by a switch between two fundamentally different modes of defining transcription initiation, which drive the dynamic change of TSS usage and promoter shape. A maternal-specific TSS selection, which requires an A/T-rich (W-box) motif, is replaced with a zygotic TSS selection grammar characterized by broader patterns of dinucleotide enrichments, precisely aligned with the first downstream (+1) nucleosome. The developmental dynamics of the H3K4me3-marked nucleosomes reveal their DNA-sequence-associated positioning at promoters before zygotic transcription and subsequent transcription-independent adjustment to the final position downstream of the zygotic TSS. The two TSS-defining grammars coexist, often physically overlapping, in core promoters of constitutively expressed genes to enable…

  16. Pcigale: Porting Code Investigating Galaxy Emission to Python

    NASA Astrophysics Data System (ADS)

    Roehlly, Y.; Burgarella, D.; Buat, V.; Boquien, M.; Ciesla, L.; Heinis, S.

    2014-05-01

    We present pcigale, the port to Python of CIGALE (Code Investigating Galaxy Emission), a Fortran spectral energy distribution (SED) fitting code developed at the Laboratoire d'Astrophysique de Marseille. After recalling the specifics of the SED fitting method, we show the gains in modularity and versatility offered by Python, as well as the drawbacks compared to the compiled code.

  17. A coding-independent function of gene and pseudogene mRNAs regulates tumour biology

    PubMed Central

    Poliseno, Laura; Salmena, Leonardo; Zhang, Jiangwen; Carver, Brett; Haveman, William J.; Pandolfi, Pier Paolo

    2011-01-01

    The canonical role of messenger RNA (mRNA) is to deliver protein-coding information to sites of protein synthesis. However, given that microRNAs bind to RNAs, we hypothesized that RNAs possess a biological role in cancer cells that relies upon their ability to compete for microRNA binding and is independent of their protein-coding function. As a paradigm for the protein-coding-independent role of RNAs, we describe the functional relationship between the mRNAs produced by the PTEN tumour suppressor gene and its pseudogene (PTENP1) and the critical consequences of this interaction. We find that PTENP1 is biologically active as determined by its ability to regulate cellular levels of PTEN, and that it can exert a growth-suppressive role. We also show that PTENP1 locus is selectively lost in human cancer. We extend our analysis to other cancer-related genes that possess pseudogenes, such as oncogenic KRAS. Further, we demonstrate that the transcripts of protein coding genes such as PTEN are also biologically active. Together, these findings attribute a novel biological role to expressed pseudogenes, as they can regulate coding gene expression, and reveal a non-coding function for mRNAs. PMID:20577206

  18. An investigation of error characteristics and coding performance

    NASA Technical Reports Server (NTRS)

    Ebel, William J.; Ingels, Frank M.

    1993-01-01

    The first year's effort on NASA Grant NAG5-2006 was an investigation to characterize typical errors resulting from the EOS downlink. The analysis methods developed for this effort were used on test data from a March 1992 White Sands Terminal Test. The effectiveness of a concatenated coding scheme of a Reed-Solomon outer code and a convolutional inner code versus a Reed-Solomon-only code scheme has been investigated, as well as the effectiveness of a Periodic Convolutional Interleaver in dispersing errors of certain types. The work effort consisted of development of software that allows simulation studies with the appropriate coding schemes plus either simulated data with errors or actual data with errors. The software program is entitled Communication Link Error Analysis (CLEAN) and models downlink errors, forward error correcting schemes, and interleavers.
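
    The error-dispersing role of an interleaver can be sketched with a simple block interleaver (the grant's Periodic Convolutional Interleaver is not reproduced here, but the principle is the same): symbols are written row-wise into a matrix and read out column-wise, so a burst of consecutive channel errors lands on widely separated positions after deinterleaving, where a Reed-Solomon decoder sees them as scattered single-symbol errors.

```python
# Sketch: a rows x cols block interleaver that disperses a burst of
# channel errors across the deinterleaved stream.

def interleave(data, rows, cols):
    assert len(data) == rows * cols
    # Write row-wise, read column-wise.
    return [data[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(data, rows, cols):
    assert len(data) == rows * cols
    # Inverse permutation: write column-wise, read row-wise.
    return [data[c * rows + r] for r in range(rows) for c in range(cols)]

msg = list(range(12))
tx = interleave(msg, rows=3, cols=4)
tx[0:3] = ["X", "X", "X"]              # burst of 3 consecutive channel errors
rx = deinterleave(tx, rows=3, cols=4)
print(rx)  # the burst lands at scattered positions 0, 4, 8
```

    A convolutional interleaver achieves a similar dispersion with less memory and delay, which is why it was studied for the downlink.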

  19. Independent verification and validation testing of the FLASH computer code, Version 3.0

    SciTech Connect

    Martian, P.; Chung, J.N. (Dept. of Mechanical and Materials Engineering)

    1992-06-01

    Independent testing of the FLASH computer code, Version 3.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to test correctness of the FORTRAN coding, computational accuracy, and suitability for simulating actual hydrologic conditions. The testing was performed using a structured evaluation protocol consisting of blind testing, independent applications, and graduated difficulty of test cases. Both quantitative and qualitative testing were performed through evaluating relative root mean square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and correctness of the FORTRAN coding, and three validation tests were used to check the suitability for simulating actual conditions. These test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions. The validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. All aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies.

  1. Signal-independent timescale analysis (SITA) and its application for neural coding during reaching and walking.

    PubMed

    Zacksenhouse, Miriam; Lebedev, Mikhail A; Nicolelis, Miguel A L

    2014-01-01

    What are the relevant timescales of neural encoding in the brain? This question is commonly investigated with respect to well-defined stimuli or actions. However, neurons often encode multiple signals, including hidden or internal, which are not experimentally controlled, and thus excluded from such analysis. Here we consider all rate modulations as the signal, and define the rate-modulations signal-to-noise ratio (RM-SNR) as the ratio between the variance of the rate and the variance of the neuronal noise. As the bin-width increases, RM-SNR increases while the update rate decreases. This tradeoff is captured by the ratio of RM-SNR to bin-width, and its variations with the bin-width reveal the timescales of neural activity. Theoretical analysis and simulations elucidate how the interactions between the recovery properties of the unit and the spectral content of the encoded signals shape this ratio and determine the timescales of neural coding. The resulting signal-independent timescale analysis (SITA) is applied to investigate timescales of neural activity recorded from the motor cortex of monkeys during: (i) reaching experiments with Brain-Machine Interface (BMI), and (ii) locomotion experiments at different speeds. Interestingly, the timescales during BMI experiments did not change significantly with the control mode or training. During locomotion, the analysis identified units whose timescale varied consistently with the experimentally controlled speed of walking, though the specific timescale reflected also the recovery properties of the unit. Thus, the proposed method, SITA, characterizes the timescales of neural encoding and how they are affected by the motor task, while accounting for all rate modulations.
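
    The abstract's central quantity can be sketched from its definitions: bin a spike train at several bin widths, estimate the rate-modulation variance as the excess of the count variance over a noise floor, and examine the ratio of RM-SNR to bin width. The Poisson noise approximation and the specific bin widths below are illustrative assumptions, not the paper's exact estimator.

```python
# Sketch (assumed formulation): RM-SNR = Var(rate modulations) / Var(noise),
# with noise variance approximated as Poisson (variance = mean count); the
# quantity of interest is the ratio RM-SNR / bin_width across bin widths.
import numpy as np

def rm_snr_ratio(spike_times, duration, bin_width):
    edges = np.arange(0.0, duration + bin_width, bin_width)
    counts, _ = np.histogram(spike_times, bins=edges)
    noise_var = counts.mean()                      # Poisson approximation
    signal_var = max(counts.var() - noise_var, 0.0)
    return (signal_var / noise_var) / bin_width if noise_var > 0 else 0.0

# Surrogate spike train with a 1 Hz sinusoidal rate modulation,
# generated by thinning a 1 kHz time grid.
rng = np.random.default_rng(0)
t = np.arange(0.0, 20.0, 0.001)
rate = 20.0 * (1.0 + np.sin(2.0 * np.pi * t))      # spikes/s
spikes = t[rng.random(t.size) < rate * 0.001]
for w in (0.01, 0.1, 0.5):
    print(w, round(rm_snr_ratio(spikes, 20.0, w), 2))
```

    Sweeping the bin width and locating where this ratio peaks is, in spirit, how the analysis identifies the timescales of neural encoding without reference to an experimentally controlled signal.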

  2. An investigation of dehazing effects on image and video coding.

    PubMed

    Gibson, Kristofor B; Võ, Dung T; Nguyen, Truong Q

    2012-02-01

    This paper investigates the effects of dehazing on image and video coding for surveillance systems. The goal is to achieve good dehazed images and videos at the receiver while sustaining low bitrates (using compression) in the transmission pipeline. First, the paper proposes a novel method for single-image dehazing, which is used for the investigation. It operates at a faster speed than current methods and can avoid halo effects by using the median operation. We then consider the dehazing effects in compression by investigating the coding artifacts and motion estimation in cases of applying any dehazing method before or after compression. We conclude that better dehazing performance with fewer artifacts and better coding efficiency is achieved when the dehazing is applied before compression. Simulations for Joint Photographic Experts Group (JPEG) images in addition to subjective and objective tests with H.264 compressed sequences validate our conclusion. PMID:21896391

  3. Error tolerance of topological codes with independent bit-flip and measurement errors

    NASA Astrophysics Data System (ADS)

    Andrist, Ruben S.; Katzgraber, Helmut G.; Bombin, H.; Martin-Delgado, M. A.

    2016-07-01

    Topological quantum error correction codes are currently among the most promising candidates for efficiently dealing with the decoherence effects inherently present in quantum devices. Numerically, their theoretical error threshold can be calculated by mapping the underlying quantum problem to a related classical statistical-mechanical spin system with quenched disorder. Here, we present results for the general fault-tolerant regime, where we consider both qubit and measurement errors. However, unlike in previous studies, here we vary the strength of the different error sources independently. Our results highlight peculiar differences between toric and color codes. This study complements previous results published in New J. Phys. 13, 083006 (2011), 10.1088/1367-2630/13/8/083006.

  4. Proof-of-principle experiment of reference-frame-independent quantum key distribution with phase coding

    PubMed Central

    Liang, Wen-Ye; Wang, Shuang; Li, Hong-Wei; Yin, Zhen-Qiang; Chen, Wei; Yao, Yao; Huang, Jing-Zheng; Guo, Guang-Can; Han, Zheng-Fu

    2014-01-01

    We have demonstrated a proof-of-principle experiment of reference-frame-independent phase coding quantum key distribution (RFI-QKD) over an 80-km optical fiber. After considering the finite-key bound, we still achieve a distance of 50 km. In this scenario, the phases of the basis states are related by a slowly time-varying transformation. Furthermore, we developed and realized a new decoy state method for RFI-QKD systems with weak coherent sources to counteract the photon-number-splitting attack. With the help of a reference-frame-independent protocol and a Michelson interferometer with Faraday rotator mirrors, our system is rendered immune to the slow phase changes of the interferometer and the polarization disturbances of the channel, making the procedure very robust. PMID:24402550

  5. Streptococcus salivarius ATCC 25975 possesses at least two genes coding for primer-independent glucosyltransferases.

    PubMed Central

    Simpson, C L; Giffard, P M; Jacques, N A

    1995-01-01

    Fractionation of the culture medium showed that Streptococcus salivarius ATCC 25975 secreted a glucosyltransferase (Gtf) that was primer independent. On the basis of this observation, a gene library of S. salivarius chromosomal DNA cloned into lambda L47.1 was screened for a gene(s) coding for such an activity. As a result of this screening process, two new gtf genes, gtfL and gtfM, both of which coded for primer-independent Gtf activities, were isolated. GtfL produced an insoluble glucan that was refractory to digestion by the endo-(1-->6)-alpha-D-glucanase of Chaetomium gracile, while GtfM produced a soluble glucan that was readily degraded by the glucanase. Comparison of the deduced amino acid sequences of gtfL and gtfM with 10 other available Gtf sequences allowed the relatedness of the conserved catalytic regions to be assessed. This analysis showed that the 12 enzymes did not form clusters based on their primer dependencies or on their product solubilities. Further analysis of the YG repeats in the C-terminal glucan-binding domains of GtfJ, GtfK, GtfL, and GtfM from S. salivarius showed that there was strong homology between a block of contiguous triplet YG repeats present in the four alleles. These blocks of YG repeats were coded for by a region of each gene that appeared to have arisen as a result of a recent duplication event(s). PMID:7822030

  6. The investigation of bandwidth efficient coding and modulation techniques

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The New Mexico State University Center for Space Telemetering and Telecommunications Systems has been, and is currently, engaged in the investigation of trellis-coded modulation (TCM) communication systems. In particular, TCM utilizing M-ary phase shift keying (MPSK) is being studied. The study of carrier synchronization in a TCM environment, or in MPSK systems in general, has been one of the two main thrusts of this grant. This study has involved both theoretical modelling and software simulation of the carrier synchronization problem.

  7. Investigating Lossy Image Coding Using the PLHaar Transform

    SciTech Connect

    Senecal, J G; Lindstrom, P; Duchaineau, M A; Joy, K I

    2004-11-16

    We developed the Piecewise-Linear Haar (PLHaar) transform, an integer wavelet-like transform. PLHaar does not have dynamic range expansion, i.e., it is an n-bit to n-bit transform. To our knowledge PLHaar is the only reversible n-bit to n-bit transform that is suitable for both lossy and lossless coding. We are investigating PLHaar's use in lossy image coding. Preliminary results from thresholding transform coefficients show that PLHaar does not produce the objectionable artifacts of prior n-bit to n-bit transforms, such as that of Chao et al. (CFH). Also, at lower bitrates PLHaar images have increased contrast. For a given set of CFH and PLHaar coefficients with equal entropy, the PLHaar reconstruction is more appealing, although its PSNR may be lower.
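The "no dynamic range expansion" property is easiest to appreciate against the conventional reversible integer Haar transform (the S-transform). The abstract does not give PLHaar's definition, so the sketch below shows only the baseline behavior PLHaar is designed to avoid: the difference band of the S-transform needs one extra bit beyond the n-bit input range.

```python
def s_transform_pair(a, b):
    """Reversible integer Haar (S-transform) on one sample pair.
    For n-bit inputs, the difference d spans [-(2**n - 1), 2**n - 1],
    i.e. it needs n+1 bits -- the dynamic range expansion that an
    n-bit to n-bit transform such as PLHaar avoids."""
    d = a - b                # difference: range doubles
    s = b + (d >> 1)         # floor-rounded mean: stays within n bits
    return s, d

def s_transform_pair_inv(s, d):
    """Exact integer inverse of s_transform_pair."""
    b = s - (d >> 1)
    a = b + d
    return a, b
```

For 8-bit samples, `s_transform_pair(0, 255)` yields a difference of -255, which cannot be stored in 8 bits; this is precisely the overflow problem that motivates n-bit to n-bit designs.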

  8. Independent assessment of TRAC and RELAP5 codes through separate effects tests

    SciTech Connect

    Saha, P.; Rohatgi, U.S.; Jo, J.H.; Neymotin, L.; Slovik, G.; Yuelys-Miksis, C.; Pu, J.

    1983-01-01

    Independent assessment of TRAC-PF1 (Version 7.0), TRAC-BD1 (Version 12.0) and RELAP5/MOD1 (Cycle 14) that was initiated at BNL in FY 1982, has been completed in FY 1983. As in the previous years, emphasis at Brookhaven has been in simulating various separate-effects tests with these advanced codes and identifying the areas where further thermal-hydraulic modeling improvements are needed. The following six categories of tests were simulated with the above codes: (1) critical flow tests (Moby-Dick nitrogen-water, BNL flashing flow, Marviken Test 24); (2) Counter-Current Flow Limiting (CCFL) tests (University of Houston, Dartmouth College single and parallel tube test); (3) level swell tests (G.E. large vessel test); (4) steam generator tests (B and W 19-tube model S.G. tests, FLECHT-SEASET U-tube S.G. tests); (5) natural circulation tests (FRIGG loop tests); and (6) post-CHF tests (Oak Ridge steady-state test).

  9. Independent code assessment at BNL in FY 1982. [TRAC-PF1; RELAP5/MOD1; TRAC-BD1

    SciTech Connect

    Saha, P.; Rohatgi, U.S.; Jo, J.H.; Neymotin, L.; Slovik, G.; Yuelys-Miksis, C.

    1982-01-01

    Independent assessment of the advanced codes such as TRAC and RELAP5 has continued at BNL through the Fiscal Year 1982. The simulation tests can be grouped into the following five categories: critical flow, counter-current flow limiting (CCFL) or flooding, level swell, steam generator thermal performance, and natural circulation. TRAC-PF1 (Version 7.0) and RELAP5/MOD1 (Cycle 14) codes were assessed by simulating all of the above experiments, whereas the TRAC-BD1 (Version 12.0) code was applied only to the CCFL tests. Results and conclusions of the BNL code assessment activity of FY 1982 are summarized below.

  10. Board Governance of Independent Schools: A Framework for Investigation

    ERIC Educational Resources Information Center

    McCormick, John; Barnett, Kerry; Alavi, Seyyed Babak; Newcombe, Geoffrey

    2006-01-01

    Purpose: This paper develops a theoretical framework to guide future inquiry into board governance of independent schools. Design/methodology/approach: The authors' approach is to integrate literatures related to corporate and educational boards, motivation, leadership and group processes that are appropriate for conceptualizing independent school…

  11. Independence.

    ERIC Educational Resources Information Center

    Stephenson, Margaret E.

    2000-01-01

    Discusses the four planes of development and the periods of creation and crystallization within each plane. Identifies the type of independence that should be achieved by the end of the first two planes of development. Maintains that it is through individual work on the environment that one achieves independence. (KB)

  12. An investigation of error characteristics and coding performance

    NASA Technical Reports Server (NTRS)

    Ebel, William J.; Ingels, Frank M.

    1992-01-01

    The performance of forward error correcting coding schemes on errors anticipated for the Earth Observation System (EOS) Ku-band downlink are studied. The EOS transmits picture frame data to the ground via the Tracking and Data Relay Satellite System (TDRSS) to a ground-based receiver at White Sands. Due to unintentional RF interference from other systems operating in the Ku band, the noise at the receiver is non-Gaussian, which may result in non-random errors output by the demodulator. That is, the downlink channel cannot be modeled by a simple memoryless Gaussian-noise channel. From previous experience, it is believed that these errors are bursty. The research proceeded by developing a computer based simulation, called Communication Link Error ANalysis (CLEAN), to model the downlink errors, forward error correcting schemes, and interleavers used with TDRSS. To date, the bulk of CLEAN has been written, documented, debugged, and verified. The procedures for utilizing CLEAN to investigate code performance were established and are discussed.
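The interleavers CLEAN models address exactly the bursty errors described above: a block interleaver permutes symbols so that a contiguous channel burst becomes isolated errors, spaced far enough apart for a forward error correcting code to handle. A minimal sketch of the idea (not CLEAN's actual implementation):

```python
def interleave(symbols, rows, cols):
    """Block interleaver: write row-by-row, read column-by-column."""
    assert len(symbols) == rows * cols
    return [symbols[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(symbols, rows, cols):
    """Exact inverse of interleave."""
    assert len(symbols) == rows * cols
    return [symbols[c * rows + r] for r in range(rows) for c in range(cols)]
```

A burst of up to `rows` consecutive channel errors on the interleaved stream maps back, after deinterleaving, to single errors spaced `cols` positions apart, which a code correcting scattered errors can then fix.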

  13. High performance computing aspects of a dimension independent semi-Lagrangian discontinuous Galerkin code

    NASA Astrophysics Data System (ADS)

    Einkemmer, Lukas

    2016-05-01

    The recently developed semi-Lagrangian discontinuous Galerkin approach is used to discretize hyperbolic partial differential equations (usually first order equations). Since these methods are conservative, local in space, and able to limit numerical diffusion, they are considered a promising alternative to more traditional semi-Lagrangian schemes (which are usually based on polynomial or spline interpolation). In this paper, we consider a parallel implementation of a semi-Lagrangian discontinuous Galerkin method for distributed memory systems (so-called clusters). Both strong and weak scaling studies are performed on the Vienna Scientific Cluster 2 (VSC-2). In the case of weak scaling we observe a parallel efficiency above 0.8 for both two and four dimensional problems and up to 8192 cores. Strong scaling results show good scalability to at least 512 cores (we consider problems that can be run on a single processor in reasonable time). In addition, we study the scaling of a two dimensional Vlasov-Poisson solver that is implemented using the framework provided. All of the simulations are conducted in the context of worst case communication overhead; i.e., in a setting where the CFL (Courant-Friedrichs-Lewy) number increases linearly with the problem size. The framework introduced in this paper facilitates a dimension independent implementation of scientific codes (based on C++ templates) using both an MPI and a hybrid approach to parallelization. We describe the essential ingredients of our implementation.
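The weak and strong scaling figures quoted above follow from the standard parallel-efficiency definitions, sketched here (the timing values in the usage note are hypothetical, not the paper's measurements):

```python
def strong_scaling_efficiency(t1, tn, n):
    """Fixed total problem size across n cores.
    Ideal scaling means tn == t1 / n, giving efficiency 1.0."""
    return t1 / (n * tn)

def weak_scaling_efficiency(t1, tn):
    """Problem size grows proportionally with the core count.
    Ideal scaling means the runtime stays constant (tn == t1)."""
    return t1 / tn
```

For example, a hypothetical run taking 10 s on one core and 12.5 s on 8192 cores with a proportionally larger problem has a weak scaling efficiency of 0.8, matching the threshold the abstract reports exceeding.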

  14. A coding-independent function of an alternative Ube3a transcript during neuronal development.

    PubMed

    Valluy, Jeremy; Bicker, Silvia; Aksoy-Aksel, Ayla; Lackinger, Martin; Sumer, Simon; Fiore, Roberto; Wüst, Tatjana; Seffer, Dominik; Metge, Franziska; Dieterich, Christoph; Wöhr, Markus; Schwarting, Rainer; Schratt, Gerhard

    2015-05-01

    The E3 ubiquitin ligase Ube3a is an important regulator of activity-dependent synapse development and plasticity. Ube3a mutations cause Angelman syndrome and have been associated with autism spectrum disorders (ASD). However, the biological significance of alternative Ube3a transcripts generated in mammalian neurons remains unknown. We report here that Ube3a1 RNA, a transcript that encodes a truncated Ube3a protein lacking catalytic activity, prevents exuberant dendrite growth and promotes spine maturation in rat hippocampal neurons. Surprisingly, Ube3a1 RNA function was independent of its coding sequence but instead required a unique 3' untranslated region and an intact microRNA pathway. Ube3a1 RNA knockdown increased activity of the plasticity-regulating miR-134, suggesting that Ube3a1 RNA acts as a dendritic competing endogenous RNA. Accordingly, the dendrite-growth-promoting effect of Ube3a1 RNA knockdown in vivo is abolished in mice lacking miR-134. Taken together, our results define a noncoding function of an alternative Ube3a transcript in dendritic protein synthesis, with potential implications for Angelman syndrome and ASD. PMID:25867122

  15. Academic Integrity in Honor Code and Non-Honor Code Environments: A Qualitative Investigation.

    ERIC Educational Resources Information Center

    McCabe, Donald; Trevino, Linda Klebe; Butterfield, Kenneth D.

    1999-01-01

    Survey data from 4,285 students in 31 colleges and universities indicates students at schools with academic honor codes view the issue of academic integrity in a fundamentally different way than do students at non-honor code institutions. This difference seems to stem from the presence of an honor code and its influence on the way students think…

  16. Characterization and validation of a Monte Carlo code for independent dose calculation in proton therapy treatments with pencil beam scanning

    NASA Astrophysics Data System (ADS)

    Fracchiolla, F.; Lorentini, S.; Widesott, L.; Schwarz, M.

    2015-11-01

    We propose a method of creating and validating a Monte Carlo (MC) model of a proton Pencil Beam Scanning (PBS) machine using only commissioning measurements and avoiding the nozzle modeling. Measurements with a scintillating screen coupled with a CCD camera, ionization chamber and a Faraday Cup were used to model the beam in TOPAS without using any machine parameter information but the virtual source distance from the isocenter. Then the model was validated on simple Spread Out Bragg Peaks (SOBP) delivered in water phantom and with six realistic clinical plans (many involving 3 or more fields) on an anthropomorphic phantom. In particular the behavior of the moveable Range Shifter (RS) feature was investigated and its modeling has been proposed. The gamma analysis (3%,3 mm) was used to compare MC, TPS (XiO-ELEKTA) and measured 2D dose distributions (using radiochromic film). The MC modeling proposed here shows good results in the validation phase, both for simple irradiation geometry (SOBP in water) and for modulated treatment fields (on anthropomorphic phantoms). In particular head lesions were investigated and both MC and TPS data were compared with measurements. Treatment plans with no RS always showed a very good agreement with both of them (γ-Passing Rate (PR) > 95%). Treatment plans in which the RS was needed were also tested and validated. For these treatment plans MC results showed better agreement with measurements (γ-PR > 93%) than the one coming from TPS (γ-PR < 88%). This work shows how to simplify the MC modeling of a PBS machine for proton therapy treatments without accounting for any hardware components and proposes a more reliable RS modeling than the one implemented in our TPS. The validation process has shown how this code is a valid candidate for a completely independent treatment plan dose calculation algorithm. This makes the code an important future tool for the patient specific QA verification process.
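The gamma analysis (3%, 3 mm) used for these comparisons is the standard gamma index of dose-distribution comparison: each reference point passes if some evaluated point lies within a combined dose-difference/distance-to-agreement ellipsoid. A one-dimensional sketch of the textbook global gamma (not the authors' specific implementation) follows:

```python
import numpy as np

def gamma_index_1d(ref_pos, ref_dose, eval_pos, eval_dose,
                   dose_tol=0.03, dist_tol=3.0):
    """Global 1D gamma index.  dose_tol is a fraction of the reference
    maximum (3% here); dist_tol is in mm (3 mm here).  Returns one
    gamma value per reference point; a point passes if gamma <= 1."""
    dd = dose_tol * ref_dose.max()          # global dose normalization
    gammas = []
    for p, d in zip(ref_pos, ref_dose):
        g = np.sqrt(((eval_pos - p) / dist_tol) ** 2
                    + ((eval_dose - d) / dd) ** 2)
        gammas.append(g.min())              # best agreement over all points
    return np.array(gammas)
```

The γ-passing rate quoted in the abstract is then simply the fraction of reference points with gamma at most 1.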

  17. Characterization and validation of a Monte Carlo code for independent dose calculation in proton therapy treatments with pencil beam scanning.

    PubMed

    Fracchiolla, F; Lorentini, S; Widesott, L; Schwarz, M

    2015-11-01

    We propose a method of creating and validating a Monte Carlo (MC) model of a proton Pencil Beam Scanning (PBS) machine using only commissioning measurements and avoiding the nozzle modeling. Measurements with a scintillating screen coupled with a CCD camera, ionization chamber and a Faraday Cup were used to model the beam in TOPAS without using any machine parameter information but the virtual source distance from the isocenter. Then the model was validated on simple Spread Out Bragg Peaks (SOBP) delivered in water phantom and with six realistic clinical plans (many involving 3 or more fields) on an anthropomorphic phantom. In particular the behavior of the moveable Range Shifter (RS) feature was investigated and its modeling has been proposed. The gamma analysis (3%,3 mm) was used to compare MC, TPS (XiO-ELEKTA) and measured 2D dose distributions (using radiochromic film). The MC modeling proposed here shows good results in the validation phase, both for simple irradiation geometry (SOBP in water) and for modulated treatment fields (on anthropomorphic phantoms). In particular head lesions were investigated and both MC and TPS data were compared with measurements. Treatment plans with no RS always showed a very good agreement with both of them (γ-Passing Rate (PR)  >  95%). Treatment plans in which the RS was needed were also tested and validated. For these treatment plans MC results showed better agreement with measurements (γ-PR  >  93%) than the one coming from TPS (γ-PR  <  88%). This work shows how to simplify the MC modeling of a PBS machine for proton therapy treatments without accounting for any hardware components and proposes a more reliable RS modeling than the one implemented in our TPS. The validation process has shown how this code is a valid candidate for a completely independent treatment plan dose calculation algorithm. This makes the code an important future tool for the patient specific QA verification process.

  18. Norepinephrine Modulates Coding of Complex Vocalizations in the Songbird Auditory Cortex Independent of Local Neuroestrogen Synthesis

    PubMed Central

    Ikeda, Maaya Z.; Jeon, Sung David; Cowell, Rosemary A.

    2015-01-01

    The catecholamine norepinephrine plays a significant role in auditory processing. Most studies to date have examined the effects of norepinephrine on the neuronal response to relatively simple stimuli, such as tones and calls. It is less clear how norepinephrine shapes the detection of complex syntactical sounds, as well as the coding properties of sensory neurons. Songbirds provide an opportunity to understand how auditory neurons encode complex, learned vocalizations, and the potential role of norepinephrine in modulating the neuronal computations for acoustic communication. Here, we infused norepinephrine into the zebra finch auditory cortex and performed extracellular recordings to study the modulation of song representations in single neurons. Consistent with its proposed role in enhancing signal detection, norepinephrine decreased spontaneous activity and firing during stimuli, yet it significantly enhanced the auditory signal-to-noise ratio. These effects were all mimicked by clonidine, an α-2 receptor agonist. Moreover, a pattern classifier analysis indicated that norepinephrine enhanced the ability of single neurons to accurately encode complex auditory stimuli. Because neuroestrogens are also known to enhance auditory processing in the songbird brain, we tested the hypothesis that norepinephrine actions depend on local estrogen synthesis. Neither norepinephrine nor adrenergic receptor antagonist infusion into the auditory cortex had detectable effects on local estradiol levels. Moreover, pretreatment with fadrozole, a specific aromatase inhibitor, did not block norepinephrine's neuromodulatory effects. Together, these findings indicate that norepinephrine enhances signal detection and information encoding for complex auditory stimuli by suppressing spontaneous “noise” activity and that these actions are independent of local neuroestrogen synthesis. PMID:26109659

  19. Investigation of Navier-Stokes code verification and design optimization

    NASA Astrophysics Data System (ADS)

    Vaidyanathan, Rajkumar

    With rapid progress made in employing computational techniques for various complex Navier-Stokes fluid flow problems, design optimization problems traditionally based on empirical formulations and experiments are now being addressed with the aid of computational fluid dynamics (CFD). To be able to carry out an effective CFD-based optimization study, it is essential that the uncertainty and appropriate confidence limits of the CFD solutions be quantified over the chosen design space. The present dissertation investigates the issues related to code verification, surrogate model-based optimization and sensitivity evaluation. For Navier-Stokes (NS) CFD code verification a least square extrapolation (LSE) method is assessed. This method projects numerically computed NS solutions from multiple, coarser base grids onto a finer grid and improves solution accuracy by minimizing the residual of the discretized NS equations over the projected grid. In this dissertation, the finite volume (FV) formulation is focused on. The interplay between the concepts and the outcome of LSE, and the effects of solution gradients and singularities, nonlinear physics, and coupling of flow variables on the effectiveness of LSE are investigated. A CFD-based design optimization of a single element liquid rocket injector is conducted with surrogate models developed using response surface methodology (RSM) based on CFD solutions. The computational model consists of the NS equations, finite rate chemistry, and the k-epsilonturbulence closure. With the aid of these surrogate models, sensitivity and trade-off analyses are carried out for the injector design whose geometry (hydrogen flow angle, hydrogen and oxygen flow areas and oxygen post tip thickness) is optimized to attain desirable goals in performance (combustion length) and life/survivability (the maximum temperatures on the oxidizer post tip and injector face and a combustion chamber wall temperature). A preliminary multi

  20. Investigation of Navier-Stokes Code Verification and Design Optimization

    NASA Technical Reports Server (NTRS)

    Vaidyanathan, Rajkumar

    2004-01-01

    With rapid progress made in employing computational techniques for various complex Navier-Stokes fluid flow problems, design optimization problems traditionally based on empirical formulations and experiments are now being addressed with the aid of computational fluid dynamics (CFD). To be able to carry out an effective CFD-based optimization study, it is essential that the uncertainty and appropriate confidence limits of the CFD solutions be quantified over the chosen design space. The present dissertation investigates the issues related to code verification, surrogate model-based optimization and sensitivity evaluation. For Navier-Stokes (NS) CFD code verification a least square extrapolation (LSE) method is assessed. This method projects numerically computed NS solutions from multiple, coarser base grids onto a finer grid and improves solution accuracy by minimizing the residual of the discretized NS equations over the projected grid. In this dissertation, the finite volume (FV) formulation is focused on. The interplay between the concepts and the outcome of LSE, and the effects of solution gradients and singularities, nonlinear physics, and coupling of flow variables on the effectiveness of LSE are investigated. A CFD-based design optimization of a single element liquid rocket injector is conducted with surrogate models developed using response surface methodology (RSM) based on CFD solutions. The computational model consists of the NS equations, finite rate chemistry, and the k-epsilon turbulence closure. With the aid of these surrogate models, sensitivity and trade-off analyses are carried out for the injector design whose geometry (hydrogen flow angle, hydrogen and oxygen flow areas and oxygen post tip thickness) is optimized to attain desirable goals in performance (combustion length) and life/survivability (the maximum temperatures on the oxidizer post tip and injector face and a combustion chamber wall temperature). 
A preliminary multi-objective optimization
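The response surface methodology mentioned above builds a cheap polynomial surrogate from a limited set of CFD samples by least squares, and the optimizer then works on the surrogate instead of the expensive solver. A minimal quadratic-surrogate sketch (the design variables and data here are hypothetical, not the dissertation's injector model):

```python
import numpy as np

def rsm_design(X):
    """Design matrix for a full quadratic response surface in d variables:
    intercept, linear terms, and all squares/cross-products."""
    n, d = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack(cols)

def fit_quadratic_rsm(X, y):
    """Least-squares fit of the quadratic surrogate to sampled responses."""
    beta, *_ = np.linalg.lstsq(rsm_design(X), y, rcond=None)
    return beta
```

Once fitted, the surrogate can be evaluated as `rsm_design(X_new) @ beta`, which makes sensitivity and trade-off sweeps over the design space essentially free compared to rerunning the CFD model.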

  1. Olfactory coding in Drosophila larvae investigated by cross-adaptation.

    PubMed

    Boyle, Jennefer; Cobb, Matthew

    2005-09-01

    In order to reveal aspects of olfactory coding, the effects of sensory adaptation on the olfactory responses of first-instar Drosophila melanogaster larvae were tested. Larvae were pre-stimulated with a homologous series of acetic esters (C3-C9), and their responses to each of these odours were then measured. The overall patterns suggested that methyl acetate has no specific pathway but was detected by all the sensory pathways studied here, that butyl and pentyl acetate tended to have similar effects to each other and that hexyl acetate was processed separately from the other odours. In a number of cases, cross-adaptation transformed a control attractive response into a repulsive response; in no case was an increase in attractiveness observed. This was investigated by studying changes in dose-response curves following pre-stimulation. These findings are discussed in light of the possible intra- and intercellular mechanisms of adaptation and the advantage of altered sensitivity for the larva. PMID:16155221

  2. Characterization and validation of a Monte Carlo code for independent dose calculation in proton therapy treatments with pencil beam scanning.

    PubMed

    Fracchiolla, F; Lorentini, S; Widesott, L; Schwarz, M

    2015-11-01

    We propose a method of creating and validating a Monte Carlo (MC) model of a proton Pencil Beam Scanning (PBS) machine using only commissioning measurements and avoiding the nozzle modeling. Measurements with a scintillating screen coupled with a CCD camera, ionization chamber and a Faraday Cup were used to model the beam in TOPAS without using any machine parameter information but the virtual source distance from the isocenter. Then the model was validated on simple Spread Out Bragg Peaks (SOBP) delivered in water phantom and with six realistic clinical plans (many involving 3 or more fields) on an anthropomorphic phantom. In particular the behavior of the moveable Range Shifter (RS) feature was investigated and its modeling has been proposed. The gamma analysis (3%,3 mm) was used to compare MC, TPS (XiO-ELEKTA) and measured 2D dose distributions (using radiochromic film). The MC modeling proposed here shows good results in the validation phase, both for simple irradiation geometry (SOBP in water) and for modulated treatment fields (on anthropomorphic phantoms). In particular head lesions were investigated and both MC and TPS data were compared with measurements. Treatment plans with no RS always showed a very good agreement with both of them (γ-Passing Rate (PR)  >  95%). Treatment plans in which the RS was needed were also tested and validated. For these treatment plans MC results showed better agreement with measurements (γ-PR  >  93%) than the one coming from TPS (γ-PR  <  88%). This work shows how to simplify the MC modeling of a PBS machine for proton therapy treatments without accounting for any hardware components and proposes a more reliable RS modeling than the one implemented in our TPS. The validation process has shown how this code is a valid candidate for a completely independent treatment plan dose calculation algorithm. This makes the code an important future tool for the patient specific QA verification process. PMID

  3. Coding-independent regulation of the tumor suppressor PTEN by competing endogenous mRNAs

    PubMed Central

    Tay, Yvonne; Kats, Lev; Salmena, Leonardo; Weiss, Dror; Tan, Shen Mynn; Ala, Ugo; Karreth, Florian; Poliseno, Laura; Provero, Paolo; Di Cunto, Ferdinando; Lieberman, Judy; Rigoutsos, Isidore; Pandolfi, Pier Paolo

    2011-01-01

    Here we demonstrate that protein-coding RNA transcripts can crosstalk by competing for common microRNAs, with microRNA response elements as the foundation of this interaction. We have termed such RNA transcripts as competing endogenous RNAs (ceRNAs). We tested this hypothesis in the context of PTEN, a key tumor suppressor whose abundance determines critical outcomes in tumorigenesis. By a combined computational and experimental approach, we identified and validated endogenous protein-coding transcripts that regulate PTEN, antagonize PI3K/AKT signaling and possess growth and tumor suppressive properties. Notably, we also show that these genes display concordant expression patterns with PTEN and copy number loss in cancers. Our study presents a road map for the prediction and validation of ceRNA activity and networks, and thus imparts a trans-regulatory function to protein-coding mRNAs. PMID:22000013

  4. Investigation of a panel code for airframe/propeller integration analyses

    NASA Technical Reports Server (NTRS)

    Miley, S. J.

    1982-01-01

    The Hess panel code was investigated as a procedure to predict the aerodynamic loading associated with propeller slipstream interference on the airframe. The slipstream was modeled as a variable onset flow to the lifting and nonlifting bodies treated by the code. Four sets of experimental data were used for comparisons with the code. The results indicate that the Hess code, in its present form, will give valid solutions for nonuniform onset flows which vary in direction only. The code presently gives incorrect solutions for flows with variations in velocity. Modifications to the code to correct this are discussed.

  5. tRNA acceptor stem and anticodon bases form independent codes related to protein folding.

    PubMed

    Carter, Charles W; Wolfenden, Richard

    2015-06-16

    Aminoacyl-tRNA synthetases recognize tRNA anticodon and 3' acceptor stem bases. Synthetase Urzymes acylate cognate tRNAs even without anticodon-binding domains, in keeping with the possibility that acceptor stem recognition preceded anticodon recognition. Representing tRNA identity elements with two bits per base, we show that the anticodon encodes the hydrophobicity of each amino acid side-chain as represented by its water-to-cyclohexane distribution coefficient, and this relationship holds true over the entire temperature range of liquid water. The acceptor stem codes preferentially for the surface area or size of each side-chain, as represented by its vapor-to-cyclohexane distribution coefficient. These orthogonal experimental properties are both necessary to account satisfactorily for the exposed surface area of amino acids in folded proteins. Moreover, the acceptor stem codes correctly for β-branched and carboxylic acid side-chains, whereas the anticodon codes for a wider range of such properties, but not for size or β-branching. These and other results suggest that genetic coding of 3D protein structures evolved in distinct stages, based initially on the size of the amino acid and later on its compatibility with globular folding in water.
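The "two bits per base" representation can be realized by scoring each base along two chemical dichotomies. The abstract does not spell out the factorization used, so the sketch below adopts one plausible scheme (purine vs. pyrimidine, and amino vs. keto) purely as an illustration:

```python
# One plausible two-bit factorization of the four RNA bases (hypothetical;
# the paper's exact encoding is not given in the abstract):
#   bit 1: purine (A, G) vs. pyrimidine (C, U)
#   bit 2: amino  (A, C) vs. keto       (G, U)
def base_bits(base):
    """Map one base to its two-bit chemical-dichotomy code."""
    return (int(base in "AG"), int(base in "AC"))

def encode(seq):
    """Flatten a tRNA identity element (e.g. an anticodon or acceptor
    stem bases) into a bit vector, two bits per base, suitable for
    regression against side-chain properties."""
    return [bit for b in seq for bit in base_bits(b)]
```

Such bit vectors are what make it possible to regress amino acid properties (hydrophobicity, size) on anticodon or acceptor stem identity elements, as the abstract describes.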

  6. tRNA acceptor stem and anticodon bases form independent codes related to protein folding

    PubMed Central

    Carter, Charles W.; Wolfenden, Richard

    2015-01-01

    Aminoacyl-tRNA synthetases recognize tRNA anticodon and 3′ acceptor stem bases. Synthetase Urzymes acylate cognate tRNAs even without anticodon-binding domains, in keeping with the possibility that acceptor stem recognition preceded anticodon recognition. Representing tRNA identity elements with two bits per base, we show that the anticodon encodes the hydrophobicity of each amino acid side-chain as represented by its water-to-cyclohexane distribution coefficient, and this relationship holds true over the entire temperature range of liquid water. The acceptor stem codes preferentially for the surface area or size of each side-chain, as represented by its vapor-to-cyclohexane distribution coefficient. These orthogonal experimental properties are both necessary to account satisfactorily for the exposed surface area of amino acids in folded proteins. Moreover, the acceptor stem codes correctly for β-branched and carboxylic acid side-chains, whereas the anticodon codes for a wider range of such properties, but not for size or β-branching. These and other results suggest that genetic coding of 3D protein structures evolved in distinct stages, based initially on the size of the amino acid and later on its compatibility with globular folding in water. PMID:26034281

  7. Multigroup Time-Independent Neutron Transport Code System for Plane or Spherical Geometry.

    1986-12-01

    Version 00 PALLAS-PL/SP solves multigroup time-independent one-dimensional neutron transport problems in plane or spherical geometry. The problems solved are subject to a variety of boundary conditions or a distributed source. General anisotropic scattering problems are treated for solving deep-penetration problems in which angle-dependent neutron spectra are calculated in detail.

  8. Investigations with methanobacteria and with evolution of the genetic code

    NASA Technical Reports Server (NTRS)

    Jukes, T. H.

    1986-01-01

    Mycoplasma capricolum was found by Osawa et al. to use UGA as a codon for tryptophan and to contain 75% A + T in its DNA. This change could have resulted from evolutionary pressure to replace C + G by A + T. Numerous studies have been reported of evolution of proteins as measured by amino acid replacements that are observed when homologous proteins, such as hemoglobins from various vertebrates, are compared. These replacements result from nucleotide substitutions in amino acid codons in the corresponding genes. Simultaneously, silent nucleotide substitutions take place that can be studied when sequences of the genes are compared. These silent evolutionary changes take place mostly in third positions of codons. Two types of nucleotide substitutions are recognized: pyrimidine-pyrimidine and purine-purine interchanges (transitions) and pyrimidine-purine interchanges (transversions). Silent transitions are favored when a corresponding transversion would produce an amino acid replacement. Conversely, silent transversions are favored by probability when transitions and transversions will both be silent. Extensive examples of these situations have been found in protein genes, and it is evident that transversions in silent positions predominate in family boxes in most of the examples studied. In associated research a streptomycete from cow manure was found to produce an extracellular enzyme capable of lysing the pseudomurein-containing methanogen Methanobacterium formicicum.
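The transition/transversion distinction described above reduces to checking whether the two bases belong to the same chemical class. A minimal sketch (DNA alphabet):

```python
PURINES = {"A", "G"}
PYRIMIDINES = {"C", "T"}

def substitution_type(old, new):
    """Classify a single-nucleotide substitution as a transition
    (purine<->purine or pyrimidine<->pyrimidine) or a transversion
    (purine<->pyrimidine), per the terminology in the abstract."""
    if old == new:
        raise ValueError("not a substitution")
    pair = {old, new}
    same_class = pair <= PURINES or pair <= PYRIMIDINES
    return "transition" if same_class else "transversion"
```

Counting these two categories separately at third codon positions of aligned genes is how the silent-substitution biases described above are measured.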

  9. ADF95: Tool for automatic differentiation of a FORTRAN code designed for large numbers of independent variables

    NASA Astrophysics Data System (ADS)

    Straka, Christian W.

    2005-06-01

    ADF95 is a tool to automatically calculate numerical first derivatives for any mathematical expression as a function of user-defined independent variables. Accuracy of derivatives is achieved within machine precision. ADF95 may be applied to any FORTRAN 77/90/95 conforming code and requires minimal changes by the user. It provides a new derived data type that holds the value and derivatives and applies forward-mode differentiation by overloading all FORTRAN operators and intrinsic functions. An efficient indexing technique leads to reduced memory usage and a substantially increased performance gain over other available tools with operator overloading. This gain is especially pronounced for sparse systems with a large number of independent variables. A wide class of numerical simulations, e.g., those employing implicit solvers, can profit from ADF95. Program summary: Title of program: ADF95. Catalogue identifier: ADVI. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVI. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computer for which the program is designed: all platforms with a FORTRAN 95 compiler. Programming language used: FORTRAN 95. No. of lines in distributed program, including test data, etc.: 3103. No. of bytes in distributed program, including test data, etc.: 9862. Distribution format: tar.gz. Nature of problem: In many areas of the computational sciences, first-order partial derivatives of large and complex sets of equations are needed with machine-precision accuracy. For example, any implicit or semi-implicit solver requires the computation of the Jacobian matrix, which contains the first derivatives with respect to the independent variables. ADF95 is a software module to facilitate the automatic computation of the first partial derivatives of any arbitrarily complex mathematical FORTRAN expression. The program exploits the sparsity inherent in many sets of equations, thereby enabling faster computations compared to alternative tools.
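The operator-overloading idea behind ADF95 can be illustrated with a minimal dual-number sketch (Python for brevity; ADF95 itself is FORTRAN 95, and the class and function names here are illustrative, not from the package):

```python
import math

class Dual:
    """A value together with its first derivative; arithmetic on Duals
    propagates derivatives exactly (forward-mode AD)."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def sin(x):
    """Overloaded intrinsic: chain rule for sin."""
    if isinstance(x, Dual):
        return Dual(math.sin(x.val), math.cos(x.val) * x.der)
    return math.sin(x)

# Differentiate f(x) = x*sin(x) + 2x at x = 1.5 by seeding der = 1.
x = Dual(1.5, 1.0)
y = x * sin(x) + 2 * x   # y.der holds f'(1.5) = sin(1.5) + 1.5*cos(1.5) + 2
```

ADF95 generalizes this by storing a sparse vector of derivatives in the derived type, which is what makes it efficient for Jacobians with many independent variables.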

  10. Independent assessment of TRAC-PD2 and RELAP5/MOD1 codes at BNL in FY 1981. [PWR]

    SciTech Connect

    Saha, P; Jo, J H; Neymotin, L; Rohatgi, U S; Slovik, G

    1982-12-01

    This report documents the independent assessment calculations performed with the TRAC-PD2 and RELAP5/MOD1 codes at Brookhaven National Laboratory (BNL) during Fiscal Year 1981. A large variety of separate-effects experiments dealing with (1) steady-state and transient critical flow, (2) level swell, (3) flooding and entrainment, (4) steady-state flow boiling, (5) integral economizer once-through steam generator (IEOTSG) performance, (6) bottom reflood, and (7) two-dimensional phase separation of two-phase mixtures were simulated with TRAC-PD2. In addition, the early part of an overcooling transient which occurred at the Rancho Seco nuclear power plant on March 20, 1978 was also computed with an updated version of TRAC-PD2. Three separate-effects tests dealing with (1) transient critical flow, (2) steady-state flow boiling, and (3) IEOTSG performance were also simulated with the RELAP5/MOD1 code. Comparisons between the code predictions and the test data are presented.

  11. Two independent retrotransposon insertions at the same site within the coding region of BTK.

    PubMed

    Conley, Mary Ellen; Partain, Julie D; Norland, Shannon M; Shurtleff, Sheila A; Kazazian, Haig H

    2005-03-01

    Insertion of endogenous retrotransposon sequences accounts for approximately 0.2% of disease-causing mutations. These insertions are mediated by the reverse transcriptase and endonuclease activities of long interspersed element-1 (LINE-1) retrotransposons. The factors that control target site selection in insertional mutagenesis are not well understood. In our analysis of 199 unrelated families with proven mutations in BTK, the gene responsible for X-linked agammaglobulinemia, we identified two families with retrotransposon insertions at exactly the same nucleotide within the coding region of BTK. Both insertions, an SVA element and an AluY sequence, occurred 12 bp before the end of exon 9. Both had the typical hallmarks of a retrotransposon insertion, including target site duplication and a long poly(A) tail. The insertion site is flanked by AluSx sequences 1 kb upstream and 1 kb downstream, and an unusual 60 bp sequence consisting only of As and Ts is located in intron 9, 60 bp downstream of the insertion. The occurrence of two retrotransposon sequences at exactly the same site suggests that this site is vulnerable to insertional mutagenesis. A better understanding of the factors that make this site vulnerable will shed light on the mechanisms of LINE-1 mediated insertional mutagenesis.

  12. Investigating the Language and Literacy Skills Required for Independent Online Learning

    ERIC Educational Resources Information Center

    Silver-Pacuilla, Heidi

    2008-01-01

    This investigation was undertaken to determine the threshold levels of literacy and language proficiency necessary for adult learners to use the Internet for independent learning. The report is triangulated around learning from large-scale surveys, learning from the literature, and learning from the field. Reported findings include: (1)…

  13. Semi-device-independent randomness expansion with partially free random sources using 3→1 quantum random access code

    NASA Astrophysics Data System (ADS)

    Zhou, Yu-Qian; Gao, Fei; Li, Dan-Dan; Li, Xin-Hui; Wen, Qiao-Yan

    2016-09-01

    We have proved that new randomness can be certified by partially free sources using 2→1 quantum random access code (QRAC) in the framework of semi-device-independent (SDI) protocols [Y.-Q. Zhou, H.-W. Li, Y.-K. Wang, D.-D. Li, F. Gao, and Q.-Y. Wen, Phys. Rev. A 92, 022331 (2015), 10.1103/PhysRevA.92.022331]. To improve the effectiveness of the randomness generation, here we propose SDI randomness expansion using 3→1 QRAC and obtain the corresponding classical and quantum bounds of the two-dimensional quantum witness. Moreover, we obtain the condition that must be satisfied by the partially free sources to successfully certify new randomness, and the analytic relationship between the certified randomness and the two-dimensional quantum witness violation.
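As background for the 2→1 case this protocol builds on, the quantum advantage of a QRAC can be checked numerically. The sketch below (illustrative Python, not from the paper) verifies that the optimal 2→1 QRAC decodes either bit with probability cos²(π/8) ≈ 0.854, above the classical bound of 0.75 that underlies the quantum witness:

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])   # measured to decode bit 1
Z = np.array([[1., 0.], [0., -1.]])  # measured to decode bit 0

def encode(x0, x1):
    """Encode two classical bits into one qubit: the Bloch vector lies
    midway between the Z and X measurement axes."""
    rx, rz = (-1) ** x1 / np.sqrt(2), (-1) ** x0 / np.sqrt(2)
    return 0.5 * (I2 + rx * X + rz * Z)

def p_correct(rho, which, bit):
    """Probability that measuring Z (which=0) or X (which=1) yields `bit`."""
    M = Z if which == 0 else X
    proj = 0.5 * (I2 + (-1) ** bit * M)
    return np.trace(rho @ proj).real

probs = [p_correct(encode(x0, x1), b, (x0, x1)[b])
         for x0 in (0, 1) for x1 in (0, 1) for b in (0, 1)]
avg = sum(probs) / len(probs)   # cos^2(pi/8) ~ 0.854 > classical bound 0.75
```

The 3→1 QRAC used in the paper follows the same pattern with three mutually unbiased measurement axes.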

  14. Modality independence of order coding in working memory: Evidence from cross-modal order interference at recall.

    PubMed

    Vandierendonck, André

    2016-01-01

    Working memory researchers do not agree on whether order in serial recall is encoded by dedicated modality-specific systems or by a more general modality-independent system. Although previous research supports the existence of autonomous modality-specific systems, it has been shown that serial recognition memory is prone to cross-modal order interference by concurrent tasks. The present study used a serial recall task, which was performed in a single-task condition and in a dual-task condition with an embedded memory task in the retention interval. The modality of the serial task was either verbal or visuospatial, and the embedded tasks were in the other modality and required either serial or item recall. Care was taken to avoid modality overlaps during presentation and recall. In Experiment 1, visuospatial but not verbal serial recall was more impaired when the embedded task was an order than when it was an item task. Using a more difficult verbal serial recall task, verbal serial recall was also more impaired by another order recall task in Experiment 2. These findings are consistent with the hypothesis of modality-independent order coding. The implications for views on short-term recall and the multicomponent view of working memory are discussed.

  15. Approaches to Learning at Work: Investigating Work Motivation, Perceived Workload, and Choice Independence

    ERIC Educational Resources Information Center

    Kyndt, Eva; Raes, Elisabeth; Dochy, Filip; Janssens, Els

    2013-01-01

    Learning and development are taking up a central role in the human resource policies of organizations because of their crucial contribution to the competitiveness of those organizations. The present study investigates the relationship of work motivation, perceived workload, and choice independence with employees' approaches to learning at…

  16. Investigation of the Use of Erasures in a Concatenated Coding Scheme

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.; Marriott, Philip J.

    1997-01-01

    A new method for declaring erasures in a concatenated coding scheme is investigated. The method is used with the rate-1/2, K = 7 convolutional code and the (255, 223) Reed-Solomon code, with errors-and-erasures Reed-Solomon decoding. The proposed erasure method uses a soft-output Viterbi algorithm together with information provided by decoded Reed-Solomon codewords in a deinterleaving frame. The results show that a gain of 0.3 dB is possible using a minimal number of decoding trials.
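The trade-off such an erasure-declaration scheme exploits is the standard Reed-Solomon decodability condition: e errors and f erasures are correctable whenever 2e + f ≤ n − k, so each correctly declared erasure costs half as much redundancy as an undetected error. A one-line check (illustrative Python):

```python
def rs_decodable(n, k, errors, erasures):
    """An (n, k) Reed-Solomon code corrects e errors and f erasures
    whenever 2e + f <= n - k (the minimum-distance bound)."""
    return 2 * errors + erasures <= n - k

# The (255, 223) code has n - k = 32 redundancy symbols:
rs_decodable(255, 223, errors=16, erasures=0)    # True: errors only
rs_decodable(255, 223, errors=10, erasures=12)   # True: 2*10 + 12 = 32
rs_decodable(255, 223, errors=10, erasures=13)   # False: 33 > 32
```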

  17. Binary neutron-star mergers with Whisky and SACRA: First quantitative comparison of results from independent general-relativistic hydrodynamics codes

    NASA Astrophysics Data System (ADS)

    Baiotti, Luca; Shibata, Masaru; Yamamoto, Tetsuro

    2010-09-01

    We present the first quantitative comparison of two independent general-relativistic hydrodynamics codes, the Whisky code and the SACRA code. We compare the output of simulations starting from the same initial data and carried out with the configuration (numerical methods, grid setup, resolution, gauges) that for each code has been found to give consistent and sufficiently accurate results, in particular in terms of the cleanness of the gravitational waveforms. We focus on the quantities that should be conserved during the evolution (rest mass, total mass energy, and total angular momentum) and on the gravitational-wave amplitude and frequency. We find that the results produced by the two codes agree at a reasonable level, with variations in the different quantities always below about 10%.

  18. User's manual for three dimensional FDTD version A code for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Finite Difference Time Domain Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain (FDTD) technique. This manual provides a description of the code and corresponding results for the default scattering problem. In addition, the manual describes the code's operation, resource requirements, and Version A capabilities, and provides a description of each subroutine, a brief discussion of the radar cross section computations, and a discussion of the scattering results.
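The FDTD technique these manuals document advances the electric and magnetic fields on staggered grids in leapfrog fashion. A minimal one-dimensional sketch (illustrative Python in normalized units, not part of the Version A code set) is:

```python
import numpy as np

# One-dimensional free-space Yee scheme in normalized units at
# Courant number 1: E and H live on staggered grids and are updated
# in leapfrog fashion, as the 3-D codes do on a full Yee lattice.
nz, nt = 200, 150
ez = np.zeros(nz)        # electric field at integer grid points
hy = np.zeros(nz - 1)    # magnetic field at half-integer points

for n in range(nt):
    hy += ez[1:] - ez[:-1]         # H update from the curl of E
    ez[1:-1] += hy[1:] - hy[:-1]   # E update from the curl of H
    ez[nz // 2] += np.exp(-((n - 30.0) / 8.0) ** 2)  # soft Gaussian source
```

The injected Gaussian pulse splits into two counter-propagating waves, the behavior the 3-D codes reproduce for scattering geometries.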

  19. Code-Switching in Iranian Elementary EFL Classrooms: An Exploratory Investigation

    ERIC Educational Resources Information Center

    Rezvani, Ehsan; Street, Hezar Jerib; Rasekh, Abbass Eslami

    2011-01-01

    This paper presents the results of a small-scale exploratory investigation of code-switching (CS) between English and Farsi by 4 Iranian English foreign language (EFL) teachers in elementary level EFL classrooms in a language school in Isfahan, Iran. Specifically, the present study aimed at exploring the syntactical identification of switches and…

  20. 78 FR 37571 - Certain Opaque Polymers; Institution of Investigation Pursuant to United States Code

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-21

    ... COMMISSION Certain Opaque Polymers; Institution of Investigation Pursuant to United States Code AGENCY: U.S... importation, and the sale within the United States after importation of certain opaque polymers by reason of... importation, or the sale within the United States after importation of certain opaque polymers that...

  1. A Coding System with Independent Annotations of Gesture Forms and Functions during Verbal Communication: Development of a Database of Speech and GEsture (DoSaGE)

    PubMed Central

    Kong, Anthony Pak-Hin; Law, Sam-Po; Kwan, Connie Ching-Yin; Lai, Christy; Lam, Vivian

    2014-01-01

    Gestures are commonly used together with spoken language in human communication. One major limitation of gesture investigations in the existing literature lies in the fact that the coding of forms and functions of gestures has not been clearly differentiated. This paper first described a recently developed Database of Speech and GEsture (DoSaGE) based on independent annotation of gesture forms and functions among 119 neurologically unimpaired right-handed native speakers of Cantonese (divided into three age and two education levels), and presented findings of an investigation examining how gesture use was related to age and linguistic performance. Consideration of these two factors, for which normative data are currently very limited or lacking in the literature, is relevant and necessary when one evaluates gesture employment among individuals with and without language impairment. Three speech tasks, including monologue of a personally important event, sequential description, and story-telling, were used for elicitation. The EUDICO Linguistic ANnotator (ELAN) software was used to independently annotate each participant’s linguistic information of the transcript, forms of gestures used, and the function for each gesture. About one-third of the subjects did not use any co-verbal gestures. While the majority of gestures were non-content-carrying, which functioned mainly for reinforcing speech intonation or controlling speech flow, the content-carrying ones were used to enhance speech content. Furthermore, individuals who are younger or linguistically more proficient tended to use fewer gestures, suggesting that normal speakers gesture differently as a function of age and linguistic performance. PMID:25667563

  2. An Early Underwater Artificial Vision Model in Ocean Investigations via Independent Component Analysis

    PubMed Central

    Nian, Rui; Liu, Fang; He, Bo

    2013-01-01

    Underwater vision is one of the dominant senses and has shown great promise in ocean investigations. In this paper, a hierarchical Independent Component Analysis (ICA) framework is established to explore and understand the functional roles of the higher-order statistical structures of the visual stimulus in an underwater artificial vision system. The model is inspired by characteristics of the early human visual system such as modality, redundancy reduction, sparseness, and independence, and its multiple-layer implementations capture, respectively, Gabor-like basis functions, shape contours, and complicated textures. The simulation results show the effectiveness and consistency of the proposed approach for underwater images collected by autonomous underwater vehicles (AUVs). PMID:23863855

  3. An early underwater artificial vision model in ocean investigations via independent component analysis.

    PubMed

    Nian, Rui; Liu, Fang; He, Bo

    2013-01-01

    Underwater vision is one of the dominant senses and has shown great promise in ocean investigations. In this paper, a hierarchical Independent Component Analysis (ICA) framework is established to explore and understand the functional roles of the higher-order statistical structures of the visual stimulus in an underwater artificial vision system. The model is inspired by characteristics of the early human visual system such as modality, redundancy reduction, sparseness, and independence, and its multiple-layer implementations capture, respectively, Gabor-like basis functions, shape contours, and complicated textures. The simulation results show the effectiveness and consistency of the proposed approach for underwater images collected by autonomous underwater vehicles (AUVs). PMID:23863855

  4. An early underwater artificial vision model in ocean investigations via independent component analysis.

    PubMed

    Nian, Rui; Liu, Fang; He, Bo

    2013-07-16

    Underwater vision is one of the dominant senses and has shown great promise in ocean investigations. In this paper, a hierarchical Independent Component Analysis (ICA) framework is established to explore and understand the functional roles of the higher-order statistical structures of the visual stimulus in an underwater artificial vision system. The model is inspired by characteristics of the early human visual system such as modality, redundancy reduction, sparseness, and independence, and its multiple-layer implementations capture, respectively, Gabor-like basis functions, shape contours, and complicated textures. The simulation results show the effectiveness and consistency of the proposed approach for underwater images collected by autonomous underwater vehicles (AUVs).
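The ICA machinery underlying such a framework can be sketched with a standard FastICA fixed-point iteration. The example below (illustrative Python on synthetic sources; the paper's hierarchical model is more elaborate) unmixes two independent non-Gaussian signals:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent non-Gaussian sources, linearly mixed.
t = np.linspace(0, 8, 2000)
S = np.vstack([np.sign(np.sin(3 * t)), rng.uniform(-1, 1, t.size)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S

# Whiten the mixtures.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Xw = np.diag(d ** -0.5) @ E.T @ X

# Symmetric FastICA fixed-point iteration with a tanh nonlinearity;
# recovers the sources up to sign and permutation.
W = rng.standard_normal((2, 2))
for _ in range(200):
    WX = W @ Xw
    G, Gp = np.tanh(WX), 1 - np.tanh(WX) ** 2
    W = (G @ Xw.T) / Xw.shape[1] - np.diag(Gp.mean(axis=1)) @ W
    U, _, Vt = np.linalg.svd(W)
    W = U @ Vt               # re-orthonormalize the unmixing matrix
S_est = W @ Xw               # estimated independent components
```

Hierarchical variants apply the same statistical-independence criterion layer by layer, which is how the Gabor-like and contour-like features arise.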

  5. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional electromagnetic scattering codes based on the Finite Difference Time Domain (FDTD) technique first proposed by Yee in 1966. The supplied codes are two versions of our current FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR, TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  6. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional numerical electromagnetic scattering codes based upon the Finite Difference Time Domain (FDTD) technique first proposed by Yee in 1966. The supplied codes are two versions of our current two dimensional FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR, TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  7. User's manual for three dimensional FDTD version A code for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain (FDTD) Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain technique. The supplied version of the code is one version of our current three dimensional FDTD code set. The manual provides a description of the code and the corresponding results for the default scattering problem. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version A code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONA.FOR), a section briefly discussing radar cross section (RCS) computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  8. User's manual for three dimensional FDTD version C code for scattering from frequency-independent dielectric and magnetic materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three-dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain (FDTD) technique. The supplied version of the code is one version of our current three-dimensional FDTD code set. The manual given here provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing radar cross section computations, a section discussing some scattering results, a new problem checklist, references, and figure titles.

  9. User's manual for three dimensional FDTD version C code for scattering from frequency-independent dielectric and magnetic materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.

  10. Investigating the Magnetorotational Instability with Dedalus, an Open-Source Hydrodynamics Code

    SciTech Connect

    Burns, Keaton J. (UC Berkeley; SLAC)

    2012-08-31

    The magnetorotational instability is a fluid instability that causes the onset of turbulence in discs with poloidal magnetic fields. It is believed to be an important mechanism in the physics of accretion discs, namely in its ability to transport angular momentum outward. A similar instability arising in systems with a helical magnetic field may be easier to produce in laboratory experiments using liquid sodium, but the applicability of this phenomenon to astrophysical discs is unclear. To explore and compare the properties of these standard and helical magnetorotational instabilities (MRI and HMRI, respectively), magnetohydrodynamic (MHD) capabilities were added to Dedalus, an open-source hydrodynamics simulator. Dedalus is a Python-based pseudospectral code that uses external libraries and parallelization with the goal of achieving speeds competitive with codes implemented in lower-level languages. This paper outlines the MHD equations as implemented in Dedalus, the steps taken to improve the performance of the code, and the status of MRI investigations using Dedalus.
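The pseudospectral approach used by codes like Dedalus computes spatial derivatives by transforming to spectral space, multiplying by i·k, and transforming back, giving machine-precision accuracy for smooth periodic fields. A minimal illustration (not Dedalus code):

```python
import numpy as np

# Pseudospectral differentiation on a periodic grid: differentiate
# u(x) = sin(x) by scaling its Fourier coefficients by i*k.
n = 64
x = 2 * np.pi * np.arange(n) / n
u = np.sin(x)
k = np.fft.fftfreq(n, d=1.0 / n)                # integer wavenumbers
du = np.fft.ifft(1j * k * np.fft.fft(u)).real   # spectral derivative
```

For sin(x) the result matches cos(x) to near machine precision, which is why spectral codes need far fewer grid points than finite-difference codes for smooth problems.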

  11. Investigation of aerodynamic characteristics of wings having vortex flow using different numerical codes

    NASA Technical Reports Server (NTRS)

    Reddy, C. S.; Goglia, G. L.

    1981-01-01

    The aerodynamic characteristics of highly sweptback wings having separation-induced vortex flow were investigated by employing different numerical codes, with a view to determining some of the capabilities and limitations of these codes. Flat wings of various configurations (strake-wing models and cropped, diamond, arrow, and double-delta wings) were studied, and cambered and cranked planforms were also tested. The theoretical results predicted by the codes were compared with experimental data wherever possible and were found to agree favorably for most of the configurations investigated. However, large cambered wings could not be successfully modeled by the codes. It appears that the final solution in the free vortex sheet method is affected by the selection of the initial solution. Accumulated span loadings estimated for delta and diamond wings were found to be unusual compared with attached-flow results, in that the slopes of these load curves near the leading edge do not tend to infinity as they do in attached flow.

  12. Final report of the independent counsel for Iran/Contra matters. Volume 1: Investigations and prosecutions

    SciTech Connect

    Walsh, L.E.

    1993-08-04

    In October and November 1986, two secret U.S. Government operations were publicly exposed, potentially implicating Reagan Administration officials in illegal activities. These operations were the provision of assistance to the military activities of the Nicaraguan contra rebels during an October 1984 to October 1986 prohibition on such aid, and the sale of U.S. arms to Iran in contravention of stated U.S. policy and in possible violation of arms-export controls. In late November 1986, Reagan Administration officials announced that some of the proceeds from the sale of U.S. arms to Iran had been diverted to the contras. As a result of the exposure of these operations, Attorney General Edwin Meese III sought the appointment of an independent counsel to investigate and, if necessary, prosecute possible crimes arising from them. This is the final report of that investigation.

  13. RACE, CODE OF THE STREET, AND VIOLENT DELINQUENCY: A MULTILEVEL INVESTIGATION OF NEIGHBORHOOD STREET CULTURE AND INDIVIDUAL NORMS OF VIOLENCE*

    PubMed Central

    Stewart, Eric A.; Simons, Ronald L.

    2011-01-01

    The study outlined in this article drew on Elijah Anderson’s (1999) code of the street perspective to examine the impact of neighborhood street culture on violent delinquency. Using data from more than 700 African American adolescents, we examined 1) whether neighborhood street culture predicts adolescent violence above and beyond an adolescent’s own street code values and 2) whether neighborhood street culture moderates individual-level street code values on adolescent violence. Consistent with Anderson’s hypotheses, neighborhood street culture significantly predicts violent delinquency independent of individual-level street code effects. Additionally, neighborhood street culture moderates individual-level street code values on violence in neighborhoods where the street culture is widespread. In particular, the effect of street code values on violence is enhanced in neighborhoods where the street culture is endorsed widely. PMID:21666759

  14. Investigation of Beam-RF Interactions in Twisted Waveguide Accelerating Structures Using Beam Tracking Codes

    SciTech Connect

    Holmes, Jeffrey A; Zhang, Yan; Kang, Yoon W; Galambos, John D; Hassan, Mohamed H; Wilson, Joshua L

    2009-01-01

    Investigations of the RF properties of certain twisted waveguide structures show that they support favorable accelerating fields, making them potential candidates for accelerating cavities. Using the particle-tracking code ORBIT, we examine the beam-RF interaction in the twisted cavity structures to understand their beam transport and acceleration properties. The results show the distinctive properties of these new structures for particle transport and acceleration, which have not been previously analyzed.

  15. Coding for stable transmission of W-band radio-over-fiber system using direct-beating of two independent lasers.

    PubMed

    Yang, L G; Sung, J Y; Chow, C W; Yeh, C H; Cheng, K T; Shi, J W; Pan, C L

    2014-10-20

    We experimentally demonstrate a Manchester (MC) coding based W-band (75-110 GHz) radio-over-fiber (ROF) system that uses spectral shaping to reduce the low-frequency-component (LFC) signal distortion generated by beating two independent low-cost lasers. Hence, a low-cost and higher-performance W-band ROF system is achieved. In this system, direct beating of two independent low-cost CW lasers without a frequency tracking circuit (FTC) is used to generate the millimeter wave. Approaches such as delayed self-heterodyne interferometry and heterodyne beating are employed to characterize the optical-beating-interference sub-terahertz signal (OBIS). Furthermore, W-band ROF systems using MC coding and NRZ-OOK are compared and discussed.
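The spectral-shaping effect of Manchester coding comes from mapping each bit to a DC-balanced chip pair, which moves signal energy away from the low frequencies where the beat-note distortion sits. A toy encoder (illustrative Python, using the IEEE 802.3 convention) shows the mapping:

```python
def manchester_encode(bits):
    """Map each bit to a balanced chip pair (IEEE 802.3 convention:
    1 -> low-high, 0 -> high-low), suppressing energy near DC."""
    chips = []
    for b in bits:
        chips.extend((0, 1) if b else (1, 0))
    return chips

manchester_encode([1, 0, 1, 1])   # -> [0, 1, 1, 0, 0, 1, 0, 1]
```

Every encoded pair contains exactly one high and one low chip, so long runs of identical bits never produce sustained low-frequency content, at the cost of doubling the symbol rate relative to NRZ-OOK.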

  16. Your ticket to independence: a guide to getting your first principal investigator position.

    PubMed

    Káradóttir, Ragnhildur Thóra; Letzkus, Johannes J; Mameli, Manuel; Ribeiro, Carlos

    2015-10-01

    The transition to scientific independence as a principal investigator (PI) can seem like a daunting and mysterious process to postdocs and students - something that many aspire to while at the same time wondering how to achieve this goal and what being a PI really entails. The FENS Kavli Network of Excellence (FKNE) is a group of young faculty who have recently completed this step in various fields of neuroscience across Europe. In a series of opinion pieces from FKNE scholars, we aim to demystify this process and to offer the next generation of up-and-coming PIs some advice and personal perspectives on the transition to independence, starting here with guidance on how to get hired to your first PI position. Rather than providing an exhaustive overview of all facets of the hiring process, we focus on a few key aspects that we have learned to appreciate in the quest for our own labs: What makes a research programme exciting and successful? How can you identify great places to apply to and make sure your application stands out? What are the key objectives for the job talk and the interview? How do you negotiate your position? And finally, how do you decide on a host institute that lets you develop both scientifically and personally in your new role as head of a lab?

  17. An investigation of design optimization using a 2-D viscous flow code with multigrid

    NASA Technical Reports Server (NTRS)

    Doria, Michael L.

    1990-01-01

    Computational fluid dynamics (CFD) codes have advanced to the point where they are effective analytical tools for solving flow fields around complex geometries. There is also a need for their use as a design tool to find optimum aerodynamic shapes. In the area of design, however, a difficulty arises from the large amount of computer resources required by these codes. It is desirable to streamline the design process so that a large number of design options and constraints can be investigated without overloading the system. Several techniques have been proposed to help streamline the design process; the feasibility of one of these, the interaction of the geometry change with the flow calculation, is investigated here. To test this technique, a specific optimization problem was chosen: a NACA 0012 airfoil at a free-stream Mach number of 0.5 and zero angle of attack, with camber added to the mean line of the airfoil. The goal was to find the value of camber for which the ratio of lift over drag is a maximum. The flow code used was FLOMGE, a two dimensional viscous flow solver which uses multigrid to speed up convergence. A hyperbolic grid generation program was used to construct the grid for each value of camber.
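The camber search described here is a one-dimensional optimization in which each objective evaluation is a full flow solution, so an economical bracketing method matters. A sketch using golden-section search (illustrative Python; the lift-over-drag surrogate and its constants are hypothetical stand-ins, not FLOMGE output):

```python
import math

def lift_over_drag(camber):
    # Hypothetical smooth surrogate standing in for one full viscous
    # flow solution per evaluation; peaks inside the search interval.
    return 60.0 * camber / (1.0 + 200.0 * camber ** 2)

def golden_max(f, a, b, tol=1e-6):
    """Golden-section search for the maximum of a unimodal function
    on [a, b]; the bracket shrinks by ~0.618 per iteration."""
    g = (math.sqrt(5.0) - 1.0) / 2.0
    c, d = b - g * (b - a), a + g * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            a = c
            c, d = d, a + g * (b - a)
        else:
            b = d
            c, d = b - g * (b - a), c
    return 0.5 * (a + b)

best = golden_max(lift_over_drag, 0.0, 0.1)   # ~0.0707 for this surrogate
```

Because the bracket shrinks geometrically, locating the optimum to three digits costs only a few dozen flow solutions, which is the kind of economy the streamlined design process aims for.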

  18. Semantic association investigated with fMRI and independent component analysis

    PubMed Central

    Kim, Kwang Ki; Karunanayaka, Prasanna; Privitera, Michael D.; Holland, Scott K.; Szaflarski, Jerzy P.

    2010-01-01

    Semantic association, an essential element of human language, enables discourse and inference. Neuroimaging studies have revealed the localization and lateralization of semantic circuitry, making substantial contributions to cognitive neuroscience. However, due to methodological limitations, these investigations have only identified individual functional components rather than capturing the behavior of the entire network. To overcome these limitations, we implemented group independent component analysis (ICA) to investigate the cognitive modules used by healthy adults performing an fMRI semantic decision task. When compared to the results of a standard GLM analysis, ICA detected several additional brain regions subserving semantic decision. Eight task-related group ICA maps were identified, including left inferior frontal gyrus (BA44/45), middle posterior temporal gyrus (BA39/22), angular gyrus/inferior parietal lobule (BA39/40), posterior cingulate (BA30), bilateral lingual gyrus (BA18/23), inferior frontal gyrus (L>R, BA47), hippocampus with parahippocampal gyrus (L>R, BA35/36), and anterior cingulate (BA32/24). While most of the components were represented bilaterally, we found a single, highly left-lateralized component that included the inferior frontal gyrus, the medial and superior temporal gyri, the angular and supramarginal gyri, and the inferior parietal cortex. The presence of these spatially independent ICA components implies functional connectivity and can be equated with modularity. These results are analyzed and presented in the framework of a biologically plausible theoretical model, in preparation for similar analyses in patients with right- or left-hemispheric epilepsies. PMID:21296027

  19. Two mitochondrial genomes from the families Bethylidae and Mutillidae: independent rearrangement of protein-coding genes and higher-level phylogeny of the Hymenoptera.

    PubMed

    Wei, Shu-Jun; Li, Qian; van Achterberg, Kees; Chen, Xue-Xin

    2014-08-01

    In animal mitochondrial genomes, gene arrangements are usually conserved across major lineages but might be rearranged within derived groups, and might provide valuable phylogenetic characters. Here, we sequenced the mitochondrial genomes of Cephalonomia gallicola (Chrysidoidea: Bethylidae) and Wallacidia oculata (Vespoidea: Mutillidae). In Cephalonomia at least 11 tRNA and 2 protein-coding genes were rearranged, which is the first report of protein-coding gene rearrangements in the Aculeata. In the Hymenoptera, three types of protein-coding gene rearrangement events occur, i.e. reversal, transposition and reverse transposition. Venturia (Ichneumonidae) had the greatest number of common intervals with the ancestral gene arrangement pattern, whereas Philotrypesis (Agaonidae) had the fewest. The most similar rearrangement patterns are shared between Nasonia (Pteromalidae) and Philotrypesis, whereas the most differentiated rearrangements occur between Cotesia (Braconidae) and Philotrypesis. It is clear that protein-coding gene rearrangements in the Hymenoptera are evolutionarily independent across the major lineages but are conserved within groups such as the Chalcidoidea. Phylogenetic analyses supported the sister-group relationship of Orrussoidea and Apocrita, Ichneumonoidea and Aculeata, Vespidae and Apoidea, and the paraphyly of Vespoidea. The Evaniomorpha and phylogenetic relationships within Aculeata remain controversial, with discrepancy between analyses using protein-coding and RNA genes.

  20. ALS beamlines for independent investigators: A summary of the capabilities and characteristics of beamlines at the ALS

    SciTech Connect

    Not Available

    1992-08-01

    There are two modes of conducting research at the ALS: to work as a member of a participating research team (PRT) or to work as an independent investigator. PRTs are responsible for building beamlines, end stations, and, in some cases, insertion devices. Thus, PRT members have privileged access to the ALS. Independent investigators will use beamline facilities made available by PRTs. The purpose of this handbook is to describe these facilities.

  1. Flight Investigation of Prescribed Simultaneous Independent Surface Excitations for Real-Time Parameter Identification

    NASA Technical Reports Server (NTRS)

    Moes, Timothy R.; Smith, Mark S.; Morelli, Eugene A.

    2003-01-01

    Near real-time stability and control derivative extraction is required to support flight demonstration of Intelligent Flight Control System (IFCS) concepts being developed by NASA, academia, and industry. Traditionally, flight maneuvers would be designed and flown to obtain stability and control derivative estimates using a postflight analysis technique. The goal of the IFCS concept is to be able to modify the control laws in real time for an aircraft that has been damaged in flight. In some IFCS implementations, real-time parameter identification (PID) of the stability and control derivatives of the damaged aircraft is necessary for successfully reconfiguring the control system. This report investigates the usefulness of Prescribed Simultaneous Independent Surface Excitations (PreSISE) to provide data for rapidly obtaining estimates of the stability and control derivatives. Flight test data were analyzed using both equation-error and output-error PID techniques. The equation-error PID technique is known as Fourier Transform Regression (FTR) and is a frequency-domain real-time implementation. Selected results were compared with a time-domain output-error technique. The real-time equation-error technique combined with the PreSISE maneuvers provided excellent derivative estimation in the longitudinal axis. However, the PreSISE maneuvers as presently defined were not adequate for accurate estimation of the lateral-directional derivatives.

  2. Detailed investigation of Long-Period activity at Campi Flegrei by Convolutive Independent Component Analysis

    NASA Astrophysics Data System (ADS)

    Capuano, P.; De Lauro, E.; De Martino, S.; Falanga, M.

    2016-04-01

    This work is devoted to the analysis of seismic signals continuously recorded at Campi Flegrei Caldera (Italy) during the entire year 2006. The radiation pattern associated with the Long-Period energy release is investigated. We adopt an innovative Independent Component Analysis algorithm for convolutive seismic series, adapted and improved to give automatic procedures for detecting seismic events often buried in the high-level ambient noise. The extracted waveforms, characterized by an improved signal-to-noise ratio, allow the recognition of Long-Period precursors, evidencing that the seismic activity accompanying the mini-uplift crisis (in 2006), which climaxed in the three days of 26-28 October, had already started at the beginning of October and lasted until mid-November. A more complete seismic catalog is thus provided, which can be used to properly quantify the seismic energy release. To better ground our results, we first check the robustness of the method by comparing it with other blind source separation methods based on higher-order statistics; secondly, we reconstruct the radiation patterns of the extracted Long-Period events in order to link the identified signals directly to the sources. We take advantage of Convolutive Independent Component Analysis, which provides basic signals along the three directions of motion, so that a direct polarization analysis can be performed with no other filtering procedures. We show that the extracted signals are mainly composed of P waves with radial polarization pointing to the seismic source of the main LP swarm, i.e. a small area in the Solfatara, also in the case of the small events that both precede and follow the main activity. From a dynamical point of view, they can be described by two degrees of freedom, indicating a low level of complexity associated with the vibrations from a superficial hydrothermal system. Our results allow us to move towards a full description of the complexity of…

  3. Investigation of NOTCH4 coding region polymorphisms in sporadic inclusion body myositis.

    PubMed

    Scott, Adrian P; Laing, Nigel G; Mastaglia, Frank; Dalakas, Marinos; Needham, Merrilee; Allcock, Richard J N

    2012-09-15

    The NOTCH4 gene, located within the MHC region, is involved in cellular differentiation and has varying effects dependent on tissue type. Coding region polymorphisms haplotypic of the sIBM-associated 8.1 ancestral haplotype were identified in NOTCH4 and genotyped in two different Caucasian sIBM cohorts. In both cohorts the frequency of the minor allele of rs422951 and the 12-repeat variation for rs72555375 was increased and was higher than the frequency of the sIBM-associated allele HLA-DRB1*0301. These NOTCH4 polymorphisms can be considered to be markers for sIBM susceptibility, but require further investigation to determine whether they are directly involved in the disease pathogenesis.

  4. Coding Variants at Hexa-allelic Amino Acid 13 of HLA-DRB1 Explain Independent SNP Associations with Follicular Lymphoma Risk

    PubMed Central

    Foo, Jia Nee; Smedby, Karin E.; Akers, Nicholas K.; Berglund, Mattias; Irwan, Ishak D.; Jia, Xiaoming; Li, Yi; Conde, Lucia; Darabi, Hatef; Bracci, Paige M.; Melbye, Mads; Adami, Hans-Olov; Glimelius, Bengt; Khor, Chiea Chuen; Hjalgrim, Henrik; Padyukov, Leonid; Humphreys, Keith; Enblad, Gunilla; Skibola, Christine F.; de Bakker, Paul I.W.; Liu, Jianjun

    2013-01-01

    Non-Hodgkin lymphoma represents a diverse group of blood malignancies, of which follicular lymphoma (FL) is a common subtype. Previous genome-wide association studies (GWASs) have identified in the human leukocyte antigen (HLA) class II region multiple independent SNPs that are significantly associated with FL risk. To dissect these signals and determine whether coding variants in HLA genes are responsible for the associations, we conducted imputation, HLA typing, and sequencing in three independent populations for a total of 689 cases and 2,446 controls. We identified a hexa-allelic amino acid polymorphism at position 13 of the HLA-DR beta chain that showed the strongest association with FL within the major histocompatibility complex (MHC) region (multiallelic p = 2.3 × 10−15). Out of six possible amino acids that occurred at that position within the population, we classified two as high risk (Tyr and Phe), two as low risk (Ser and Arg), and two as moderate risk (His and Gly). There was a 4.2-fold difference in risk (95% confidence interval = 2.9–6.1) between subjects carrying two alleles encoding high-risk amino acids and those carrying two alleles encoding low-risk amino acids (p = 1.01 × 10−14). This coding variant might explain the complex SNP associations identified by GWASs and suggests a common HLA-DR antigen-driven mechanism for the pathogenesis of FL and rheumatoid arthritis. PMID:23791106

  5. Coding variants at hexa-allelic amino acid 13 of HLA-DRB1 explain independent SNP associations with follicular lymphoma risk.

    PubMed

    Foo, Jia Nee; Smedby, Karin E; Akers, Nicholas K; Berglund, Mattias; Irwan, Ishak D; Jia, Xiaoming; Li, Yi; Conde, Lucia; Darabi, Hatef; Bracci, Paige M; Melbye, Mads; Adami, Hans-Olov; Glimelius, Bengt; Khor, Chiea Chuen; Hjalgrim, Henrik; Padyukov, Leonid; Humphreys, Keith; Enblad, Gunilla; Skibola, Christine F; de Bakker, Paul I W; Liu, Jianjun

    2013-07-11

    Non-Hodgkin lymphoma represents a diverse group of blood malignancies, of which follicular lymphoma (FL) is a common subtype. Previous genome-wide association studies (GWASs) have identified in the human leukocyte antigen (HLA) class II region multiple independent SNPs that are significantly associated with FL risk. To dissect these signals and determine whether coding variants in HLA genes are responsible for the associations, we conducted imputation, HLA typing, and sequencing in three independent populations for a total of 689 cases and 2,446 controls. We identified a hexa-allelic amino acid polymorphism at position 13 of the HLA-DR beta chain that showed the strongest association with FL within the major histocompatibility complex (MHC) region (multiallelic p = 2.3 × 10⁻¹⁵). Out of six possible amino acids that occurred at that position within the population, we classified two as high risk (Tyr and Phe), two as low risk (Ser and Arg), and two as moderate risk (His and Gly). There was a 4.2-fold difference in risk (95% confidence interval = 2.9-6.1) between subjects carrying two alleles encoding high-risk amino acids and those carrying two alleles encoding low-risk amino acids (p = 1.01 × 10⁻¹⁴). This coding variant might explain the complex SNP associations identified by GWASs and suggests a common HLA-DR antigen-driven mechanism for the pathogenesis of FL and rheumatoid arthritis.

  6. Investigation of Cool and Hot Executive Function in ODD/CD Independently of ADHD

    ERIC Educational Resources Information Center

    Hobson, Christopher W.; Scott, Stephen; Rubia, Katya

    2011-01-01

    Background: Children with oppositional defiant disorder/conduct disorder (ODD/CD) have shown deficits in "cool" abstract-cognitive, and "hot" reward-related executive function (EF) tasks. However, it is currently unclear to what extent ODD/CD is associated with neuropsychological deficits, independently of attention deficit hyperactivity disorder…

  7. After a Long-Term Placement: Investigating Educational Achievement, Behaviour, and Transition to Independent Living

    ERIC Educational Resources Information Center

    Dumaret, Annick-Camille; Donati, Pascale; Crost, Monique

    2011-01-01

    This study describes the transition towards independent living of 123 former fostered young people reared for long periods in a private French organisation, SOS Children's Villages. Three generations of care leavers were analysed through a postal survey and interviews. Their narratives show typical pathways after leaving care. Two-thirds became…

  8. Further Investigation of Acoustic Propagation Codes for Three-Dimensional Geometries

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Watson, Willie R.; Jones, Michael G.

    2006-01-01

    The ability to predict fan noise within complex three-dimensional aircraft engine nacelle geometries is a valuable tool in designing and assessing low-noise concepts. This work begins a systematic study to identify the areas of the design space in which propagation codes of varying fidelity may be used effectively to provide efficient design and assessment. An efficient lower-fidelity code is used in conjunction with two higher-fidelity, more computationally intensive methods to solve benchmark problems of increasing complexity. The codes represent a small sampling of the current propagation codes available or under development. Results of this initial study indicate that the lower-fidelity code provides satisfactory results for cases involving low to moderate attenuation rates, whereas, the two higher-fidelity codes perform well across the range of problems.

  9. Investigation of inconsistent ENDF/B-VII.1 independent and cumulative fission product yields with proposed revisions

    SciTech Connect

    Pigni, Marco T; Francis, Matthew W; Gauld, Ian C

    2015-01-01

    A recent implementation of ENDF/B-VII.1 independent fission product yields and nuclear decay data identified inconsistencies in the data caused by the use of updated nuclear schemes in the decay sub-library that are not reflected in legacy fission product yield data. Recent changes in the decay data sub-library, particularly the delayed neutron branching fractions, result in calculated fission product concentrations that are incompatible with the cumulative fission yields in the library, and also with experimental measurements. A comprehensive set of independent fission product yields was generated for thermal and fission spectrum neutron-induced fission for 235,238U and 239,241Pu in order to provide a preliminary assessment of the updated fission product yield data consistency. These updated independent fission product yields were utilized in the ORIGEN code to compare the calculated fission product inventories with experimentally measured inventories, with particular attention given to the noble gases. An important outcome of this work is the development of fission product yield covariance data necessary for fission product uncertainty quantification. The evaluation methodology combines a sequential Bayesian method to guarantee consistency between independent and cumulative yields along with the physical constraints on the independent yields. This work was motivated by the need to improve the performance of the ENDF/B-VII.1 library in the case of stable and long-lived cumulative yields due to the inconsistency of the ENDF/B-VII.1 fission product yield and decay data sub-libraries. The revised fission product yields and the new covariance data are proposed as a revision to the fission yield data currently in ENDF/B-VII.1.

  10. Investigation of Inconsistent ENDF/B-VII.1 Independent and Cumulative Fission Product Yields with Proposed Revisions

    SciTech Connect

    Pigni, M.T. Francis, M.W.; Gauld, I.C.

    2015-01-15

    A recent implementation of ENDF/B-VII.1 independent fission product yields and nuclear decay data identified inconsistencies in the data caused by the use of updated nuclear schemes in the decay sub-library that are not reflected in legacy fission product yield data. Recent changes in the decay data sub-library, particularly the delayed neutron branching fractions, result in calculated fission product concentrations that do not agree with the cumulative fission yields in the library as well as with experimental measurements. To address these issues, a comprehensive set of independent fission product yields was generated for thermal and fission spectrum neutron-induced fission for {sup 235,238}U and {sup 239,241}Pu in order to provide a preliminary assessment of the updated fission product yield data consistency. These updated independent fission product yields were utilized in the ORIGEN code to compare the calculated fission product inventories with experimentally measured inventories, with particular attention given to the noble gases. Another important outcome of this work is the development of fission product yield covariance data necessary for fission product uncertainty quantification. The evaluation methodology combines a sequential Bayesian method to guarantee consistency between independent and cumulative yields along with the physical constraints on the independent yields. This work was motivated to improve the performance of the ENDF/B-VII.1 library for stable and long-lived fission products. The revised fission product yields and the new covariance data are proposed as a revision to the fission yield data currently in ENDF/B-VII.1.
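The consistency constraint between independent and cumulative yields that the revision enforces can be written as c = y + Bᵀc, i.e. c = (I − Bᵀ)⁻¹y, where B holds decay branching fractions. A minimal sketch with a hypothetical three-nuclide chain (all numbers illustrative, not evaluation data):

```python
import numpy as np

# Three-member beta-decay chain: nuclide 0 -> 1 -> 2, with a hypothetical
# 1% delayed-neutron branch from 0 that bypasses 1 and feeds 2 directly.
y = np.array([0.020, 0.015, 0.001])   # independent yields
B = np.array([[0.00, 0.99, 0.01],     # B[i, j]: fraction of i's decays feeding j
              [0.00, 0.00, 1.00],
              [0.00, 0.00, 0.00]])

# Cumulative yield of each nuclide = its independent yield plus the
# branching-weighted cumulative yields of its precursors.
c = np.linalg.solve(np.eye(3) - B.T, y)
print(c)
```

A change to a branching fraction (e.g. a revised delayed-neutron fraction in the decay sub-library) shifts `c` even when `y` is untouched, which is exactly the kind of independent/cumulative mismatch the abstract describes.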

  11. “There Are Too Many, but Never Enough”: Qualitative Case Study Investigating Routine Coding of Clinical Information in Depression

    PubMed Central

    Cresswell, Kathrin; Morrison, Zoe; Sheikh, Aziz; Kalra, Dipak

    2012-01-01

    Background We sought to understand how clinical information relating to the management of depression is routinely coded in different clinical settings and the perspectives of and implications for different stakeholders with a view to understanding how these may be aligned. Materials and Methods Qualitative investigation exploring the views of a purposefully selected range of healthcare professionals, managers, and clinical coders spanning primary and secondary care. Results Our dataset comprised 28 semi-structured interviews, a focus group, documents relating to clinical coding standards and participant observation of clinical coding activities. We identified a range of approaches to coding clinical information including templates and order entry systems. The challenges inherent in clearly establishing a diagnosis, identifying appropriate clinical codes and possible implications of diagnoses for patients were particularly prominent in primary care. Although a range of managerial and research benefits were identified, there were no direct benefits from coded clinical data for patients or professionals. Secondary care staff emphasized the role of clinical coders in ensuring data quality, which was at odds with the policy drive to increase real-time clinical coding. Conclusions There was overall no evidence of clear-cut direct patient care benefits to inform immediate care decisions, even in primary care where data on patients with depression were more extensively coded. A number of important secondary uses were recognized by healthcare staff, but the coding of clinical data to serve these ends was often poorly aligned with clinical practice and patient-centered considerations. The current international drive to encourage clinical coding by healthcare professionals during the clinical encounter may need to be critically examined. PMID:22937106

  12. Investigating the Use of Quick Response Codes in the Gross Anatomy Laboratory

    ERIC Educational Resources Information Center

    Traser, Courtney J.; Hoffman, Leslie A.; Seifert, Mark F.; Wilson, Adam B.

    2015-01-01

    The use of quick response (QR) codes within undergraduate university courses is on the rise, yet literature concerning their use in medical education is scant. This study examined student perceptions on the usefulness of QR codes as learning aids in a medical gross anatomy course, statistically analyzed whether this learning aid impacted student…

  13. The speed of orthographic processing during lexical decision: electrophysiological evidence for independent coding of letter identity and letter position in visual word recognition.

    PubMed

    Mariol, Marina; Jacques, Corentin; Schelstraete, Marie-Anne; Rossion, Bruno

    2008-07-01

    Adults can decide rapidly if a string of letters is a word or not. However, the exact time course of this discrimination is still an open question. Here we sought to track the time course of this discrimination and to determine how orthographic information -- letter position and letter identity -- is computed during reading. We used a go/no-go lexical decision task while recording event-related potentials (ERPs). Subjects were presented with single words (go trials) and pseudowords (no-go trials), which varied in orthographic conformation, presenting a double consonant either frequently doubled (e.g., "ss") or never doubled (e.g., "zz") (identity factor), and a position of the double consonant which was either legal or illegal (position factor), in a 2 × 2 factorial design. Words and pseudowords clearly differed as early as 230 msec. At this latency, ERP waveforms were modulated both by the identity and by the position of letters: The fronto-central no-go N2 was the smallest in amplitude and peaked the earliest to pseudowords presenting both an illegal double-letter position and an identity never encountered. At this stage, the two factors showed additive effects, suggesting an independent coding. The factors of identity and position of double letters interacted much later in the process, at the P3 level, around 300-400 msec on frontal and central sites, in line with the lexical decision data obtained in the behavioral study. Overall, these results show that the speed of lexical decision may depend on orthographic information coded independently by the identity and position of letters in a word.

  14. Culture-Dependent and -Independent Investigations of Microbial Diversity on Urinary Catheters

    PubMed Central

    Xu, Yijuan; Moser, Claus; Al-Soud, Waleed Abu; Sørensen, Søren; Høiby, Niels; Nielsen, Per Halkjær

    2012-01-01

    Catheter-associated urinary tract infection is caused by bacteria, which ascend the catheter along its external or internal surface to the bladder and subsequently develop into biofilms on the catheter and uroepithelium. Antibiotic-treated bacteria and bacteria residing in biofilm can be difficult to culture. In this study we used culture-based and 16S rRNA gene-based culture-independent methods (fingerprinting, cloning, and pyrosequencing) to determine the microbial diversity of biofilms on 24 urinary catheters. Most of the patients were catheterized for <30 days and had undergone recent antibiotic treatment. In addition, the corresponding urine samples for 16 patients were cultured. We found that gene analyses of the catheters were consistent with cultures of the corresponding urine samples for the presence of bacteria but sometimes discordant for the identity of the species. Cultures of catheter tips detected bacteria more frequently than urine cultures and gene analyses; coagulase-negative staphylococci were, in particular, cultured much more often from catheter tips, indicating potential contamination of the catheter tips during sampling. The external and internal surfaces of 19 catheters were separately analyzed by molecular methods, and discordant results were found in six catheters, suggesting that bacterial colonization intra- and extraluminally may be different. Molecular analyses showed that most of the species identified in this study were known uropathogens, and infected catheters were generally colonized by one to two species, probably due to antibiotic usage and short-term catheterization. In conclusion, our data showed that culture-independent molecular methods did not detect bacteria from urinary catheters more frequently than culture-based methods. PMID:23015674

  15. Investigating the structure preserving encryption of high efficiency video coding (HEVC)

    NASA Astrophysics Data System (ADS)

    Shahid, Zafar; Puech, William

    2013-02-01

    This paper presents a novel method for the real-time protection of the new emerging High Efficiency Video Coding (HEVC) standard. Structure-preserving selective encryption is performed in the CABAC entropy coding module of HEVC, which is significantly different from CABAC entropy coding of H.264/AVC. In CABAC of HEVC, exponential Golomb coding is replaced by truncated Rice (TR) coding up to a specific value for binarization of transform coefficients. Selective encryption is performed using the AES cipher in cipher feedback mode on a plaintext of binstrings in a context-aware manner. The encrypted bitstream has exactly the same bit-rate and is format compliant. Experimental evaluation and security analysis of the proposed algorithm are performed on several benchmark video sequences containing different combinations of motion, texture, and objects.
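To make the binarization mentioned above concrete, here is a plain Golomb-Rice binarizer with the unary prefix capped, a simplified form of the truncated Rice (TR) code; it is not the full HEVC rule (where large values escape to an Exp-Golomb suffix), and the function name and cap are illustrative:

```python
def truncated_rice(value, k, max_prefix=4):
    """Rice-binarize `value` with parameter k. The unary prefix is
    truncated at `max_prefix` ones (simplified vs. the HEVC spec, where
    larger values escape to an Exp-Golomb suffix)."""
    quotient, remainder = value >> k, value & ((1 << k) - 1)
    if quotient < max_prefix:
        prefix = "1" * quotient + "0"          # terminated unary part
    else:
        prefix = "1" * max_prefix              # truncated: no terminator
    suffix = format(remainder, "b").zfill(k) if k > 0 else ""
    return prefix + suffix

print(truncated_rice(5, 1))   # quotient 2, remainder 1 -> "1101"
```

It is these binstrings, after binarization, on which the selective-encryption scheme operates; encrypting only carefully chosen bins is what keeps the bitstream length and format unchanged.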

  16. Computer models to support investigations of surface subsidence and associated ground motion induced by underground coal gasification. [STEALTH Codes

    SciTech Connect

    Langland, R.T.; Trent, B.C.

    1981-01-01

    Two computer codes compare surface subsidence induced by underground coal gasification at Hoe Creek, Wyoming, and Centralia, Washington. Calculations with the STEALTH explicit finite-difference code are shown to match equivalent, implicit finite-element method solutions for the removal of underground material. Effects of removing roof material, varying elastic constants, investigating thermal shrinkage, and burning multiple coal seams are studied. A coupled, finite-difference continuum rigid-block caving code is used to model underground opening behavior. Numerical techniques agree qualitatively with empirical studies but, so far, underpredict ground surface displacement. The two methods, numerical and empirical, are most effective when used together. It is recommended that the thermal characteristics of coal measure rock be investigated and that additional calculations be carried out to longer times so that cooling influences can be modeled.

  17. Nye County nuclear waste repository project office independent scientific investigations program. Summary annual report, May 1996--April 1997

    SciTech Connect

    1997-05-01

    This annual summary report, prepared by Multimedia Environmental Technology, Inc. (MET) on behalf of the Nye County Nuclear Waste Project Office, summarizes the activities that were performed during the period from May 1, 1996 to April 30, 1997. These activities were conducted in support of the Independent Scientific Investigation Program (ISIP) of Nye County at the Yucca Mountain Site (YMS). The Nye County NWRPO is responsible for protecting the health and safety of the Nye County residents. NWRPO's on-site representative is responsible for designing and implementing the ISIP. Major objectives of the ISIP include: (1) investigating key issues related to conceptual design and performance of the repository that can have major impact on human health, safety, and the environment; (2) identifying areas not being addressed adequately by DOE. Nye County has identified several key scientific issues of concern that may affect repository design and performance which were not being adequately addressed by DOE, and has been conducting its own independent study to evaluate the significance of these issues.

  18. Nye County Nuclear Waste Repository Project Office independent scientific investigations program annual report, May 1997--April 1998

    SciTech Connect

    1998-07-01

    This annual summary report, prepared by the Nye County Nuclear Waste Repository Project Office (NWRPO), summarizes the activities that were performed during the period from May 1, 1997 to April 30, 1998. These activities were conducted in support of the Independent Scientific Investigation Program (ISIP) of Nye County at the Yucca Mountain Site (YMS). The Nye County NWRPO is responsible for protecting the health and safety of the Nye County residents. NWRPO's on-site representative is responsible for designing and implementing the Independent Scientific Investigation Program (ISIP). Major objectives of the ISIP include: Investigating key issues related to conceptual design and performance of the repository that can have major impact on human health, safety, and the environment; identifying areas not being addressed adequately by the Department of Energy (DOE). Nye County has identified several key scientific issues of concern that may affect repository design and performance which were not being adequately addressed by DOE. Nye County has been conducting its own independent study to evaluate the significance of these issues. This report summarizes the results of monitoring from two boreholes and the Exploratory Studies Facility (ESF) tunnel that have been instrumented by Nye County since March and April of 1995. The preliminary data and interpretations presented in this report do not constitute and should not be considered as the official position of Nye County. The ISIP presently includes borehole and tunnel instrumentation, monitoring, data analysis, and numerical modeling activities to address the concerns of Nye County.

  19. A Monte Carlo Investigation of the Analysis of Variance Applied to Non-Independent Bernoulli Variates.

    ERIC Educational Resources Information Center

    Draper, John F., Jr.

    The applicability of the Analysis of Variance, ANOVA, procedures to the analysis of dichotomous repeated measure data is described. The design models for which data were simulated in this investigation were chosen to represent simple cases of two experimental situations: situation one, in which subjects' responses to a single randomly selected set…
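The kind of simulation the study describes can be sketched as a Monte Carlo estimate of the empirical Type I error rate of the one-way ANOVA F test applied to independent Bernoulli data. The group sizes, success probability, and critical value below are hypothetical choices for illustration, not the study's design:

```python
import numpy as np

def anova_f(groups):
    """One-way ANOVA F statistic for a list of 1-D numpy samples."""
    all_x = np.concatenate(groups)
    grand = all_x.mean()
    ssb = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ssw = sum(((g - g.mean()) ** 2).sum() for g in groups)
    k, n = len(groups), len(all_x)
    return (ssb / (k - 1)) / (ssw / (n - k))

rng = np.random.default_rng(42)
reps, k, n, p = 2000, 3, 20, 0.5      # hypothetical design: 3 groups of 20
crit = 3.16                            # approx. 5% critical value of F(2, 57)
rejections = 0
for _ in range(reps):
    data = [rng.binomial(1, p, n).astype(float) for _ in range(k)]
    rejections += anova_f(data) > crit
print(rejections / reps)               # empirical alpha, nominally 0.05
```

Under the null with independent Bernoulli(0.5) responses the empirical rate stays close to the nominal 5%; the interesting cases in the study are the non-independent ones, which would require simulating correlated repeated measures instead.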

  20. Investigation of Different Constituent Encoders in a Turbo-code Scheme for Reduced Decoder Complexity

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.

    1998-01-01

    A large number of papers have been published attempting to give some analytical basis for the performance of Turbo-codes. It has been shown that performance improves with increased interleaver length. Also procedures have been given to pick the best constituent recursive systematic convolutional codes (RSCC's). However testing by computer simulation is still required to verify these results. This thesis begins by describing the encoding and decoding schemes used. Next simulation results on several memory 4 RSCC's are shown. It is found that the best BER performance at low Eb/N0 is not given by the RSCC's that were found using the analytic techniques given so far. Next the results are given from simulations using a smaller memory RSCC for one of the constituent encoders. Significant reduction in decoding complexity is obtained with minimal loss in performance. Simulation results are then given for a rate 1/3 Turbo-code with the result that this code performed as well as a rate 1/2 Turbo-code as measured by the distance from their respective Shannon limits. Finally the results of simulations where an inaccurate noise variance measurement was used are given. From this it was observed that Turbo-decoding is fairly stable with regard to noise variance measurement.
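A sketch of one constituent encoder of the kind discussed above: a small recursive systematic convolutional (RSC) encoder. This one is memory 2 with feedback polynomial 1+D+D² (7 octal) and feedforward 1+D² (5 octal), i.e. the classic (1, 5/7) code; the memory-4 codes in the study are the same structure with longer registers:

```python
def rsc_encode(bits):
    """Return (systematic, parity) bit lists for the (1, 5/7) RSC code."""
    s1 = s2 = 0                      # shift-register state
    parity = []
    for u in bits:
        a = u ^ s1 ^ s2              # feedback taps: 1 + D + D^2
        p = a ^ s2                   # feedforward taps: 1 + D^2
        parity.append(p)
        s1, s2 = a, s1               # shift the register
    return list(bits), parity

sys_bits, par_bits = rsc_encode([1, 0, 0, 1])
print(sys_bits, par_bits)
```

In a Turbo-code, two such encoders run in parallel, the second fed by an interleaved copy of the input, and the parity streams are punctured to reach the target rate.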

  1. Investigating the diversity of pseudomonas spp. in soil using culture dependent and independent techniques.

    PubMed

    Li, Lili; Abu Al-Soud, Waleed; Bergmark, Lasse; Riber, Leise; Hansen, Lars H; Magid, Jakob; Sørensen, Søren J

    2013-10-01

    Less than 1 % of bacterial populations present in environmental samples are culturable, meaning that cultivation will lead to an underestimation of total cell counts and total diversity. However, it is less clear whether this is also true for specific well-defined groups of bacteria for which selective culture media are available. In this study, we use culture-dependent and -independent techniques to describe whether isolation of Pseudomonas spp. on the selective nutrient-poor NAA 1:100 agar medium can reflect the full diversity, found by pyrosequencing, of the total soil Pseudomonas community in an urban waste field trial experiment. Approximately 3,600 bacterial colonies were isolated using nutrient-poor NAA 1:100 medium from soils treated with different fertilizers: (i) high N-level sewage sludge (SA), (ii) high N-level cattle manure (CMA), and (iii) unfertilized control soil (U). Based on Pseudomonas-specific quantitative PCR and Pseudomonas CFU counts, less than 4 % of Pseudomonas spp. were culturable using NAA 1:100 medium. The Pseudomonas selectivity and specificity of the culture medium were evaluated by 454 pyrosequencing of 16S rRNA gene amplicons generated using Bacteria- and Pseudomonas-specific primers. Pyrosequencing results showed that most isolates were Pseudomonas and that the culturable fraction of Pseudomonas spp. reflects most clusters of the total Pseudomonas diversity in soil. This indicates that NAA 1:100 medium is highly selective for Pseudomonas species, and reveals the ability of NAA 1:100 medium to culture mostly the dominant Pseudomonas species in soil.

  2. A model-independent investigation on quasi-degenerate neutrino mass models and their significance

    NASA Astrophysics Data System (ADS)

    Roy, Subhankar; Singh, N. Nimai

    2013-12-01

    The prediction of a possible hierarchy of neutrino masses mostly depends on the model chosen. Dissociating the μ-τ interchange symmetry from models based on discrete flavor symmetry makes the neutrino mass matrix less predictive and motivates one to seek the answer from different phenomenological frameworks. This calls for a proper parametrization of the neutrino mass matrices for the individual hierarchies. In this work, an attempt has been made to study six different cases of quasi-degenerate (QDN) neutrino models with mass matrices m_LL^ν parametrized with two free parameters (α, η), the standard Wolfenstein parameter (λ), and an input mass scale m0 ~ 0.08 eV. We start with a μ-τ symmetric neutrino mass matrix followed by a correction from the charged lepton sector. The parametrization emphasizes the existence of four independent texture-zero building blocks common to all the QDN models under the μ-τ symmetric framework, and is found to be invariant under any choice of solar angle. In our parametrization, the solar angle is controlled from the neutrino sector, whereas the charged lepton sector drives the reactor and atmospheric mixing angles. The individual models are tested in the framework of oscillation experiments, cosmological observation, and future experiments involving β-decay and 0νββ, and no reason is found to discard the QDN mass models with relatively lower mass. Although the QDNH-Type IA model shows a strong preference for sin²θ12 = 0.32, this is not sufficient to rule out the other models. The present work leaves scope to extend the search for the most favorable QDN mass model using the observed baryon asymmetry of the Universe.

  3. Role asymmetry and code transmission in signaling games: an experimental and computational investigation.

    PubMed

    Moreno, Maggie; Baggio, Giosuè

    2015-07-01

    In signaling games, a sender has private access to a state of affairs and uses a signal to inform a receiver about that state. If no common association of signals and states is initially available, sender and receiver must coordinate to develop one. How do players divide coordination labor? We show experimentally that, if players switch roles at each communication round, coordination labor is shared. However, in games with fixed roles, coordination labor is divided: Receivers adjust their mappings more frequently, whereas senders maintain the initial code, which is transmitted to receivers and becomes the common code. In a series of computer simulations, player and role asymmetry as observed experimentally were accounted for by a model in which the receiver in the first signaling round has a higher chance of adjusting its code than its partner. From this basic division of labor among players, certain properties of role asymmetry, in particular correlations with game complexity, are seen to follow.
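
    The role asymmetry described above lends itself to a toy simulation: sender and receiver each hold a mapping, and after a failed round the receiver re-maps with much higher probability than the sender, so the sender's initial code tends to become the common code. The state space, update probabilities, and update rule below are illustrative assumptions, not the authors' model:

```python
import random

def play(n_states=4, p_receiver_adjust=0.9, p_sender_adjust=0.1,
         rounds=2000, seed=1):
    """Toy fixed-role signaling game. After each failed round the receiver
    adopts the sender's use of the signal with high probability, while the
    sender only rarely re-maps. Returns the overall success rate."""
    rng = random.Random(seed)
    states = list(range(n_states))
    sender = {s: rng.randrange(n_states) for s in states}    # state -> signal
    receiver = {g: rng.randrange(n_states) for g in states}  # signal -> state
    successes = 0
    for _ in range(rounds):
        state = rng.choice(states)
        signal = sender[state]
        if receiver[signal] == state:
            successes += 1
        else:
            if rng.random() < p_receiver_adjust:
                receiver[signal] = state          # receiver adjusts its code
            if rng.random() < p_sender_adjust:
                sender[state] = rng.randrange(n_states)
    return successes / rounds

print(f"success rate over 2000 rounds: {play():.2f}")
```

    With these (hypothetical) probabilities, most coordination labor falls on the receiver, mirroring the fixed-role condition reported in the experiments.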

  4. Clinical investigation of TROP-2 as an independent biomarker and potential therapeutic target in colon cancer.

    PubMed

    Zhao, Peng; Yu, Hai-Zheng; Cai, Jian-Hui

    2015-09-01

    Colon cancer is associated with a severe demographic and economic burden worldwide. Its pathogenesis is highly complex, involving sequential genetic and epigenetic mechanisms, and despite extensive investigation it remains to be fully elucidated. As the third most common type of cancer worldwide, colon cancer currently has limited treatment options. Human trophoblast cell-surface marker (TROP-2) is a cell-surface transmembrane glycoprotein overexpressed by several types of epithelial carcinoma, and has been demonstrated to be associated with tumorigenesis and invasiveness in solid tumors. The aim of the present study was to investigate the protein expression of TROP-2 in colon cancer tissues, and further explore the association between TROP-2 expression and the clinicopathological features of patients with colon cancer. The expression and localization of the TROP-2 protein were examined using western blot analysis and immunofluorescence staining. Finally, TROP-2 expression was correlated with conventional clinicopathological features of colon cancer using a χ2 test. The results revealed that TROP-2 protein was expressed at high levels in the colon cancer tissues, and that this expression was associated with the development and pathological progression of colon cancer. Therefore, TROP-2 may be used as a biomarker to determine clinical prognosis, and as a potential therapeutic target in colon cancer.

  5. Write to Read: Investigating the Reading-Writing Relationship of Code-Level Early Literacy Skills

    ERIC Educational Resources Information Center

    Jones, Cindy D.; Reutzel, D. Ray

    2015-01-01

    The purpose of this study was to examine whether the code-related features used in current methods of writing instruction in kindergarten classrooms transfer to reading outcomes for kindergarten students. We randomly assigned kindergarten students to 3 instructional groups: a writing workshop group, an interactive writing group, and a control group.…

  6. THE CODE OF THE STREET AND INMATE VIOLENCE: INVESTIGATING THE SALIENCE OF IMPORTED BELIEF SYSTEMS*

    PubMed Central

    MEARS, DANIEL P.; STEWART, ERIC A.; SIENNICK, SONJA E.; SIMONS, RONALD L.

    2013-01-01

    Scholars have long argued that inmate behaviors stem in part from cultural belief systems that they “import” with them into incarcerative settings. Even so, few empirical assessments have tested this argument directly. Drawing on theoretical accounts of one such set of beliefs—the code of the street—and on importation theory, we hypothesize that individuals who adhere more strongly to the street code will be more likely, once incarcerated, to engage in violent behavior and that this effect will be amplified by such incarceration experiences as disciplinary sanctions and gang involvement, as well as the lack of educational programming, religious programming, and family support. We test these hypotheses using unique data that include measures of the street code belief system and incarceration experiences. The results support the argument that the code of the street belief system affects inmate violence and that the effect is more pronounced among inmates who lack family support, experience disciplinary sanctions, and are gang involved. Implications of these findings are discussed. PMID:24068837

  7. Investigation of low temperature solid oxide fuel cells for air-independent UUV applications

    NASA Astrophysics Data System (ADS)

    Moton, Jennie Mariko

    Unmanned underwater vehicles (UUVs) will benefit greatly from high-energy-density (> 500 Wh/L) power systems utilizing high-energy-density fuels and air-independent oxidizers. Current battery-based systems have limited energy densities (< 400 Wh/L), which motivates development of alternative power systems such as solid oxide fuel cells (SOFCs). SOFC-based power systems have the potential to achieve the required UUV energy densities, and the current study explores how SOFCs based on gadolinia-doped ceria (GDC) electrolytes with operating temperatures of 650°C and lower may operate in the unique environments of a promising UUV power plant. The plant would contain a H2O2 decomposition reactor to supply humidified O2 to the SOFC cathode and an exothermic aluminum/H2O combustor to provide heated, humidified H2 fuel to the anode. To characterize low-temperature SOFC performance with these unique O2 and H2 sources, SOFC button cells based on nickel/GDC (Gd0.1Ce0.9O1.95) anodes, GDC electrolytes, and lanthanum strontium cobalt ferrite (La0.6Sr0.4Co0.2Fe0.8O3-δ or LSCF)/GDC cathodes were fabricated and tested for performance and stability with humidity on both the anode and the cathode. Cells were also tested with various reactant concentrations of H2 and O2 to simulate gas depletion down the channel of an SOFC stack. Results showed that anode performance depended primarily on fuel concentration and less on the concentration of the associated increase in product H2O. O2 depletion with humidified cathode flows also caused significant loss in cell current density at a given voltage. With humidified flows in either the anode or cathode, stability tests of the button cells at 650°C showed that stable voltage is maintained at low operating current (0.17 A/cm2) at up to 50% by mole H2O, but at higher current densities (0.34 A/cm2), irreversible voltage degradation occurred at rates of 0.8-3.7 mV/hour depending on exposure time. From these button cell results, estimated average

  8. An Investigation of Two Acoustic Propagation Codes for Three-Dimensional Geometries

    NASA Technical Reports Server (NTRS)

    Nark, D. M.; Watson, W. R.; Jones, M. G.

    2005-01-01

    The ability to predict fan noise within complex three-dimensional aircraft engine nacelle geometries is a valuable tool in studying low-noise designs. Recent years have seen the development of aeroacoustic propagation codes using various levels of approximation to obtain such a capability. In light of this, it is beneficial to pursue a design paradigm that incorporates the strengths of the various tools. The development of a quasi-3D methodology (Q3D-FEM) at NASA Langley has brought these ideas to mind in relation to the framework of the CDUCT-LaRC acoustic propagation and radiation tool. As more extensive three dimensional codes become available, it would seem appropriate to incorporate these tools into a framework similar to CDUCT-LaRC and use them in a complementary manner. This work focuses on such an approach in beginning the steps toward a systematic assessment of the errors, and hence the trade-offs, involved in the use of these codes. To illustrate this point, CDUCT-LaRC was used to study benchmark hardwall duct problems to quantify errors caused by wave propagation in directions far removed from that defined by the parabolic approximation. Configurations incorporating acoustic treatment were also studied with CDUCT-LaRC and Q3D-FEM. The cases presented show that acoustic treatment diminishes the effects of CDUCT-LaRC phase error as the solutions are attenuated. The results of the Q3D-FEM were very promising and matched the analytic solution very well. Overall, these tests were meant to serve as a step toward the systematic study of errors inherent in the propagation module of CDUCT-LaRC, as well as an initial test of the higher fidelity Q3D-FEM code.

  9. Training camp: The quest to become a new National Institutes of Health (NIH)-funded independent investigator

    NASA Astrophysics Data System (ADS)

    Sklare, Daniel A.

    2003-04-01

    This presentation will provide information on the research training and career development programs of the National Institute on Deafness and Other Communication Disorders (NIDCD). The predoctoral and postdoctoral fellowship (F30, F31, F32) programs and the research career development awards for clinically trained individuals (K08/K23) and for individuals trained in the quantitative sciences and in engineering (K25) will be highlighted. In addition, the role of the NIDCD Small Grant (R03) in transitioning postdoctoral-level investigators to research independence will be underscored.

  10. Dimensionality of ICA in resting-state fMRI investigated by feature optimized classification of independent components with SVM.

    PubMed

    Wang, Yanlu; Li, Tie-Qiang

    2015-01-01

    Different machine learning algorithms have recently been used to assist automated classification of independent component analysis (ICA) results from resting-state fMRI data. The success of this approach relies on identification of artifact components and meaningful functional networks. A limiting factor of ICA is the uncertainty in the number of independent components (NIC). We aimed to develop a framework based on support vector machines (SVM) and optimized feature selection for automated classification of independent components (ICs), and to use the framework to investigate the effects of the input NIC on the ICA results. Seven different resting-state fMRI datasets were studied. Eighteen features were devised by mimicking the empirical criteria for manual evaluation. The five most significant (p < 0.01) features were identified by general linear modeling and used to generate a classification model for the framework. This feature-optimized classification of ICs with SVM (FOCIS) framework was used to classify both group and single-subject ICA results. The classification results obtained using FOCIS and the previously published FSL-FIX were compared against manually evaluated results. On average, the false negative rates in identifying artifact-contaminated ICs for FOCIS and FSL-FIX were 98.27% and 92.34%, respectively. The number of artifact and functional network components increased almost linearly with the input NIC. Through tracking, we demonstrate that incrementing NIC affects most ICs when NIC < 33, whereas only a few ICs are affected by direct splitting when NIC is incremented beyond 40. For a given IC, its changes with increasing NIC are individually specific, irrespective of whether the component is a potential resting-state functional network or an artifact component. Using FOCIS, we investigated experimentally the ICA dimensionality of resting-state fMRI datasets and found that the input NIC can critically affect the ICA results of resting-state fMRI data.

  11. Dimensionality of ICA in resting-state fMRI investigated by feature optimized classification of independent components with SVM

    PubMed Central

    Wang, Yanlu; Li, Tie-Qiang

    2015-01-01

    Different machine learning algorithms have recently been used to assist automated classification of independent component analysis (ICA) results from resting-state fMRI data. The success of this approach relies on identification of artifact components and meaningful functional networks. A limiting factor of ICA is the uncertainty in the number of independent components (NIC). We aimed to develop a framework based on support vector machines (SVM) and optimized feature selection for automated classification of independent components (ICs), and to use the framework to investigate the effects of the input NIC on the ICA results. Seven different resting-state fMRI datasets were studied. Eighteen features were devised by mimicking the empirical criteria for manual evaluation. The five most significant (p < 0.01) features were identified by general linear modeling and used to generate a classification model for the framework. This feature-optimized classification of ICs with SVM (FOCIS) framework was used to classify both group and single-subject ICA results. The classification results obtained using FOCIS and the previously published FSL-FIX were compared against manually evaluated results. On average, the false negative rates in identifying artifact-contaminated ICs for FOCIS and FSL-FIX were 98.27% and 92.34%, respectively. The number of artifact and functional network components increased almost linearly with the input NIC. Through tracking, we demonstrate that incrementing NIC affects most ICs when NIC < 33, whereas only a few ICs are affected by direct splitting when NIC is incremented beyond 40. For a given IC, its changes with increasing NIC are individually specific, irrespective of whether the component is a potential resting-state functional network or an artifact component. Using FOCIS, we investigated experimentally the ICA dimensionality of resting-state fMRI datasets and found that the input NIC can critically affect the ICA results of resting-state fMRI data.

  12. Investigation of independence in inter-animal tumor-type occurrences within the NTP rodent-bioassay database

    SciTech Connect

    Bogen, K.T.; Seilkop, S.

    1993-05-01

    Statistically significant elevation in tumor incidence at multiple histologically distinct sites is occasionally observed among rodent bioassays of chemically induced carcinogenesis. If such data are to be relied on (as they have been, e.g., by the US EPA) for quantitative cancer potency assessment, their proper analysis requires a knowledge of the extent to which multiple tumor-type occurrences are independent or uncorrelated within individual bioassay animals. Although difficult to assess in a statistically rigorous fashion, a few significant associations among tumor-type occurrences in rodent bioassays have been reported. However, no comprehensive studies of animal-specific tumor-type occurrences at death or sacrifice have been conducted using the extensive set of available NTP rodent-bioassay data, on which most cancer-potency assessment for environmental chemicals is currently based. This report presents the results of such an analysis conducted on behalf of the National Research Council's Committee on Risk Assessment for Hazardous Air Pollutants. Tumor-type associations among individual animals were examined for approximately 2,500 to 3,000 control and approximately 200 to 600 treated animals using pathology data from 62 B6C3F1 mouse studies and 61 F344/N rat studies obtained from a readily available subset of the NTP carcinogenesis bioassay database. No evidence was found for any large correlation in either the onset probability or the prevalence at death or sacrifice of any tumor-type pair investigated in control and treated rats and mice, although a few of the small correlations present were statistically significant. Tumor-type occurrences were in most cases nearly independent, and departures from independence, where they did occur, were small. This finding is qualified in that tumor-type onset correlations were measured only indirectly, given the limited nature of the data analyzed.
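
    The pairwise associations examined here are typically quantified per tumor-type pair from a 2x2 table of animal counts; the phi coefficient below is one standard measure of such association, and the counts are hypothetical, not taken from the NTP database:

```python
import math

def phi_coefficient(n11, n10, n01, n00):
    """Phi (Pearson) correlation for a 2x2 table of joint occurrences:
    n11 animals with both tumor types, n10 with only type A, n01 with
    only type B, n00 with neither. Zero indicates independence."""
    num = n11 * n00 - n10 * n01
    den = math.sqrt((n11 + n10) * (n01 + n00) * (n11 + n01) * (n10 + n00))
    return num / den if den else 0.0

# Hypothetical pair in 3000 control animals: 100 have tumor A, 300 tumor B.
# Under independence ~10 animals would show both; here 12 are observed.
print(f"phi = {phi_coefficient(12, 88, 288, 2612):.4f}")
```

    A phi this close to zero is the kind of small, near-independent association the analysis reports.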

  13. Investigations of high-speed optical transmission systems employing Absolute Added Correlative Coding (AACC)

    NASA Astrophysics Data System (ADS)

    Dong-Nhat, Nguyen; Elsherif, Mohamed A.; Malekmohammadi, Amin

    2016-07-01

    A novel multilevel modulation format based on partial-response signaling, called Absolute Added Correlative Coding (AACC), is proposed and numerically demonstrated for high-speed fiber-optic communication systems. A bit error rate (BER) estimation model for the proposed multilevel format has also been developed. The performance of AACC is examined and compared against other prevailing on-off-keying and multilevel modulation formats, e.g., non-return-to-zero (NRZ), 50% return-to-zero (RZ), 67% carrier-suppressed return-to-zero (CS-RZ), duobinary, and four-level pulse-amplitude modulation (4-PAM), in terms of receiver sensitivity, spectral efficiency, and dispersion tolerance. The calculated receiver sensitivity at a BER of 10⁻⁹ and the chromatic dispersion tolerance of the proposed system are ∼-28.3 dBm and ∼336 ps/nm, respectively. The receiver sensitivity of AACC is shown to improve by 7.8 dB compared to 4-PAM in the back-to-back scenario. The comparison results also show a clear advantage of AACC in achieving longer fiber transmission distance, due to its higher dispersion tolerance, in optical access networks.
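
    The BER = 10⁻⁹ reference point used for receiver sensitivity corresponds, for a binary decision in additive Gaussian noise, to a Q-factor of about 6. A quick check of that standard relation (not specific to AACC):

```python
import math

def ber_from_q(q):
    """Bit error rate for a binary decision in Gaussian noise:
    BER = 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * math.erfc(q / math.sqrt(2.0))

print(f"Q = 6 -> BER = {ber_from_q(6.0):.2e}")  # ~1e-9
```

    Multilevel formats such as 4-PAM need a higher optical power to keep the same Q per decision threshold, which is why receiver-sensitivity comparisons are quoted at a fixed BER.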

  14. Investigations of high-speed optical transmission systems employing Absolute Added Correlative Coding (AACC)

    NASA Astrophysics Data System (ADS)

    Dong-Nhat, Nguyen; Elsherif, Mohamed A.; Malekmohammadi, Amin

    2016-07-01

    A novel multilevel modulation format based on partial-response signaling, called Absolute Added Correlative Coding (AACC), is proposed and numerically demonstrated for high-speed fiber-optic communication systems. A bit error rate (BER) estimation model for the proposed multilevel format has also been developed. The performance of AACC is examined and compared against other prevailing on-off-keying and multilevel modulation formats, e.g., non-return-to-zero (NRZ), 50% return-to-zero (RZ), 67% carrier-suppressed return-to-zero (CS-RZ), duobinary, and four-level pulse-amplitude modulation (4-PAM), in terms of receiver sensitivity, spectral efficiency, and dispersion tolerance. The calculated receiver sensitivity at a BER of 10⁻⁹ and the chromatic dispersion tolerance of the proposed system are ∼-28.3 dBm and ∼336 ps/nm, respectively. The receiver sensitivity of AACC is shown to improve by 7.8 dB compared to 4-PAM in the back-to-back scenario. The comparison results also show a clear advantage of AACC in achieving longer fiber transmission distance, due to its higher dispersion tolerance, in optical access networks.

  15. Theoretical models and simulation codes to investigate bystander effects and cellular communication at low doses

    NASA Astrophysics Data System (ADS)

    Ballarini, F.; Alloni, D.; Facoetti, A.; Mairani, A.; Nano, R.; Ottolenghi, A.

    Astronauts in space are continuously exposed to low doses of ionizing radiation from Galactic Cosmic Rays. During the last ten years, the effects of low radiation doses have been widely re-discussed, following a large number of observations on so-called non-targeted effects, in particular bystander effects. The latter consist of induction of cytogenetic damage in cells not directly traversed by radiation, most likely as a response to molecular messengers released by directly irradiated cells. Bystander effects, which are observed both for lethal endpoints (e.g., clonogenic inactivation and apoptosis) and for non-lethal ones (e.g., mutations and neoplastic transformation), tend to show non-linear dose responses. This might have significant consequences in terms of low-dose risk, which is generally calculated on the basis of the Linear No Threshold hypothesis. Although the mechanisms underlying bystander effects are still largely unknown, it is now clear that two types of cellular communication (i.e., via gap junctions and/or release of molecular messengers into the extracellular environment) play a fundamental role. Theoretical models and simulation codes can be of help in elucidating such mechanisms. In the present paper we will review different available modelling approaches, including one that is being developed at the University of Pavia. The focus will be on the different assumptions adopted by the various authors and on the implications of such assumptions in terms of non-targeted radiobiological damage and, more generally, low-dose risk.
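
    The contrast with the Linear No Threshold hypothesis can be illustrated with a toy dose-response in which a saturating bystander term dominates at very low doses. All parameter values below are hypothetical, chosen only to show the qualitative non-linearity, not any published model:

```python
import math

def lnt_risk(dose_gy, alpha=0.05):
    """Linear No Threshold: excess risk proportional to dose."""
    return alpha * dose_gy

def bystander_risk(dose_gy, alpha=0.05, b=0.02, d0=0.1):
    """Toy non-linear response: the LNT term plus a bystander term that
    saturates once most cells have received a signal (parameters are
    illustrative, not fitted to data)."""
    return alpha * dose_gy + b * (1.0 - math.exp(-dose_gy / d0))

for d in (0.01, 0.1, 1.0):
    print(f"dose {d:5.2f} Gy: LNT {lnt_risk(d):.4f}  non-linear {bystander_risk(d):.4f}")
```

    At 0.01 Gy the bystander term already dominates in this toy model, so a linear extrapolation from high doses would underestimate the low-dose response, which is the kind of risk implication the abstract raises.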

  16. Investigating What Undergraduate Students Know About Science: Results from Complementary Strategies to Code Open-Ended Responses

    NASA Astrophysics Data System (ADS)

    Tijerino, K.; Buxner, S.; Impey, C.; CATS

    2013-04-01

    This paper presents new findings from an ongoing study of undergraduate student science literacy. Using data drawn from a 22-year project and over 11,000 student responses, we examine how students' word usage in open-ended responses relates to what it means to study something scientifically. Analysis of students' responses shows that they readily use words commonly associated with science, such as hypothesis, study, method, test, and experiment; but do these responses use scientific words knowledgeably? As with many multifaceted disciplines, demonstration of comprehension varies. This paper presents three different ways that student responses have been coded to investigate their understanding of science: 1) differentiating the quality of a response with a coding scheme; 2) using word counts as an indicator of overall response strength; and 3) coding responses for quality. Building on previous research, comparison of science literacy scores and open-ended responses demonstrates that knowledge of science facts and vocabulary does not indicate comprehension of the concepts behind those facts and vocabulary. This study employs quantitative and qualitative methods to systematically determine the frequency and meaning of responses to standardized questions, and illustrates how students are able to demonstrate knowledge of vocabulary. This knowledge, however, is not indicative of conceptual understanding, and poses important questions about how we assess students' understanding of science.
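
    The word-counting strategy (2 above) can be sketched as a frequency count over a target science vocabulary; the term list and sample responses below are illustrative only, not the study's actual coding scheme:

```python
import re
from collections import Counter

SCIENCE_TERMS = {"hypothesis", "study", "method", "test", "experiment"}

def term_counts(responses):
    """Count occurrences of target science vocabulary across a set of
    open-ended responses; totals can serve as a crude indicator of
    response strength."""
    counts = Counter()
    for text in responses:
        for word in re.findall(r"[a-z]+", text.lower()):
            if word in SCIENCE_TERMS:
                counts[word] += 1
    return counts

responses = [
    "You form a hypothesis and test it with an experiment.",
    "To study something scientifically means using the scientific method.",
]
print(term_counts(responses))
```

    As the abstract notes, such counts capture vocabulary use but say nothing about whether the concepts behind the words are understood.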

  17. National evaluation of the benefits and risks of greater structuring and coding of the electronic health record: exploratory qualitative investigation

    PubMed Central

    Morrison, Zoe; Fernando, Bernard; Kalra, Dipak; Cresswell, Kathrin; Sheikh, Aziz

    2014-01-01

    Objective We aimed to explore stakeholder views, attitudes, needs, and expectations regarding likely benefits and risks resulting from increased structuring and coding of clinical information within electronic health records (EHRs). Materials and methods Qualitative investigation in primary and secondary care and research settings throughout the UK. Data were derived from interviews, expert discussion groups, observations, and relevant documents. Participants (n=70) included patients, healthcare professionals, health service commissioners, policy makers, managers, administrators, systems developers, researchers, and academics. Results Four main themes arose from our data: variations in documentation practice; patient care benefits; secondary uses of information; and informing and involving patients. We observed a lack of guidelines, co-ordination, and dissemination of best practice relating to the design and use of information structures. While we identified immediate benefits for direct care and secondary analysis, many healthcare professionals did not see the relevance of structured and/or coded data to clinical practice. The potential for structured information to increase patient understanding of their diagnosis and treatment contrasted with concerns regarding the appropriateness of coded information for patients. Conclusions The design and development of EHRs requires the capture of narrative information to reflect patient/clinician communication and computable data for administration and research purposes. Increased structuring and/or coding of EHRs therefore offers both benefits and risks. Documentation standards within clinical guidelines are likely to encourage comprehensive, accurate processing of data. As data structures may impact upon clinician/patient interactions, new models of documentation may be necessary if EHRs are to be read and authored by patients. PMID:24186957

  18. Investigation into the flow field around a maneuvering submarine using a Reynolds-averaged Navier-Stokes code

    NASA Astrophysics Data System (ADS)

    Rhee, Bong

    The accurate and efficient prediction of hydrodynamic forces and moments on a maneuvering submarine has been achieved by investigating the flow physics involving the interaction of the vortical flow shed from the sail and the cross-flow boundary layer of the hull. In this investigation, a Reynolds-Averaged Navier-Stokes (RANS) computer code is used to simulate the most important physical effects related to maneuvering. It is applied to a generic axisymmetric body with the relatively simple case of the flow around an unappended hull at an angle of attack. After the code is validated for this simple case, it is validated for the case of a submarine with various appendages attached to the hull moving at an angle of drift. All six components of predicted forces and moments for various drift angles are compared with experimental data. Calculated pressure coefficients along the azimuthal angle are compared with experimental data and discussed to show the effect of the sail and the stern appendages. To understand the main flow features for a submarine in a straight flight, the RANS code is applied to simulate SUBOFF axisymmetric body at zero angle of attack in a straight-line basin. Pressure coefficient, skin friction coefficient, mean velocity components and the Reynolds shear stresses are compared with experimental data and discussed. The physical aspects of the interaction between the vortical flow shed by the sail and the cross-flow boundary layer on the hull are explained in greater detail. The understanding of this interaction is very important to characterize accurately the hydrodynamic behavior of a maneuvering submarine.

  19. Investigations on the sensitivity of the computer code TURBO-2D

    NASA Astrophysics Data System (ADS)

    Amon, B.

    1994-12-01

    The two-dimensional computer model TURBO-2D for the calculation of two-phase flow was used to simulate the cold injection of fuel into a model chamber. The sensitivity of the computed results to the input parameters was investigated. In addition, calculations using experimental injection-pressure data and the corresponding averaged injection parameters were performed and compared.

  20. Investigation of wellbore cooling by circulation and fluid penetration into the formation using a wellbore thermal simulator computer code

    SciTech Connect

    Duda, L.E.

    1987-01-01

    The high temperatures of geothermal wells present severe problems for drilling, logging, and developing these reservoirs. Cooling the wellbore is perhaps the most common method to solve these problems. However, it is usually not clear what may be the most effective wellbore cooling mechanism for a given well. In this paper, wellbore cooling by the use of circulation or by fluid injection into the surrounding rock is investigated using a wellbore thermal simulator computer code. Short circulation times offer no prolonged cooling of the wellbore, but long circulation times (greater than ten or twenty days) greatly reduce the warming rate after shut-in. The dependence of the warming rate on the penetration distance of cooler temperatures into the rock formation (as by fluid injection) is investigated. Penetration distances of greater than 0.6 m appear to offer a substantial reduction in the warming rate. Several plots are shown which demonstrate these effects.

  1. Performance investigation of the pulse and Campbelling modes of a fission chamber using a Poisson pulse train simulation code

    NASA Astrophysics Data System (ADS)

    Elter, Zs.; Jammes, C.; Pázsit, I.; Pál, L.; Filliatre, P.

    2015-02-01

    The detectors of the neutron flux monitoring system of the foreseen French GEN-IV sodium-cooled fast reactor (SFR) will be high-temperature fission chambers placed in the reactor vessel in the vicinity of the core. The operation of a fission chamber over a wide range of neutron flux will be feasible provided that the overlap of the applicability of its pulse and Campbelling operational modes is ensured. This paper addresses the question of the linearity of these two modes and presents our recent efforts to develop a specific code for the simulation of fission chamber pulse trains. The developed simulation code is described and its overall verification is shown. An extensive quantitative investigation was performed to explore the applicability limits of the two standard modes. It was found that for short pulses the overlap between the pulse and Campbelling modes can be guaranteed if the standard deviation of the background noise is not higher than 5% of the pulse amplitude. It was also shown that the Campbelling mode is sensitive to parasitic noise, while the performance of the pulse mode is affected by stochastic amplitude distributions.
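
    The Campbelling (fluctuation) mode rests on Campbell's theorem: for a Poisson train of identical pulses, both the mean and the variance of the summed signal are proportional to the event rate, so the rate can be recovered from the variance even when individual pulses pile up. A minimal discrete-time sketch (rectangular pulses; parameters arbitrary, not the paper's simulation code):

```python
import random

def campbell_simulation(rate, amp, pulse_len, n_bins=200_000, seed=7):
    """Simulate a pulse train: each time bin starts a new rectangular
    pulse with probability `rate` (per bin, rate << 1); overlapping
    pulses add. Campbell's theorem (Poisson limit):
    mean ~ rate*amp*pulse_len, variance ~ rate*amp**2*pulse_len."""
    rng = random.Random(seed)
    signal = [0.0] * n_bins
    for i in range(n_bins):
        if rng.random() < rate:
            for j in range(i, min(i + pulse_len, n_bins)):
                signal[j] += amp
    mean = sum(signal) / n_bins
    var = sum((s - mean) ** 2 for s in signal) / n_bins
    return mean, var

mean, var = campbell_simulation(rate=0.02, amp=1.0, pulse_len=10)
print(f"mean = {mean:.3f} (theory ~0.2), variance = {var:.3f} (theory ~0.2)")
```

    Because the variance scales with amp**2, the Campbelling estimate weights large pulses more heavily, which is one reason it is more sensitive to parasitic noise than simple pulse counting.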

  2. Investigation of Nuclear Data Libraries with TRIPOLI-4 Monte Carlo Code for Sodium-cooled Fast Reactors

    NASA Astrophysics Data System (ADS)

    Lee, Y.-K.; Brun, E.

    2014-04-01

    The sodium-cooled fast neutron reactor ASTRID is currently under design and development in France. The traditional ECCO/ERANOS fast reactor code system used for ASTRID core design calculations relies on a multi-group JEFF-3.1.1 data library. To gauge the use of the ENDF/B-VII.0 and JEFF-3.1.1 nuclear data libraries in fast reactor applications, two recent OECD/NEA computational benchmarks specified by Argonne National Laboratory were calculated. Using the continuous-energy TRIPOLI-4 Monte Carlo transport code, both the ABR-1000 MWth MOX core and the metallic (U-Pu) core were investigated. Reactivity impact studies were performed under the two different fast neutron spectra with the two data libraries, ENDF/B-VII.0 and JEFF-3.1.1. Using the JEFF-3.1.1 library under the BOEC (beginning of equilibrium cycle) condition, high reactivity effects of 808 ± 17 pcm and 1208 ± 17 pcm were observed for the ABR-1000 MOX core and the metallic core, respectively. To analyze the causes of these reactivity differences, several TRIPOLI-4 runs using the mixed data libraries feature allowed us to identify the nuclides and the nuclear data accounting for the major part of the observed reactivity discrepancies.
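
    The reactivity effects quoted in pcm are differences of rho = (k - 1)/k between two k-effective results for the same core. A short helper makes the unit concrete; the k-eff pair below is hypothetical, chosen only to reproduce an effect of roughly 800 pcm:

```python
def reactivity_pcm(k_eff):
    """Static reactivity rho = (k - 1)/k, expressed in pcm (1 pcm = 1e-5)."""
    return (k_eff - 1.0) / k_eff * 1e5

def library_swap_effect_pcm(k_lib1, k_lib2):
    """Reactivity difference between two k-eff results for the same core,
    e.g. ENDF/B-VII.0 vs. JEFF-3.1.1: delta_rho = (1/k1 - 1/k2) * 1e5."""
    return (1.0 / k_lib1 - 1.0 / k_lib2) * 1e5

# Hypothetical pair: the library swap raises k-eff from 1.00000 to 1.00815.
print(f"{library_swap_effect_pcm(1.00000, 1.00815):.0f} pcm")  # 808 pcm
```

    A shift of ~800 pcm is large on this scale: typical design targets for nuclear-data-induced uncertainty are a few hundred pcm, which is why the nuclide-by-nuclide breakdown matters.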

  3. Condition Self-Management in Pediatric Spina Bifida: A Longitudinal Investigation of Medical Adherence, Responsibility-Sharing, and Independence Skills

    PubMed Central

    Psihogios, Alexandra M.; Kolbuck, Victoria

    2015-01-01

    Objective This study aimed to evaluate rates of medical adherence, responsibility, and independence skills across late childhood and adolescence in youth with spina bifida (SB) and to explore associations among these disease self-management variables. Method 111 youth with SB, their parents, and a health professional participated at two time points. Informants completed questionnaires regarding medical adherence, responsibility-sharing, and child independence skills. Results Youth gained more responsibility and independence skills across time, although adherence rates did not follow a similar trajectory. Increased child medical responsibility was related to poorer adherence, and father-reported independence skills were associated with increased child responsibility. Conclusions This study highlights medical domains that are the most difficult for families to manage (e.g., skin checks). Although youth appear to gain more autonomy across time, ongoing parental involvement in medical care may be necessary to achieve optimal adherence across adolescence. PMID:26002195

  4. Independent assessment of TRAC-PF1 (Version 7. 0), RELAP5/MOD1 (Cycle 14), and TRAC-BD1 (Version 12. 0) codes using separate-effects experiments

    SciTech Connect

    Saha, P; Jo, J H; Neymotin, L; Rohatgi, U S; Slovik, G C; Yuelys-Miksis, C

    1985-08-01

    This report presents the results of independent code assessment conducted at BNL. The TRAC-PF1 (Version 7.0) and RELAP5/MOD1 (Cycle 14) codes were assessed using critical flow tests, a level swell test, countercurrent flow limitation (CCFL) tests, a post-CHF test, steam generator thermal performance tests, and natural circulation tests. TRAC-BD1 (Version 12.0) was applied only to the CCFL and post-CHF tests. The TRAC-PWR series of codes, i.e., TRAC-P1A, TRAC-PD2, and TRAC-PF1, has been gradually improved. However, TRAC-PF1 appears to need improvement in almost all categories of tests/phenomena attempted at BNL. Of the two codes, TRAC-PF1 and RELAP5/MOD1, the latter needs more improvement, particularly in the areas of CCFL, level swell, CHF correlation and post-CHF heat transfer, and numerical stability. For the CCFL and post-CHF tests, TRAC-BD1 provides the best overall results. However, the TRAC-BD1 interfacial shear package for the countercurrent annular flow regime needs further improvement for better prediction of the CCFL phenomenon. 47 refs., 87 figs., 15 tabs.

  5. Detecting pop-out targets in contexts of varying homogeneity: investigating homogeneity coding with event-related brain potentials (ERPs).

    PubMed

    Schubö, Anna; Wykowska, Agnieszka; Müller, Hermann J

    2007-03-23

    Searching for a target among many distracting context elements can be an easy or a demanding task. Duncan and Humphreys (Duncan, J., Humphreys, G.W., 1989. Visual search and stimulus similarity. Psychol. Rev. 96, 433-458) showed that the target itself is not the only factor determining the difficulty of target detection. Similarity among context elements and dissimilarity between target and context are two main factors also affecting search efficiency. Moreover, many studies have shown that search becomes particularly efficient with large set sizes and perfectly homogeneous context elements, presumably due to grouping processes involved in target-context segmentation. In particular, the N2p amplitude has been found to be modulated by the number of context elements and their homogeneity. The aim of the present study was to investigate the influence of context elements of different heterogeneities on search performance using event-related brain potentials (ERPs). Results showed that contexts with perfectly homogeneous elements were indeed special: they were most efficient in visual search and elicited a large N2p differential amplitude effect. Increasing context heterogeneity led to a decrease in search performance and a reduction in N2p differential amplitude. Reducing the number of context elements led to a marked performance decrease for random heterogeneous contexts but not for grouped heterogeneous contexts. Behavioral and N2p results delivered evidence (a) in favor of specific processing modes operating on different spatial scales and (b) for the existence of homogeneity coding as postulated by Duncan and Humphreys.

  6. Evidence for two independent lineages of Griffithsia (Ceramiaceae, Rhodophyta) based on plastid protein-coding psaA, psbA, and rbcL gene sequences.

    PubMed

    Yang, Eun Chan; Boo, Sung Min

    2004-05-01

    The ceramiaceous red algal genus Griffithsia has characteristically large vegetative cells visible to the unaided eye and thousands of nuclei in a single cell at maturity. Its members often occur intertidally along temperate to tropical coasts. Although previous morphological studies indicated that Griffithsia is subdivided into four groups, there is no molecular phylogeny for the genus. We present a multigene phylogeny of the genus based on the plastid protein-coding psaA, psbA, and rbcL genes from ten samples of eight Griffithsia species, eight samples of five putative relatives, such as Anotrichium and Halurus, and three outgroup taxa. Saturation plots for each of the three datasets showed no evidence of saturation at any codon position. The partition homogeneity test indicated that none of the individual datasets resulted in significantly incongruent trees. All the analyses of individual and concatenated datasets separated Griffithsia into two well-defined lineages: lineage 1 was composed of Griffithsia corallinoides, Griffithsia pacifica, and Griffithsia tomo-yamadae, while lineage 2 encompassed Griffithsia antarctica, Griffithsia japonica, Griffithsia teges, Griffithsia traversii, and Griffithsia sp. Our results support the monophyly of the four Anotrichium species and cast doubt on the autonomy of Halurus. The monophyly of the tribe Griffithsieae is well resolved, although interrelationships among Griffithsia, Anotrichium, and Halurus were unclear. Our study indicates that the psaA and psbA genes are powerful new tools for genus-level phylogeny of red algal groups such as Griffithsia. This is the first report of a multigene phylogeny of the Ceramiales based on three protein-coding plastid genes.

  7. Investigation of fast ion behavior using orbit following Monte-Carlo code in magnetic perturbed field in KSTAR

    NASA Astrophysics Data System (ADS)

    Shinohara, Kouji; Suzuki, Yasuhiro; Kim, Junghee; Kim, Jun Young; Jeon, Young Mu; Bierwage, Andreas; Rhee, Tongnyeol

    2016-11-01

    The fast ion dynamics and the associated heat load on the plasma facing components in the KSTAR tokamak were investigated with the orbit following Monte-Carlo (OFMC) code in several magnetic field configurations and realistic wall geometry. In particular, attention was paid to the effect of resonant magnetic perturbation (RMP) fields. Both the vacuum field approximation as well as the self-consistent field that includes the response of a stationary plasma were considered. In both cases, the magnetic perturbation (MP) is dominated by the toroidal mode number n  =  1, but otherwise its structure is strongly affected by the plasma response. The loss of fast ions increased significantly when the MP field was applied. Most loss particles hit the poloidal limiter structure around the outer mid-plane on the low field side, but the distribution of heat loads across the three limiters varied with the form of the MP. Short-timescale loss of supposedly well-confined co-passing fast ions was also observed. These losses started within a few poloidal transits after the fast ion was born deep inside the plasma on the high-field side of the magnetic axis. In the configuration studied, these losses are facilitated by the combination of two factors: (i) the large magnetic drift of fast ions across a wide range of magnetic surfaces due to a low plasma current, and (ii) resonant interactions between the fast ions and magnetic islands that were induced inside the plasma by the external RMP field. These effects are expected to play an important role in present-day tokamaks.

  8. View-independent coding of face identity in frontal and temporal cortices is modulated by familiarity: an event-related fMRI study.

    PubMed

    Pourtois, Gilles; Schwartz, Sophie; Seghier, Mohamed L; Lazeyras, François; Vuilleumier, Patrik

    2005-02-15

    Face recognition is a unique visual skill enabling us to recognize a large number of person identities, despite many differences in the visual image from one exposure to another due to changes in viewpoint, illumination, or simply passage of time. Previous familiarity with a face may facilitate recognition when visual changes are important. Using event-related fMRI in 13 healthy observers, we studied the brain systems involved in extracting face identity independent of modifications in visual appearance during a repetition priming paradigm in which two different photographs of the same face (either famous or unfamiliar) were repeated at varying delays. We found that functionally defined face-selective areas in the lateral fusiform cortex showed no repetition effects for faces across changes in image views, irrespective of pre-existing familiarity, suggesting that face representations formed in this region do not generalize across different visual images, even for well-known faces. Repetition of different but easily recognizable views of an unfamiliar face produced selective repetition decreases in a medial portion of the right fusiform gyrus, whereas distinct views of a famous face produced repetition decreases in left middle temporal and left inferior frontal cortex selectively, but no decreases in fusiform cortex. These findings reveal that different views of the same familiar face may not be integrated within a single representation at initial perceptual stages subserved by the fusiform face areas, but rather involve later processing stages where more abstract identity information is accessed. PMID:15670699

  9. Speech coding

    SciTech Connect

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage that the speech signal was corrupted by noise, cross-talk, and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. Digital transmission, on the other hand, is relatively immune to noise, cross-talk, and distortion, primarily because the digital signal can be faithfully regenerated at each repeater purely on the basis of a binary decision. Hence the end-to-end performance of a digital link is essentially independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible, and secure services that can carry a multitude of signal types (such as voice, data, and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the
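    As a concrete example of the waveform-coding branch mentioned above, the sketch below implements the continuous mu-law companding formula (the idea behind ITU-T G.711's segmented 8-bit tables, here in simplified continuous form): quiet samples get finer quantization steps than a uniform 8-bit quantizer would allow. This is an illustrative sketch, not code from the original report.

```python
import math

MU = 255.0  # companding constant used by G.711 mu-law

def mulaw_encode(x: float) -> int:
    """Compress a sample in [-1, 1] to an 8-bit code using the
    continuous mu-law curve y = sign(x) * ln(1 + MU*|x|) / ln(1 + MU)."""
    y = math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)
    return int(round((y + 1.0) * 127.5))   # map [-1, 1] -> {0..255}

def mulaw_decode(code: int) -> float:
    """Expand an 8-bit code back to a sample in [-1, 1]."""
    y = code / 127.5 - 1.0
    return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

# A quiet sample is reproduced far more accurately than the ~0.004
# worst-case error of a uniform 8-bit quantizer over [-1, 1].
quiet = 0.01
err = abs(mulaw_decode(mulaw_encode(quiet)) - quiet)
```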

  10. Evaluation of the rodent Hershberger bioassay on intact juvenile males--testing of coded chemicals and supplementary biochemical investigations.

    PubMed

    Freyberger, A; Schladt, L

    2009-08-01

    Under the auspices of the Organization for Economic Cooperation and Development (OECD), the Hershberger assay on juvenile intact male rats is being validated as a screen for compounds with anti-androgenic potential. We participated in the testing of coded chemicals. Compounds included the positive control flutamide (FLUT, 3 mg/kg), linuron (LIN, 10, 100 mg/kg), p,p'-DDE (16, 160 mg/kg), and two negative substances, 4-nonylphenol (NP, 160 mg/kg) and 2,4-dinitrophenol (DNP, 10 mg/kg). Compounds were administered for 10 consecutive days by gavage to testosterone propionate (TP, 1 mg/kg s.c.)-supplemented rats. Unblinding revealed the following results: compared to vehicle controls, treatment with TP resulted in increased androgen-sensitive tissue (AST) weights of ventral prostate (VP), seminal vesicles (SV), levator ani and bulbocavernosus muscles (LABC), Cowper's glands, and epididymides, and in decreased testes weight. When assessing anti-androgenic potential in TP-supplemented rats, FLUT decreased all AST weights and increased testes weight. p,p'-DDE at the high dose decreased final body weight and all AST weights, whereas the low dose only affected SV weight. LIN slightly decreased final body weight and decreased absolute SV and LABC and relative SV weights only at the high dose. NP decreased final body weight and only absolute SV weights; DNP was ineffective. Investigations not requested by the OECD included measurement of liver enzymes and revealed strong induction of testosterone-metabolizing and phase II conjugating enzymes by p,p'-DDE. Our findings suggest that in principle the juvenile intact male rat can be used in the Hershberger assay to screen for anti-androgenic potential, thereby contributing to a refinement of the assay in terms of animal welfare. However, in our hands this animal model was somewhat less sensitive than the peripubertal castrated rat. Final conclusions, however, can only be drawn on the basis of all available validation data. Results obtained with

  11. Independent Technical Investigation of the Puna Geothermal Venture Unplanned Steam Release, June 12 and 13, 1991, Puna, Hawaii

    SciTech Connect

    Thomas, Richard; Whiting, Dick; Moore, James; Milner, Duey

    1991-07-01

    On June 24, 1991, a third-party investigation team consisting of Richard P. Thomas, Duey E. Milner, James L. Moore, and Dick Whiting began an investigation into the blowout of well KS-8, which occurred at the Puna Geothermal Venture (PGV) site on June 12, 1991, and caused the unabated release of steam for a period of 31 hours before PGV succeeded in closing in the well. The scope of the investigation was to: (a) determine the cause(s) of the incident; (b) evaluate the adequacy of PGV's drilling and blowout prevention equipment and procedures; and (c) make recommendations for any appropriate changes in equipment and/or procedures. This report finds that the blowout occurred because of inadequacies in PGV's drilling plan and procedures and not as a result of unusual or unmanageable subsurface geologic or hydrologic conditions. While the geothermal resource in the area being drilled is relatively hot, the temperatures are not excessive for modern technology and methods to control. Fluid pressures encountered are also manageable if proper procedures are followed and the appropriate equipment is utilized. A previous blowout of short duration occurred on February 21, 1991, at the KS-7 injection well being drilled by PGV at a depth of approximately 1600'. This unexpected incident alerted PGV to the possibility of encountering a high-temperature, fractured zone at a relatively shallow depth. The experience at KS-7 prompted PGV to refine its hydrological model; however, the drilling plan utilized for KS-8 was not changed.
Not only did PGV fail to modify its drilling program following the KS-7 blowout, but it also failed to heed numerous "red flags" (warning signals) in the five days preceding the KS-8 blowout, which included a continuous 1-inch flow of drilling mud out of the wellbore, gains in mud volume while pulling stands, and gas entries while circulating mud bottoms up, in addition to lost circulation that had occurred earlier below the shoe of the 13-3/8-inch casing.

  12. Supporting the Cybercrime Investigation Process: Effective Discrimination of Source Code Authors Based on Byte-Level Information

    NASA Astrophysics Data System (ADS)

    Frantzeskou, Georgia; Stamatatos, Efstathios; Gritzalis, Stefanos

    Source code authorship analysis is the field that attempts to identify the author of a computer program by treating each program as a linguistically analyzable entity. This is usually based on other undisputed program samples from the same author. There are several cases where the application of such a method could be of major benefit, such as tracing the source of code left in a system after a cyber attack, authorship disputes, proof of authorship in court, etc. In this paper, we present our approach, which is based on byte-level n-gram profiles and is an extension of a method that has been successfully applied to natural language text authorship attribution. We propose a simplified profile and a new similarity measure which is less complicated than the algorithm followed in text authorship attribution and seems more suitable for source code identification, since it is better able to deal with very small training sets. Experiments were performed on two different data sets, one with programs written in C++ and the second with programs written in Java. Unlike the traditional language-dependent metrics used by previous studies, our approach can be applied to any programming language with no additional cost. The presented accuracy rates are much better than the best reported results for the same data sets.
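    The byte-level n-gram idea can be sketched as follows: a simplified profile keeps only the set of an author's most frequent n-grams (frequencies discarded), and a disputed program is attributed to the author whose profile shares the most n-grams with it. The function names and parameters here (profile, spi, n=3, top=1500) are this sketch's assumptions and only approximate the paper's exact method.

```python
from collections import Counter

def profile(source: str, n: int = 3, top: int = 1500) -> set:
    """Simplified profile: the set of the `top` most frequent
    byte-level n-grams of a source sample (frequencies are discarded)."""
    data = source.encode("utf-8")
    grams = Counter(data[i:i + n] for i in range(len(data) - n + 1))
    return {g for g, _ in grams.most_common(top)}

def spi(author_profile: set, disputed_profile: set) -> int:
    """Similarity as the size of the n-gram overlap; the candidate
    author with the largest overlap is selected."""
    return len(author_profile & disputed_profile)

# Toy example: attribute a snippet to the author with closer n-gram habits.
author_a = profile("for(int i=0;i<n;i++){ sum+=a[i]; }")
author_b = profile("for idx in range(n):\n    total = total + a[idx]\n")
disputed = profile("for(int j=0;j<m;j++){ acc+=b[j]; }")
best = max([("A", author_a), ("B", author_b)], key=lambda p: spi(p[1], disputed))[0]
```

Because the profiles are built from raw bytes rather than parsed tokens, the same pipeline works unchanged for C++, Java, or any other language, which is the language-independence the abstract claims.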

  13. The disposition of impact ejecta resulting from the AIDA-DART mission to binary asteroid 65803 Didymos: an independent investigation

    NASA Astrophysics Data System (ADS)

    Richardson, James E.; O'Brien, David P.

    2016-10-01

    If all goes as planned, in the year 2020 a joint ESA and NASA mission will be launched that will rendezvous with the near-Earth binary asteroid system 65803 Didymos in the fall of 2022. The European component, the Asteroid Impact & Deflection Assessment (AIDA) spacecraft, will arrive first and characterize the system, which consists of a ~800 m diameter primary and a ~160 m diameter secondary, orbiting a common center of mass at a semi-major axis distance of ~1200 m with an orbital period of 11.9 hr. Following system characterization, the AIDA spacecraft will withdraw to a safe distance while the NASA component, the 300 kg Double Asteroid Redirection Test (DART) spacecraft, collides with the trailing edge of the secondary body (with respect to the binary's retrograde mutual orbit). Meanwhile, the AIDA spacecraft will conduct observations of this impact and its aftermath, specifically looking for changes made to the primary, the secondary, and their mutual orbit as a result of the DART collision. Of particular interest is the ballistic flight and final disposition of the ejecta produced by the impact cratering process, not just from the standpoint of scientific study, but also from the standpoint of AIDA spacecraft safety. In this study, we investigate a series of hypothetical DART impacts utilizing a semi-empirical, numerical impact ejecta plume model originally developed for the Deep Impact mission and designed specifically with impacts on small bodies in mind. The resulting excavated mass is discretized into 7200 individual tracer particles, each representing a unique combination of speed, mass, and ejected direction. The trajectory of each tracer is computed numerically under the gravitational influence of both primary and secondary, along with the effects of solar radiation pressure. Each tracer is followed until it either impacts a body or escapes the system, whereupon tracking is continued in the heliocentric frame using an N-body integrator. Various impact
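    A back-of-the-envelope check on ejecta fate can be made from the system parameters in the abstract: a tracer launched from the secondary can escape only if its speed exceeds the escape speed from the combined potential of both bodies. The bulk density below is an assumed value, and this point-mass, energy-only estimate ignores orbital motion and solar radiation pressure, unlike the full tracer model in the study.

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

# Approximate system values from the abstract; the density is assumed.
rho = 2100.0                           # kg/m^3, assumed bulk density
r_primary, r_secondary = 400.0, 80.0   # m (from ~800 m and ~160 m diameters)
separation = 1200.0                    # m, semi-major axis of the mutual orbit

def mass(radius: float) -> float:
    """Mass of a homogeneous sphere of the assumed density."""
    return rho * 4.0 / 3.0 * math.pi * radius**3

m1, m2 = mass(r_primary), mass(r_secondary)

# Speed needed at the secondary's surface to escape the combined
# gravitational potential of both bodies.
potential = -G * m2 / r_secondary - G * m1 / separation
v_escape = math.sqrt(-2.0 * potential)
```

The result is on the order of a few tens of cm/s, which is why much of the slow ejecta is expected to remain bound to the system and why its disposition matters for spacecraft safety.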

  14. Independence Is.

    ERIC Educational Resources Information Center

    Stickney, Sharon

    This workbook is designed to help participants of the Independence Training Program (ITP) to achieve a definition of "independence." The program was developed for teenage girls. The process for developing the concept of independence consists of four steps. Step one instructs the participant to create an imaginary situation where she is completely…

  15. Investigation of island formation due to RMPs in DIII-D plasmas with the SIESTA resistive MHD equilibrium code

    NASA Astrophysics Data System (ADS)

    Hirshman, S. P.; Shafer, M. W.; Seal, S. K.; Canik, J. M.

    2016-04-01

    The SIESTA magnetohydrodynamic (MHD) equilibrium code has been used to compute a sequence of ideally stable equilibria resulting from numerical variation of the helical resonant magnetic perturbation (RMP) applied to an axisymmetric DIII-D plasma equilibrium. Increasing the perturbation strength at the dominant m = 2, n = -1 resonant surface leads to lower MHD energies and increases in the equilibrium island widths at the m = 2 (and sidebands) surfaces, in agreement with theoretical expectations. Island overlap at large perturbation strengths leads to stochastic magnetic fields which correlate well with the experimentally inferred field structure. The magnitude and spatial phase (around the dominant rational surfaces) of the resonant (shielding) component of the parallel current are shown to change qualitatively with the magnetic island topology.

  16. Investigation of island formation due to RMPs in DIII-D plasmas with the SIESTA resistive MHD equilibrium code

    DOE PAGES

    Hirshman, S. P.; Shafer, M. W.; Seal, S. K.; Canik, J. M.

    2016-03-03

    The SIESTA magnetohydrodynamic (MHD) equilibrium code has been used to compute a sequence of ideally stable equilibria resulting from numerical variation of the helical resonant magnetic perturbation (RMP) applied to an axisymmetric DIII-D plasma equilibrium. Increasing the perturbation strength at the dominant $m=2$, $n=-1$ resonant surface leads to lower MHD energies and increases in the equilibrium island widths at the $m=2$ (and sidebands) surfaces, in agreement with theoretical expectations. Island overlap at large perturbation strengths leads to stochastic magnetic fields which correlate well with the experimentally inferred field structure. The magnitude and spatial phase (around the dominant rational surfaces) of the resonant (shielding) component of the parallel current are shown to change qualitatively with the magnetic island topology.

  17. Design and performance investigation of LDPC-coded upstream transmission systems in IM/DD OFDM-PONs

    NASA Astrophysics Data System (ADS)

    Gong, Xiaoxue; Guo, Lei; Wu, Jingjing; Ning, Zhaolong

    2016-12-01

    In Intensity-Modulation Direct-Detection (IM/DD) Orthogonal Frequency Division Multiplexing Passive Optical Networks (OFDM-PONs), aside from the Subcarrier-to-Subcarrier Intermixing Interferences (SSII) induced by square-law detection, the use of the same laser frequency for data sent from Optical Network Units (ONUs) results in ONU-to-ONU Beating Interferences (OOBI) at the receiver. To mitigate these interferences, we design a Low-Density Parity Check (LDPC)-coded and spectrum-efficient upstream transmission system. A theoretical channel model is also derived in order to analyze the detrimental factors influencing system performance. Simulation results demonstrate that the receiver sensitivity is improved by 3.4 dB and 2.5 dB under QPSK and 8QAM, respectively, after 100 km Standard Single-Mode Fiber (SSMF) transmission. Furthermore, the spectrum efficiency can be improved by about 50%.
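    The abstract does not detail the LDPC decoder. As a generic illustration of parity-check decoding, the sketch below runs hard-decision bit-flipping on a toy (7,4) parity-check matrix standing in for a large sparse LDPC matrix; the matrix and codeword are invented for the example.

```python
import numpy as np

# Toy parity-check matrix (a Hamming(7,4)-style code standing in for a
# large, sparse LDPC matrix; the decoding idea is the same).
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def bit_flip_decode(r, max_iters=10):
    """Hard-decision bit-flipping: repeatedly flip the bit that
    participates in the most unsatisfied parity checks."""
    r = r.copy()
    for _ in range(max_iters):
        syndrome = H @ r % 2
        if not syndrome.any():
            return r                 # all parity checks satisfied
        votes = syndrome @ H         # unsatisfied-check count per bit
        r[np.argmax(votes)] ^= 1     # flip the worst offender
    return r

codeword = np.array([1, 0, 1, 1, 0, 1, 0])    # satisfies H @ c % 2 == 0
received = codeword.copy()
received[2] ^= 1                               # one channel error
decoded = bit_flip_decode(received)
```

Practical LDPC decoders use soft-decision belief propagation on much larger sparse matrices, but the syndrome-driven iteration shown here is the core mechanism that buys the coding gain reported in the abstract.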

  18. Investigation of the N-terminal coding region of MUC7 alterations in dentistry students with and without caries

    PubMed Central

    Koç Öztürk, L; Yarat, A; Akyuz, S; Furuncuoglu, H

    2016-01-01

    ABSTRACT Human low-molecular-weight salivary mucin (MUC7) is a small, secreted glycoprotein coded by the MUC7 gene. In the oral cavity, these mucins inhibit the colonization of oral bacteria, including cariogenic ones, by masking their surface adhesins, thus helping saliva to prevent dental caries. The N-terminal domain is important for low-molecular-weight (MG2) mucins to make contact with oral microorganisms. In this study, we aimed to compare the N-terminal coding region of the MUC7 gene between individuals with and without caries. Forty-four healthy dental students were enrolled in this study; 24 of them were classified as having caries [decayed, missing, filled teeth (DMFT) = 5.6] according to the World Health Organization (WHO) criteria, and 20 of them were caries-free (DMFT = 0). The simplified oral hygiene index (OHI-S) and gingival index (GI) were used to determine oral hygiene and gingival conditions. Salivary total protein levels and salivary buffer capacity (SBC) were determined by the Lowry and Ericsson methods, respectively. DNA was extracted from peripheral blood cells of all the participants and genotyping was carried out by a polymerase chain reaction (PCR)-sequencing method. No statistical differences were found between the two groups in terms of salivary parameters, oral hygiene, and gingival conditions. We detected one common single nucleotide polymorphism (SNP) that leads to a change of asparagine to lysine at codon 80. This substitution was found in 29.0% and 40.0%, respectively, of the groups with and without caries. No other sequence variations were detected. The SNP found in this study may be a specific polymorphism affecting the Turkish population. Further studies with larger numbers of participants are necessary in order to clarify this finding.

  19. Tokamak Systems Code

    SciTech Connect

    Reid, R.L.; Barrett, R.J.; Brown, T.G.; Gorker, G.E.; Hooper, R.J.; Kalsi, S.S.; Metzler, D.H.; Peng, Y.K.M.; Roth, K.E.; Spampinato, P.T.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling of each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code, as long as the input to or output from the module remains unchanged.

  20. MORSE Monte Carlo code

    SciTech Connect

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.

  1. Are Independent Probes Truly Independent?

    ERIC Educational Resources Information Center

    Camp, Gino; Pecher, Diane; Schmidt, Henk G.; Zeelenberg, Rene

    2009-01-01

    The independent cue technique has been developed to test traditional interference theories against inhibition theories of forgetting. In the present study, the authors tested the critical criterion for the independence of independent cues: Studied cues not presented during test (and unrelated to test cues) should not contribute to the retrieval…

  2. Investigating mitochondrial metabolism in contracting HL-1 cardiomyocytes following hypoxia and pharmacological HIF activation identifies HIF-dependent and independent mechanisms of regulation.

    PubMed

    Ambrose, Lucy J A; Abd-Jamil, Amira H; Gomes, Renata S M; Carter, Emma E; Carr, Carolyn A; Clarke, Kieran; Heather, Lisa C

    2014-11-01

    Hypoxia is a consequence of cardiac disease and downregulates mitochondrial metabolism, yet the molecular mechanisms through which this occurs in the heart are incompletely characterized. Therefore, we aimed to use a contracting HL-1 cardiomyocyte model to investigate the effects of hypoxia on mitochondrial metabolism. Cells were exposed to hypoxia (2% O2) for 6, 12, 24, and 48 hours to characterize the metabolic response. Cells were subsequently treated with the hypoxia inducible factor (HIF)-activating compound, dimethyloxalylglycine (DMOG), to determine whether hypoxia-induced mitochondrial changes were HIF dependent or independent, and to assess the suitability of this cultured cardiac cell line for cardiovascular pharmacological studies. Hypoxic cells had increased glycolysis after 24 hours, with glucose transporter 1 and lactate levels increased 5-fold and 15-fold, respectively. After 24 hours of hypoxia, mitochondrial networks were more fragmented but there was no change in citrate synthase activity, indicating that mitochondrial content was unchanged. Cellular oxygen consumption was 30% lower, accompanied by decreases in the enzymatic activities of electron transport chain (ETC) complexes I and IV, and aconitase by 81%, 96%, and 72%, relative to controls. Pharmacological HIF activation with DMOG decreased cellular oxygen consumption by 43%, coincident with decreases in the activities of aconitase and complex I by 26% and 30%, indicating that these adaptations were HIF mediated. In contrast, the hypoxia-mediated decrease in complex IV activity was not replicated by DMOG treatment, suggesting HIF-independent regulation of this complex. In conclusion, 24 hours of hypoxia increased anaerobic glycolysis and decreased mitochondrial respiration, which was associated with changes in ETC and tricarboxylic acid cycle enzyme activities in contracting HL-1 cells. Pharmacological HIF activation in this cardiac cell line allowed both HIF-dependent and independent

  3. Multigenerational Independent Colony for Extraterrestrial Habitation, Autonomy, and Behavior Health (MICEHAB): An Investigation of a Long Duration, Partial Gravity, Autonomous Rodent Colony

    NASA Technical Reports Server (NTRS)

    Rodgers, Erica M.; Simon, Matthew A.; Antol, Jeffrey; Chai, Patrick R.; Jones, Christopher A.; Klovstad, Jordan J.; Neilan, James H.; Stillwagen, Frederic H.; Williams, Phillip A.; Bednara, Michael; Guendel, Alex; Hernandez, Joel; Lewis, Weston; Lim, Jeremy; Wilson, Logan; Wusk, Grace

    2015-01-01

    The path from Earth to Mars requires exploration missions to be increasingly Earth-independent as the foundation is laid for a sustained human presence in the following decades. NASA pioneering of Mars will expand the boundaries of human exploration, as a sustainable presence on the surface requires humans to successfully reproduce in a partial gravity environment independent of Earth intervention. Before significant investment is made in capabilities leading to such pioneering efforts, the challenges of multigenerational mammalian reproduction in a partial gravity environment need to be investigated. The Multigenerational Independent Colony for Extraterrestrial Habitation, Autonomy, and Behavior Health (MICEHAB) is designed to study these challenges. The proposed concept is a long-duration, autonomous habitat designed to house rodents in a partial gravity environment, with the goal of understanding the effects of partial gravity on mammalian reproduction over multiple generations and how to effectively design such a facility to operate autonomously while keeping the rodents healthy in order to achieve multiple generations. All systems are designed to feed forward directly to full-scale human missions to Mars. This paper presents the baseline design concept formulated after considering challenges in the mission and vehicle architectures such as: vehicle automation, automated crew health management/medical care, unique automated waste disposal and hygiene, handling of deceased crew members, reliable long-duration crew support systems, and radiation protection. This concept was selected from an architectural trade space considering the balance between mission science return and robotic and autonomy capabilities. The baseline design is described in detail including: transportation and facility operation constraints, artificial gravity system design, habitat design, and a full-scale mock-up demonstration of autonomous rodent care facilities.
The proposed concept has

  4. A gene-environment investigation on personality traits in two independent clinical sets of adult patients with personality disorder and attention deficit/hyperactive disorder.

    PubMed

    Jacob, Christian P; Nguyen, Thuy Trang; Dempfle, Astrid; Heine, Monika; Windemuth-Kieselbach, Christine; Baumann, Katarina; Jacob, Florian; Prechtl, Julian; Wittlich, Maike; Herrmann, Martin J; Gross-Lesch, Silke; Lesch, Klaus-Peter; Reif, Andreas

    2010-06-01

    While an interactive effect of genes with adverse life events is increasingly appreciated in current concepts of depression etiology, no data are presently available on interactions between genetic and environmental (G x E) factors with respect to personality and related disorders. The present study therefore aimed to detect main effects as well as interactions of serotonergic candidate genes (coding for the serotonin transporter, 5-HTT; the serotonin autoreceptor, HTR1A; and the enzyme which synthesizes serotonin in the brain, TPH2) with the burden of life events (#LE) in two independent samples consisting of 183 patients suffering from personality disorders and 123 patients suffering from adult attention deficit/hyperactivity disorder (aADHD). Simple analyses ignoring possible G x E interactions revealed no evidence for associations of either #LE or of the considered polymorphisms in 5-HTT and TPH2. Only the G allele of HTR1A rs6295 seemed to increase the risk of emotional-dramatic cluster B personality disorders (p = 0.019, in the personality disorder sample) and to decrease the risk of anxious-fearful cluster C personality disorders (p = 0.016, in the aADHD sample). We extended the initial simple model by taking a G x E interaction term into account, since this approach may better fit the data indicating that the effect of a gene is modified by stressful life events or, vice versa, that stressful life events only have an effect in the presence of a susceptibility genotype. By doing so, we observed nominal evidence for G x E effects as well as main effects of 5-HTT-LPR and the TPH2 SNP rs4570625 on the occurrence of personality disorders. Further replication studies, however, are necessary to validate the apparent complexity of G x E interactions in disorders of human personality.
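
    The extension from a main-effects model to one with a G x E interaction term can be sketched as an ordinary-least-squares fit on a design matrix with an added product column, so that a genotype effect may appear only under high life-event burden. The data, effect sizes, and the OLS machinery below are illustrative assumptions, not the study's actual statistical procedure.

    ```python
    # Sketch of a G x E interaction fit: ordinary least squares on the
    # design matrix [1, G, E, G*E]. Genotype g is 0/1, e is a life-event
    # count. All data and effect sizes are hypothetical.

    def ols_fit(X, y):
        """Solve the normal equations (X'X) b = X'y by Gauss-Jordan elimination."""
        n, p = len(X), len(X[0])
        XtX = [[sum(X[i][a] * X[i][c] for i in range(n)) for c in range(p)]
               for a in range(p)]
        Xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(p)]
        M = [XtX[a][:] + [Xty[a]] for a in range(p)]
        for col in range(p):
            piv = max(range(col, p), key=lambda r: abs(M[r][col]))
            M[col], M[piv] = M[piv], M[col]
            for r in range(p):
                if r != col:
                    f = M[r][col] / M[col][col]
                    M[r] = [mr - f * mc for mr, mc in zip(M[r], M[col])]
        return [M[a][p] / M[a][a] for a in range(p)]

    data = [(g, e) for g in (0, 1) for e in range(6)]
    # outcome built so the genotype effect appears only under life-event burden
    y = [0.1 * e + 0.5 * g * e for g, e in data]

    X = [[1.0, g, e, g * e] for g, e in data]
    b = ols_fit(X, y)
    print(round(b[3], 3))  # interaction coefficient: 0.5
    ```

    In this toy data the main genotype effect is zero and the interaction coefficient carries the entire signal, mirroring the case where a susceptibility genotype matters only in the presence of stressful life events.
    
    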

  5. An investigation for population maintenance mechanism in a miniature garden: genetic connectivity or independence of small islet populations of the Ryukyu five-lined skink.

    PubMed

    Kurita, Kazuki; Hikida, Tsutomu; Toda, Mamoru

    2014-01-01

    The Ryukyu five-lined skink (Plestiodon marginatus) is an island lizard that is found even in tiny islets with less than half a hectare of habitat area. We hypothesized that the island populations are maintained either by frequent gene flow among the islands or independently of each other. To test these hypotheses, we investigated the genetic structure of 21 populations from 11 land-bridge islands that were connected during the latest glacial age, and from 4 isolated islands. Analyses using mitochondrial cytochrome b gene sequences (n = 67) and 10 microsatellite loci (n = 235) revealed moderate to high levels of genetic differentiation, the existence of many private alleles/haplotypes on most islands, little contemporary migration, a positive correlation between genetic variability and island area, and a negative correlation between relatedness and island area. This evidence suggests a strong effect of independent genetic drift as opposed to gene flow, favoring the isolation hypothesis even in tiny islet populations. An isolation-by-distance effect was demonstrated, and it became more prominent when the 4 isolated islands were excluded, suggesting that the pattern is a remnant of the land-bridge age. In a few island populations, however, the possibility of occasional overwater dispersals was partially supported and therefore could not be ruled out. PMID:25189776
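
    The isolation-by-distance test mentioned above reduces to correlating pairwise geographic distance with pairwise genetic distance (a Mantel-style comparison; the permutation significance test is omitted here). All distance values below are made up for illustration, not taken from the study.

    ```python
    # Sketch: isolation-by-distance as a correlation between pairwise
    # geographic and genetic distances. All values are hypothetical.

    def pearson(xs, ys):
        """Pearson correlation coefficient of two equal-length sequences."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    geo_km = [1.0, 2.5, 4.0, 8.0, 12.0]       # distance between island pairs
    gen_fst = [0.02, 0.05, 0.07, 0.15, 0.21]  # pairwise differentiation
    r = pearson(geo_km, gen_fst)
    print(round(r, 3))  # strong positive correlation: 0.998
    ```
    
    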

  6. Microbial diversity and dynamics throughout manufacturing and ripening of surface ripened semi-hard Danish Danbo cheeses investigated by culture-independent techniques.

    PubMed

    Ryssel, Mia; Johansen, Pernille; Al-Soud, Waleed Abu; Sørensen, Søren; Arneborg, Nils; Jespersen, Lene

    2015-12-23

    Microbial successions on the surface and in the interior of surface-ripened semi-hard Danish Danbo cheeses were investigated by culture-dependent and -independent techniques. Culture-independent detection of microorganisms was obtained by denaturing gradient gel electrophoresis (DGGE) and pyrosequencing, using amplicons of 16S and 26S rRNA genes for prokaryotes and eukaryotes, respectively. With minor exceptions, the results from the culture-independent analyses correlated with the culture-dependent plating results. Even though the predominant microorganisms detected with the two culture-independent techniques correlated, a higher number of genera were detected by pyrosequencing compared to DGGE. Additionally, minor parts of the microbiota, i.e. comprising <10.0% of the operational taxonomic units (OTUs), were detected by pyrosequencing, resulting in more detailed information on the microbial succession. As expected, microbial profiles of the surface and the interior of the cheeses diverged. During cheese production pyrosequencing identified Lactococcus as the dominant genus on cheese surfaces, representing on average 94.7%±2.1% of the OTUs. At day 6 Lactococcus spp. declined to 10.0% of the OTUs, whereas Staphylococcus spp. went from 0.0% during cheese production to 75.5% of the OTUs at smearing. During ripening, i.e. from 4 to 18 weeks, Corynebacterium was the dominant genus on the cheese surface (55.1%±9.8% of the OTUs), with Staphylococcus (17.9%±11.2% of the OTUs) and Brevibacterium (10.4%±8.3% of the OTUs) being the second and third most abundant genera. Other detected bacterial genera included Clostridiisalibacter (5.0%±4.0% of the OTUs), as well as Pseudoclavibacter, Alkalibacterium and Marinilactibacillus, which represented <2% of the OTUs. At smearing, yeast counts were low, with Debaryomyces being the dominant genus, accounting for 46.5% of the OTUs. During ripening the yeast counts increased significantly with Debaryomyces being the predominant genus
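
    The percent-of-OTUs figures quoted above are simple relative abundances per genus. A minimal sketch, assuming hypothetical raw OTU counts chosen to echo the ripening-stage surface community:

    ```python
    # Sketch: percent relative abundance of OTU counts per genus.
    # The raw counts below are hypothetical, not the study's data.

    surface_counts = {
        "Corynebacterium": 551,
        "Staphylococcus": 179,
        "Brevibacterium": 104,
        "Clostridiisalibacter": 50,
        "other": 116,
    }

    total = sum(surface_counts.values())
    rel = {genus: 100.0 * n / total for genus, n in surface_counts.items()}
    dominant = max(rel, key=rel.get)
    print(dominant, round(rel[dominant], 1))  # Corynebacterium 55.1
    ```
    
    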

  8. Investigations to determine whether Section XI of the ASME (American Society of Mechanical Engineers) Boiler and Pressure Vessel Code should include PLEX (plant life extension) baseline inspection guidance

    SciTech Connect

    Bustard, L.D.

    1988-01-01

    A plant life extension (PLEX) issue repeatedly mentioned is whether special PLEX supplemental inspection requirements should be added to Section XI of the ASME Boiler and Pressure Vessel Code. To assist the ASME in answering this question, the DOE Technology Management Center performed an industry survey to assess whether there was a technical consensus regarding the desirability and scope of a supplemental PLEX baseline inspection. This survey demonstrated the lack of an initial industry consensus. In response to the survey results, the ASME has formed a task group to investigate various PLEX supplemental inspection strategies and to assess their value and liabilities. The results of the survey and initial task group activities are reviewed.

  9. Environmental health and safety independent investigation of the in situ vitrification melt expulsion at the Oak Ridge National Laboratory, Oak Ridge, Tennessee

    SciTech Connect

    1996-07-01

    At about 6:12 pm EDT on April 21, 1996, steam and molten material were expelled from the Pit 1 in situ vitrification (ISV) project at the Oak Ridge National Laboratory (ORNL). At the request of the director of the Environmental Restoration (ER) Division, Department of Energy Oak Ridge Operations (DOE ORO), an independent investigation team was established on April 26, 1996. This team was tasked to determine the facts related to the ORNL Pit 1 melt expulsion event (MEE) in the areas of environmental safety and health, such as the adequacy of the ISV safety systems; operational control restrictions; emergency response planning/execution; and readiness review, and to report the investigation team's findings within 45 days from the date of the incident. These requirements were stated in the letter of appointment presented in Appendix A of this report. This investigation did not address the physical causes of the MEE. A separate investigation was conducted by ISV project personnel to determine the causes of the melt expulsion and the extent of the effects of this phenomenon. In response to this event, occurrence report ORO-LMES-X10ENVRES-1996-0006 (Appendix B) was filed. The investigation team did not address the occurrence reporting or event notification process. The project personnel (project team) examined the physical evidence at the Pit 1 ISV site (e.g., the ejected melt material and the ISV hood), reviewed documents such as the site-specific health and safety plan (HASP), and interviewed personnel involved in the event and/or the project. A listing of the personnel interviewed and evidence reviewed is provided in Appendix C.

  10. Extension of the supercritical carbon dioxide brayton cycle to low reactor power operation: investigations using the coupled anl plant dynamics code-SAS4A/SASSYS-1 liquid metal reactor code system.

    SciTech Connect

    Moisseytsev, A.; Sienicki, J. J.

    2012-05-10

    Significant progress has been made on the development of a control strategy for the supercritical carbon dioxide (S-CO{sub 2}) Brayton cycle enabling removal of power from an autonomous load following Sodium-Cooled Fast Reactor (SFR) down to decay heat levels such that the S-CO{sub 2} cycle can be used to cool the reactor until decay heat can be removed by the normal shutdown heat removal system or a passive decay heat removal system such as Direct Reactor Auxiliary Cooling System (DRACS) loops with DRACS in-vessel heat exchangers. This capability of the new control strategy eliminates the need for use of a separate shutdown heat removal system which might also use supercritical CO{sub 2}. It has been found that this capability can be achieved by introducing a new control mechanism involving shaft speed control for the common shaft joining the turbine and two compressors following reduction of the load demand from the electrical grid to zero. Following disconnection of the generator from the electrical grid, heat is removed from the intermediate sodium circuit through the sodium-to-CO{sub 2} heat exchanger, the turbine solely drives the two compressors, and heat is rejected from the cycle through the CO{sub 2}-to-water cooler. To investigate the effectiveness of shaft speed control, calculations are carried out using the coupled Plant Dynamics Code-SAS4A/SASSYS-1 code for a linear load reduction transient for a 1000 MWt metallic-fueled SFR with autonomous load following. No deliberate motion of control rods or adjustment of sodium pump speeds is assumed to take place. It is assumed that the S-CO{sub 2} turbomachinery shaft speed linearly decreases from 100 to 20% nominal following reduction of grid load to zero. The reactor power is calculated to autonomously decrease down to 3% nominal providing a lengthy window in time for the switchover to the normal shutdown heat removal system or for a passive decay heat removal system to become effective. However, the

  11. System and method for investigating sub-surface features of a rock formation with acoustic sources generating coded signals

    SciTech Connect

    Vu, Cung Khac; Nihei, Kurt; Johnson, Paul A; Guyer, Robert; Ten Cate, James A; Le Bas, Pierre-Yves; Larmat, Carene S

    2014-12-30

    A system and a method for investigating rock formations include generating, by a first acoustic source, a first acoustic signal comprising a first plurality of pulses, each pulse including a first modulated signal at a central frequency; and generating, by a second acoustic source, a second acoustic signal comprising a second plurality of pulses. A receiver arranged within the borehole receives a detected signal including a signal generated by a non-linear mixing process from the first and second acoustic signals in a non-linear mixing zone within the intersection volume. The method also includes processing the received signal to extract the signal generated by the non-linear mixing process over noise or over signals generated by a linear interaction process, or both.
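
    The first acoustic signal described above, a train of pulses each carrying a modulated signal at a central frequency, can be sketched as a sequence of windowed tone bursts. The sampling rate, burst envelope, and timing below are illustrative assumptions, not parameters from the patent.

    ```python
    # Sketch: a pulse train in which each pulse carries a modulated signal
    # at a central frequency f_c. Hann-windowed tone bursts are one common
    # choice of modulation envelope; all parameters here are assumptions.
    import math

    def coded_pulse_train(n_pulses=4, f_c=1000.0, pulse_len=0.01,
                          gap=0.02, fs=48000.0):
        """Return samples of n_pulses Hann-windowed tone bursts at f_c Hz."""
        samples = []
        n_on = int(pulse_len * fs)   # samples per burst
        n_off = int(gap * fs)        # silent samples between bursts
        for _ in range(n_pulses):
            for i in range(n_on):
                # Hann window shapes each burst to avoid sharp edges
                w = 0.5 - 0.5 * math.cos(2 * math.pi * i / (n_on - 1))
                samples.append(w * math.sin(2 * math.pi * f_c * i / fs))
            samples.extend([0.0] * n_off)
        return samples

    sig = coded_pulse_train()
    print(len(sig))  # 4 * (480 + 960) = 5760 samples
    ```
    
    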

  12. Salam's independence

    NASA Astrophysics Data System (ADS)

    Fraser, Gordon

    2009-01-01

    In his kind review of my biography of the Nobel laureate Abdus Salam (December 2008 pp45-46), John W Moffat wrongly claims that Salam had "independently thought of the idea of parity violation in weak interactions".

  13. Maintaining Independence.

    ERIC Educational Resources Information Center

    Upah-Bant, Marilyn

    1978-01-01

    Describes the overall business and production operation of the "Daily Illini" at the University of Illinois to show how this college publication has assumed the burdens and responsibilities of true independence. (GW)

  14. Independence of Internal Auditors.

    ERIC Educational Resources Information Center

    Montondon, Lucille; Meixner, Wilda F.

    1993-01-01

    A survey of 288 college and university auditors investigated patterns in their appointment, reporting, and supervisory practices as indicators of independence and objectivity. Results indicate a weakness in the positioning of internal auditing within institutions, possibly compromising auditor independence. Because the auditing function is…

  15. Investigating the role of rare coding variability in Mendelian dementia genes (APP, PSEN1, PSEN2, GRN, MAPT, and PRNP) in late-onset Alzheimer's disease

    PubMed Central

    Sassi, Celeste; Guerreiro, Rita; Gibbs, Raphael; Ding, Jinhui; Lupton, Michelle K.; Troakes, Claire; Al-Sarraj, Safa; Niblock, Michael; Gallo, Jean-Marc; Adnan, Jihad; Killick, Richard; Brown, Kristelle S.; Medway, Christopher; Lord, Jenny; Turton, James; Bras, Jose; Morgan, Kevin; Powell, John F.; Singleton, Andrew; Hardy, John

    2014-01-01

    The overlapping clinical and neuropathologic features among late-onset apparently sporadic Alzheimer's disease (LOAD), familial Alzheimer's disease (FAD), and other neurodegenerative dementias (frontotemporal dementia, corticobasal degeneration, progressive supranuclear palsy, and Creutzfeldt-Jakob disease) raise the question of whether shared genetic risk factors may explain the similar phenotypes among these disparate disorders. To investigate this intriguing hypothesis, we analyzed rare coding variability in 6 Mendelian dementia genes (APP, PSEN1, PSEN2, GRN, MAPT, and PRNP) in 141 LOAD patients and 179 elderly controls from the UK, all neuropathologically proven. In our cohort, 14 LOAD cases (10%) and 11 controls (6%) carry at least 1 rare variant in the genes studied. We report a novel variant in PSEN1 (p.I168T) and a rare variant in PSEN2 (p.A237V), both absent in controls and both likely pathogenic. Our findings support previous studies, suggesting that (1) rare coding variability in PSEN1 and PSEN2 may influence the susceptibility for LOAD and (2) GRN, MAPT, and PRNP are not major contributors to LOAD. Thus, genetic screening is pivotal for the clinical differential diagnosis of these neurodegenerative dementias. PMID:25104557

  16. Prognostic investigations of B7-H1 and B7-H4 expression levels as independent predictor markers of renal cell carcinoma.

    PubMed

    Safaei, Hamid Reza; Rostamzadeh, Ayoob; Rahmani, Omid; Mohammadi, Mohsen; Ghaderi, Omar; Yahaghi, Hamid; Ahmadi, Koroosh

    2016-06-01

    In order to evaluate the correlation of B7-H4 and B7-H1 with renal cell carcinoma (RCC), we analyzed B7-H1 and B7-H4 expressions and their clinical significance by an immunohistochemical method. Our results indicated that B7-H4-positive staining was detected in 58.13 % of RCC tissues (25 tumor tissues), and there were 18 tissues of patients without detectable B7-H4. Furthermore, 21 cases (48.83 %) were B7-H1-positive. Positive tumor expressions of B7-H4 and B7-H1 were markedly related to advanced TNM stage (P = 0.001; P = 0.014), high grade (P = 0.001; P = 0.002), and larger tumor size (P = 0.002; P = 0.024) in RCC tissues compared with patients with B7-H4-negative and B7-H1-negative RCC tissues. The patients with B7-H1- and B7-H4-positive expressions were found to be markedly correlated with the overall survival of the patients (P < 0.05) and tended to have an increased risk of death when compared with the negative expression groups. Univariate analysis showed that B7-H4 and B7-H1 expressions, TNM stage, high grade, and tumor size were significantly related to the prognosis of RCC. Furthermore, multivariate analysis showed that B7-H4 and B7-H1 expressions decreased overall survival. The adjusted HR for B7-H1 was 2.83 (95 % CI 1.210-2.971; P = 0.031) and was 2.918 (95 % CI 1.243-3.102; P = 0.006) for B7-H4, showing that these markers were independent prognostic factors in RCC patients. The expressions of B7-H1 and B7-H4 in RCC patients indicate that these markers may serve as predictors of tumor development and death risk. Further investigations would be helpful to confirm the roles of B7-H1 and B7-H4 as independent predictors of clinical RCC outcome. PMID:26687644

  17. Independent Living.

    ERIC Educational Resources Information Center

    Nathanson, Jeanne H., Ed.

    1994-01-01

    This issue of "OSERS" addresses the subject of independent living of individuals with disabilities. The issue includes a message from Judith E. Heumann, the Assistant Secretary of the Office of Special Education and Rehabilitative Services (OSERS), and 10 papers. Papers have the following titles and authors: "Changes in the Rehabilitation Act of…

  18. Synthesizing Certified Code

    NASA Technical Reports Server (NTRS)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrating software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach that simultaneously generates, from a high-level specification, both the code and all annotations required to certify it. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.
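
    The role of annotations as independently checkable certificates can be illustrated in miniature: a loop invariant stated next to the code can be checked mechanically. A real pipeline like the one described would discharge the obligation with a theorem prover rather than runtime assertions; the summation example below is hypothetical and not from AUTOBAYES.

    ```python
    # Sketch: a loop invariant as an explicit, checkable annotation.
    # Here the invariant is checked at runtime; a certification pipeline
    # would instead turn it into a proof obligation for a theorem prover.

    def certified_sum(xs):
        total, i = 0, 0
        while i < len(xs):
            # invariant: total == sum(xs[0:i]) at the top of every iteration
            assert total == sum(xs[:i])
            total += xs[i]
            i += 1
        assert total == sum(xs)  # postcondition
        return total

    print(certified_sum([3, 1, 4, 1, 5]))  # 14
    ```
    
    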

  19. A QM/MM investigation of the catalytic mechanism of metal-ion-independent core 2 β1,6-N-acetylglucosaminyltransferase.

    PubMed

    Tvaroška, Igor; Kozmon, Stanislav; Wimmerová, Michaela; Koča, Jaroslav

    2013-06-17

    β1,6-GlcNAc-transferase (C2GnT) is an important controlling factor of biological functions for many glycoproteins, and its activity has been found to be altered in breast, colon, and lung cancer cells, in leukemia cells, in the lymphomonocytes of multiple sclerosis patients, in leukocytes from diabetes patients, and in conditions causing an immune deficiency. The result of the action of C2GnT is the core 2 structure that is essential for the further elongation of the carbohydrate chains of O-glycans. The catalytic mechanism of this metal-ion-independent glycosyltransferase is of paramount importance and is investigated here by using quantum mechanical (QM; density functional theory, DFT)/molecular modeling (MM) methods with different levels of theory. The structural model of the reaction site used in this report is based on the crystal structures of C2GnT. The entire enzyme-substrate system was subdivided into two different subsystems: the QM subsystem containing 206 atoms and the MM region containing 5914 atoms. Three predefined reaction coordinates were employed to investigate the catalytic mechanism. The calculated potential energy surfaces revealed the existence of a concerted SN2-like mechanism. In this mechanism, a nucleophilic attack by O6 facilitated by proton transfer to the catalytic base and the separation of the leaving group all occur almost simultaneously. The transition state for the proposed reaction mechanism at the M06-2X/6-31G** (with diffuse functions on the O1', O5', OGlu, and O6 atoms) level was located at C1-O6=1.74 Å and C1-O1=2.86 Å. The activation energy for this mechanism was estimated to be between 20 and 29 kcal mol⁻¹, depending on the method used. These calculations also identified a low-barrier hydrogen bond between the nucleophile O6H and the catalytic base Glu320, and a hydrogen bond between the N-acetamino group and the glycosidic oxygen of the donor in the TS. It is proposed that these interactions contribute to a

  20. Uplink Coding

    NASA Technical Reports Server (NTRS)

    Pollara, Fabrizio; Hamkins, Jon; Dolinar, Sam; Andrews, Ken; Divsalar, Dariush

    2006-01-01

    This viewgraph presentation reviews uplink coding. The purpose and goals of the briefing are (1) Show a plan for using uplink coding and describe benefits (2) Define possible solutions and their applicability to different types of uplink, including emergency uplink (3) Concur with our conclusions so we can embark on a plan to use proposed uplink system (4) Identify the need for the development of appropriate technology and infusion in the DSN (5) Gain advocacy to implement uplink coding in flight projects Action Item EMB04-1-14 -- Show a plan for using uplink coding, including showing where it is useful or not (include discussion of emergency uplink coding).

  1. An extended version of the SERPENT-2 code to investigate fuel burn-up and core material evolution of the Molten Salt Fast Reactor

    NASA Astrophysics Data System (ADS)

    Aufiero, M.; Cammi, A.; Fiorina, C.; Leppänen, J.; Luzzi, L.; Ricotti, M. E.

    2013-10-01

    In this work, the Monte Carlo burn-up code SERPENT-2 has been extended and employed to study the material isotopic evolution of the Molten Salt Fast Reactor (MSFR). This promising GEN-IV nuclear reactor concept features peculiar characteristics such as on-line fuel reprocessing, which prevents the use of commonly available burn-up codes. In addition, the presence of circulating nuclear fuel and radioactive streams from the core to the reprocessing plant requires a precise knowledge of the fuel isotopic composition during plant operation. The developed extension of SERPENT-2 directly takes into account the effects of on-line fuel reprocessing on burn-up calculations and features a reactivity control algorithm. It is assessed here against a dedicated version of the deterministic ERANOS-based EQL3D procedure (PSI, Switzerland) and adopted to analyze the MSFR fuel salt isotopic evolution. Particular attention is devoted to studying the effects of reprocessing time constants and efficiencies on the conversion ratio and the molar concentration of elements relevant for solubility issues (e.g., trivalent actinides and lanthanides). Quantities of interest for fuel handling and safety issues are investigated, including decay heat and activities of hazardous isotopes (neutron and high-energy gamma emitters) in the core and in the reprocessing stream. The radiotoxicity generation is also analyzed for the MSFR nominal conditions. The production of helium and the depletion in tungsten content due to nuclear reactions are calculated for the nickel-based alloy selected as the reactor structural material of the MSFR. These preliminary evaluations can be helpful in studying the radiation damage of both the primary salt container and the axial reflectors.
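
    Why on-line reprocessing must enter the burn-up equations directly can be seen from a single-nuclide balance: continuous removal adds a term eps/tau (removal efficiency over reprocessing time constant) to the loss rate, lowering the equilibrium inventory. The nuclide data below are purely illustrative, not MSFR values or SERPENT-2 output.

    ```python
    # Sketch: single-nuclide depletion with a continuous on-line removal
    # term. dN/dt = P - (lam + eps/tau) * N, solved analytically.
    # All rates and constants are illustrative assumptions.
    import math

    def inventory(t, P, lam, eps, tau, N0=0.0):
        """Analytic solution of dN/dt = P - (lam + eps/tau) * N."""
        k = lam + eps / tau          # total loss rate: decay + reprocessing
        return P / k + (N0 - P / k) * math.exp(-k * t)

    P, lam = 1.0e15, 1.0e-6          # production (atoms/s), decay (1/s)
    t_end = 1.0e8                    # long enough to reach equilibrium
    no_rep = inventory(t_end, P, lam, eps=0.0, tau=1.0)
    with_rep = inventory(t_end, P, lam, eps=1.0, tau=86400.0)  # 1-day constant
    print(no_rep > with_rep)  # removal lowers the equilibrium inventory
    ```
    
    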

  2. Understanding independence

    NASA Astrophysics Data System (ADS)

    Annan, James; Hargreaves, Julia

    2016-04-01

    In order to perform any Bayesian processing of a model ensemble, we need a prior over the ensemble members. In the case of multimodel ensembles such as CMIP, the historical approach of "model democracy" (i.e. equal weight for all models in the sample) is no longer credible (if it ever was) due to model duplication and inbreeding. The question of "model independence" is central to the question of prior weights. However, although this question has been repeatedly raised, it has not yet been satisfactorily addressed. Here I will discuss the issue of independence and present a theoretical foundation for understanding and analysing the ensemble in this context. I will also present some simple examples showing how these ideas may be applied and developed.

  3. Investigation of plant control strategies for the supercritical CO{sub 2} Brayton cycle for a sodium-cooled fast reactor using the plant dynamics code.

    SciTech Connect

    Moisseytsev, A.; Sienicki, J.

    2011-04-12

    The development of a control strategy for the supercritical CO{sub 2} (S-CO{sub 2}) Brayton cycle has been extended to the investigation of alternate control strategies for a Sodium-Cooled Fast Reactor (SFR) nuclear power plant incorporating a S-CO{sub 2} Brayton cycle power converter. The SFR assumed is the 400 MWe (1000 MWt) ABR-1000 preconceptual design incorporating metallic fuel. Three alternative idealized schemes for controlling the reactor side of the plant in combination with the existing automatic control strategy for the S-CO{sub 2} Brayton cycle are explored using the ANL Plant Dynamics Code together with the SAS4A/SASSYS-1 Liquid Metal Reactor (LMR) Analysis Code System, coupled together using the iterative coupling formulation previously developed and implemented into the Plant Dynamics Code. The first option assumes that the reactor side can be ideally controlled through movement of control rods and changing the speeds of both the primary and intermediate coolant system sodium pumps, such that the intermediate sodium flow rate and inlet temperature to the sodium-to-CO{sub 2} heat exchanger (RHX) remain unvarying while the intermediate sodium outlet temperature changes as the load demand from the electric grid changes and the S-CO{sub 2} cycle conditions adjust according to the S-CO{sub 2} cycle control strategy. For this option, the reactor plant follows an assumed change in load demand from 100 to 0 % nominal at 5 % reduction per minute in a suitable fashion. The second option allows the reactor core power and primary and intermediate coolant system sodium pump flow rates to change autonomously in response to the strong reactivity feedbacks of the metallic fueled core and assumed constant pump torques representing unchanging output from the pump electric motors. The plant's behavior under the assumed load demand reduction is surprisingly close to that calculated for the first option. 
The only negative result observed is a slight increase in the intermediate

  4. Sharing code.

    PubMed

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks a focus on researchers' needs. In comparison, OSF offers a one-stop solution for researchers, but a lot of its functionality is still under development. I conclude by listing alternative, lesser-known tools for sharing code and materials.

  5. An Investigation of the Relationship of Intellective and Personality Variables to Success in an Independent Study Science Course Through the Use of a Modified Multiple Regression Model.

    ERIC Educational Resources Information Center

    Szabo, Michael; Feldhusen, John F.

    This is an empirical study of selected learner characteristics and their relation to academic success, as indicated by course grades, in a structured independent study learning program. This program, called the Audio-Tutorial System, was utilized in an undergraduate college course in the biological sciences. By use of multiple regression analysis,…

  6. The Utility of CBM Written Language Indices: An Investigation of Production-Dependent, Production-Independent, and Accurate-Production Scores

    ERIC Educational Resources Information Center

    Jewell, Jennifer; Malecki, Christine Kerres

    2005-01-01

    This study examined the utility of three categories of CBM written language indices including production-dependent indices (Total Words Written, Words Spelled Correctly, and Correct Writing Sequences), production-independent indices (Percentage of Words Spelled Correctly and Percentage of Correct Writing Sequences), and an accurate-production…

  7. Investigation of the thermal response of a gasdynamic heater with helical impellers. Calspan Report No. 6961-A-1. [MAZE and TACO2D codes

    SciTech Connect

    Rae, W. J.

    1981-12-01

    A gasdynamic heater, capable of producing contamination-free gas streams at temperatures up to 9000°K, is being developed by the Vulcan project. The design of a cooling system for the case parts and the associated thermal analysis are a critical part of a successful design. The purpose of the present work was to perform a preliminary cooling passage design and complete thermal analysis for the center body liner, end plate liners and exit nozzle. The approach envisioned for this work was the use of a set of LLNL finite-element codes, called MAZE and TACO2D. These were to be used at LLNL in a series of visits by the Calspan principal investigator. The project was cancelled shortly after the first of these visits; this report contains a summary of the work accomplished during the abbreviated contract period, and a review of the items that will need to be considered when the work is resumed at some future date.

  8. An investigative study of multispectral data compression for remotely-sensed images using vector quantization and difference-mapped shift-coding

    NASA Technical Reports Server (NTRS)

    Jaggi, S.

    1993-01-01

    A study is conducted to investigate the effects and advantages of data compression techniques on multispectral imagery data acquired by NASA's airborne scanners at the Stennis Space Center. The first technique used was vector quantization. The vector is defined in the multispectral imagery context as an array of pixels from the same location from each channel. The error obtained in substituting the reconstructed images for the original set is compared for different compression ratios. Also, the eigenvalues of the covariance matrix obtained from the reconstructed data set are compared with the eigenvalues of the original set. The effects of varying the size of the vector codebook on the quality of the compression and on subsequent classification are also presented. The output data from the Vector Quantization algorithm were further compressed by a lossless technique called Difference-mapped Shift-extended Huffman coding. The overall compression for 7 channels of data acquired by the Calibrated Airborne Multispectral Scanner (CAMS) was 19.5:1 (0.41 bpp) with an RMS error of 15.8 pixels, and 18:1 (0.447 bpp) with an RMS error of 3.6 pixels. The algorithms were implemented in software and interfaced with the help of dedicated image processing boards to an 80386 PC compatible computer. Modules were developed for the task of image compression and image analysis. Also, supporting software to perform image processing for visual display and interpretation of the compressed/classified images was developed.
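
The vector-quantization step described in this abstract can be sketched briefly: each multispectral "vector" stacks the co-located pixel values from every channel, and the codebook replaces each vector with its nearest codeword. The sketch below uses a naive k-means codebook on synthetic data; the codebook size and the k-means training procedure are assumptions for illustration, not the CAMS implementation.

```python
import numpy as np

def train_codebook(vectors, codebook_size, iters=20, seed=0):
    """Naive k-means codebook for vector quantization.
    vectors: (n, channels) array; each row holds the values at one
    pixel location across all spectral channels."""
    rng = np.random.default_rng(seed)
    codebook = vectors[rng.choice(len(vectors), codebook_size, replace=False)].copy()
    for _ in range(iters):
        # assign every vector to its nearest codeword (Euclidean distance)
        dist = np.linalg.norm(vectors[:, None, :] - codebook[None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        # move each codeword to the centroid of its assigned vectors
        for k in range(codebook_size):
            members = vectors[labels == k]
            if len(members):
                codebook[k] = members.mean(axis=0)
    return codebook, labels

# Synthetic 7-channel "image" flattened to (n_pixels, 7):
rng = np.random.default_rng(1)
pixels = rng.normal(size=(1000, 7))
codebook, labels = train_codebook(pixels, codebook_size=16)
reconstructed = codebook[labels]          # lossy reconstruction from codewords
rms = float(np.sqrt(((pixels - reconstructed) ** 2).mean()))
```

Only the per-pixel labels (plus the small codebook) need to be stored, which is where the compression comes from; the label stream is what a lossless stage such as Huffman coding would then shrink further.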

  9. Comet assay in reconstructed 3D human epidermal skin models—investigation of intra- and inter-laboratory reproducibility with coded chemicals

    PubMed Central

    Pfuhler, Stefan

    2013-01-01

    Reconstructed 3D human epidermal skin models are being used increasingly for safety testing of chemicals. Based on EpiDerm™ tissues, an assay was developed in which the tissues were topically exposed to test chemicals for 3h followed by cell isolation and assessment of DNA damage using the comet assay. Inter-laboratory reproducibility of the 3D skin comet assay was initially demonstrated using two model genotoxic carcinogens, methyl methane sulfonate (MMS) and 4-nitroquinoline-n-oxide, and the results showed good concordance among three different laboratories and with in vivo data. In Phase 2 of the project, intra- and inter-laboratory reproducibility was investigated with five coded compounds with different genotoxicity liability tested at three different laboratories. For the genotoxic carcinogens MMS and N-ethyl-N-nitrosourea, all laboratories reported a dose-related and statistically significant increase (P < 0.05) in DNA damage in every experiment. For the genotoxic carcinogen, 2,4-diaminotoluene, the overall result from all laboratories showed a smaller, but significant genotoxic response (P < 0.05). For cyclohexanone (CHN) (non-genotoxic in vitro and in vivo, and non-carcinogenic), an increase compared to the solvent control acetone was observed only in one laboratory. However, the response was not dose related and CHN was judged negative overall, as was p-nitrophenol (p-NP) (genotoxic in vitro but not in vivo and non-carcinogenic), which was the only compound showing clear cytotoxic effects. For p-NP, significant DNA damage generally occurred only at doses that were substantially cytotoxic (>30% cell loss), and the overall response was comparable in all laboratories despite some differences in doses tested. The results of the collaborative study for the coded compounds were generally reproducible among the laboratories involved and intra-laboratory reproducibility was also good. These data indicate that the comet assay in EpiDerm™ skin models is a

  10. Phonological coding during reading

    PubMed Central

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound-based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eyetracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Orden, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  11. Developing independence.

    PubMed

    Turnbull, A P; Turnbull, H R

    1985-03-01

    The transition from living a life as others want (dependence) to living it as the adolescent wants to live it (independence) is extraordinarily difficult for most teen-agers and their families. The difficulty is compounded in the case of adolescents with disabilities. They are often denied access to the same opportunities of life that are accessible to the nondisabled. They face special problems in augmenting their inherent capacities so that they can take fuller advantage of the accommodations that society makes in an effort to grant them access. In particular, they need training designed to increase their capacities to make, communicate, implement, and evaluate their own life-choices. The recommendations made in this paper are grounded in the long-standing tradition of parens patriae and enlightened paternalism; they seek to be deliberately and cautiously careful about the lives of adolescents with disabilities and their families. We based them on the recent tradition of anti-institutionalism and they are also consistent with some of the major policy directions of the past 15-20 years. These include: normalization, integration, and least-restrictive alternatives; the unity and integrity of the family; the importance of opportunities for self-advocacy; the role of consumer consent and choice in consumer-professional relationships; the need for individualized services; the importance of the developmental model as a basis for service delivery; the value of economic productivity of people with disabilities; and the rights of habilitation, amelioration, and prevention. PMID:3156827

  13. True uniaxial compressive strengths of rock or coal specimens are independent of diameter-to-length ratios. Report of Investigations/1990

    SciTech Connect

    Babcock, C.O.

    1990-01-01

    The compressive strength of a rock or coal test specimen in the laboratory, or of a pillar in a mine, comes partly from the material's physical properties and partly from the constraint provided by the loading stresses. Much confusion in pillar design comes from assigning the total strength change to geometry, as evidenced by the many pillar design equations with width to height as the primary variable. In tests by the U.S. Bureau of Mines, compressive strengths for cylindrical specimens of limestone, marble, sandstone, and coal were independent of the specimen test geometry when the end friction was removed. A conventional uniaxial compressive strength test between two steel platens is actually a uniaxial force and not a uniaxial stress test. The biaxial or triaxial state of stress for much of the test volume changes with the geometry of the test specimen. By removing the end friction supplied by the steel platens to the specimen, a more nearly uniaxial stress state independent of the specimen geometry is produced in the specimen. Pillar design is a constraint and physical property problem rather than a geometry problem. Roof and floor constraint are major factors in pillar design and strength.

  14. Contemporary accuracy of death certificates for coding prostate cancer as a cause of death: Is reliance on death certification good enough? A comparison with blinded review by an independent cause of death evaluation committee

    PubMed Central

    Turner, Emma L; Metcalfe, Chris; Donovan, Jenny L; Noble, Sian; Sterne, Jonathan A C; Lane, J Athene; Walsh, Eleanor I; Hill, Elizabeth M; Down, Liz; Ben-Shlomo, Yoav; Oliver, Steven E; Evans, Simon; Brindle, Peter; Williams, Naomi J; Hughes, Laura J; Davies, Charlotte F; Ng, Siaw Yein; Neal, David E; Hamdy, Freddie C; Albertsen, Peter; Reid, Colette M; Oxley, Jon; McFarlane, John; Robinson, Mary C; Adolfsson, Jan; Zietman, Anthony; Baum, Michael; Koupparis, Anthony; Martin, Richard M

    2016-01-01

    Background: Accurate cause of death assignment is crucial for prostate cancer epidemiology and trials reporting prostate cancer-specific mortality outcomes. Methods: We compared death certificate information with independent cause of death evaluation by an expert committee within a prostate cancer trial (2002–2015). Results: Of 1236 deaths assessed, expert committee evaluation attributed 523 (42%) to prostate cancer, agreeing with death certificate cause of death in 1134 cases (92%, 95% CI: 90%, 93%). The sensitivity of death certificates in identifying prostate cancer deaths as classified by the committee was 91% (95% CI: 89%, 94%); specificity was 92% (95% CI: 90%, 94%). Sensitivity and specificity were lower where death occurred within 1 year of diagnosis, and where there was another primary cancer diagnosis. Conclusions: UK death certificates accurately identify cause of death in men with prostate cancer, supporting their use in routine statistics. Possible differential misattribution by trial arm supports independent evaluation in randomised trials. PMID:27253172
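
The sensitivity, specificity, and agreement figures reported in this abstract all derive from the 2x2 cross-classification of certificate cause against the committee's gold-standard assignment. A minimal sketch with made-up cell counts (the abstract does not give the trial's actual table):

```python
import math

def sens_spec(tp, fn, fp, tn):
    """Certificate performance against committee review as gold standard.
    tp/fn: committee-confirmed prostate cancer deaths the certificate
    did / did not identify; fp/tn: the analogous split for other causes."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    agreement = (tp + tn) / (tp + fn + fp + tn)
    return sensitivity, specificity, agreement

def wald_ci(p, n, z=1.96):
    """Simple normal-approximation (Wald) 95% confidence interval
    for a proportion estimated from n cases."""
    half = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - half), min(1.0, p + half)

# Hypothetical counts for illustration only, NOT the trial's data:
sens, spec, agree = sens_spec(tp=90, fn=10, fp=8, tn=92)
lo, hi = wald_ci(sens, n=100)
```

Published analyses typically use exact or score intervals rather than the Wald interval shown here; it is used only to keep the sketch self-contained.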

  15. Two-terminal video coding.

    PubMed

    Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei

    2009-03-01

    Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to save the sum rate over independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both motion vectors and motion-compensated residual frames of the right sequence are generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.

  16. Functional Investigation of a Non-coding Variant Associated with Adolescent Idiopathic Scoliosis in Zebrafish: Elevated Expression of the Ladybird Homeobox Gene Causes Body Axis Deformation

    PubMed Central

    Guo, Long; Yamashita, Hiroshi; Kou, Ikuyo; Takimoto, Aki; Meguro-Horike, Makiko; Horike, Shin-ichi; Sakuma, Tetsushi; Miura, Shigenori; Adachi, Taiji; Yamamoto, Takashi; Ikegawa, Shiro; Hiraki, Yuji; Shukunami, Chisa

    2016-01-01

    Previously, we identified an adolescent idiopathic scoliosis susceptibility locus near human ladybird homeobox 1 (LBX1) and FLJ41350 by a genome-wide association study. Here, we characterized the associated non-coding variant and investigated the function of these genes. A chromosome conformation capture assay revealed that the genome region with the most significantly associated single nucleotide polymorphism (rs11190870) physically interacted with the promoter region of LBX1-FLJ41350. The promoter in the direction of LBX1, combined with a 590-bp region including rs11190870, had higher transcriptional activity with the risk allele than that with the non-risk allele in HEK 293T cells. The ubiquitous overexpression of human LBX1 or either of the zebrafish lbx genes (lbx1a, lbx1b, and lbx2), but not FLJ41350, in zebrafish embryos caused body curvature followed by death prior to vertebral column formation. Such body axis deformation was not observed in transcription activator-like effector nucleases mediated knockout zebrafish of lbx1b or lbx2. Mosaic expression of lbx1b driven by the GATA2 minimal promoter and the lbx1b enhancer in zebrafish significantly alleviated the embryonic lethal phenotype to allow observation of the later onset of the spinal curvature with or without vertebral malformation. Deformation of the embryonic body axis by lbx1b overexpression was associated with defects in convergent extension, which is a component of the main axis-elongation machinery in gastrulating embryos. In embryos overexpressing lbx1b, wnt5b, a ligand of the non-canonical Wnt/planar cell polarity (PCP) pathway, was significantly downregulated. Injection of mRNA for wnt5b or RhoA, a key downstream effector of Wnt/PCP signaling, rescued the defective convergent extension phenotype and attenuated the lbx1b-induced curvature of the body axis. Thus, our study presents a novel pathological feature of LBX1 and its zebrafish homologs in body axis deformation at various stages of

  17. Material-dependent and material-independent selection processes in the frontal and parietal lobes: an event-related fMRI investigation of response competition

    NASA Technical Reports Server (NTRS)

    Hazeltine, Eliot; Bunge, Silvia A.; Scanlon, Michael D.; Gabrieli, John D. E.

    2003-01-01

    The present study used the flanker task [Percept. Psychophys. 16 (1974) 143] to identify neural structures that support response selection processes, and to determine which of these structures respond differently depending on the type of stimulus material associated with the response. Participants performed two versions of the flanker task while undergoing event-related functional magnetic resonance imaging (fMRI). Both versions of the task required participants to respond to a central stimulus regardless of the responses associated with simultaneously presented flanking stimuli, but one used colored circle stimuli and the other used letter stimuli. Competition-related activation was identified by comparing Incongruent trials, in which the flanker stimuli indicated a different response than the central stimulus, to Neutral stimuli, in which the flanker stimuli indicated no response. A region within the right inferior frontal gyrus exhibited significantly more competition-related activation for the color stimuli, whereas regions within the middle frontal gyri of both hemispheres exhibited more competition-related activation for the letter stimuli. The border of the right middle frontal and inferior frontal gyri and the anterior cingulate cortex (ACC) were significantly activated by competition for both types of stimulus materials. Posterior foci demonstrated a similar pattern: left inferior parietal cortex showed greater competition-related activation for the letters, whereas right parietal cortex was significantly activated by competition for both materials. These findings indicate that the resolution of response competition invokes both material-dependent and material-independent processes.

  18. An investigation of herpes simplex virus promoter activity compatible with latency establishment reveals VP16-independent activation of immediate-early promoters in sensory neurones.

    PubMed

    Proença, João T; Coleman, Heather M; Nicoll, Michael P; Connor, Viv; Preston, Christopher M; Arthur, Jane; Efstathiou, Stacey

    2011-11-01

    Herpes simplex virus (HSV) type-1 establishes lifelong latency in sensory neurones and it is widely assumed that latency is the consequence of a failure to initiate virus immediate-early (IE) gene expression. However, using a Cre reporter mouse system in conjunction with Cre-expressing HSV-1 recombinants we have previously shown that activation of the IE ICP0 promoter can precede latency establishment in at least 30% of latently infected cells. During productive infection of non-neuronal cells, IE promoter activation is largely dependent on the transactivator VP16 a late structural component of the virion. Of significance, VP16 has recently been shown to exhibit altered regulation in neurones; where its de novo synthesis is necessary for IE gene expression during both lytic infection and reactivation from latency. In the current study, we utilized the Cre reporter mouse model system to characterize the full extent of viral promoter activity compatible with cell survival and latency establishment. In contrast to the high frequency activation of representative IE promoters prior to latency establishment, cell marking using a virus recombinant expressing Cre under VP16 promoter control was very inefficient. Furthermore, infection of neuronal cultures with VP16 mutants reveals a strong VP16 requirement for IE promoter activity in non-neuronal cells, but not sensory neurones. We conclude that only IE promoter activation can efficiently precede latency establishment and that this activation is likely to occur through a VP16-independent mechanism. PMID:21752961

  19. Evaluating Reanalysis - Independent Observations and Observation Independence

    NASA Astrophysics Data System (ADS)

    Wahl, S.; Bollmeyer, C.; Danek, C.; Friederichs, P.; Keller, J. D.; Ohlwein, C.

    2014-12-01

    Reanalyses on global to regional scales are widely used for validation of meteorological or hydrological models and for many climate applications. However, the evaluation of the reanalyses itself is still a crucial task. A major challenge is the lack of independent observations, since most of the available observational data is already included, e.g. by the data assimilation scheme. Here, we focus on the evaluation of dynamical reanalyses which are obtained by using numerical weather prediction models with a fixed data assimilation scheme. Precipitation is generally not assimilated in dynamical reanalyses (except for e.g. latent heat nudging) and thereby provides valuable data for the evaluation of reanalysis. Since precipitation results from the complex dynamical and microphysical atmospheric processes, an accurate representation of precipitation is often used as an indicator for a good model performance. Here, we use independent observations of daily precipitation accumulations from European rain gauges (E-OBS) of the years 2008 and 2009 for the intercomparison of various regional reanalyses products for the European CORDEX domain (Hirlam reanalysis at 0.2°, Metoffice UM reanalysis at 0.11°, COSMO reanalysis at 0.055°). This allows for assessing the benefits of increased horizontal resolution compared to global reanalyses. Furthermore, the effect of latent heat nudging (assimilation of radar-derived rain rates) is investigated using an experimental setup of the COSMO reanalysis with 6km and 2km resolution for summer 2011. Further, we present an observation-independent evaluation based on kinetic energy spectra. Such spectra should follow a k^-3 dependence of the wave number k at larger scales, and a k^-5/3 dependence on the mesoscale. We compare the spectra of the aforementioned regional reanalyses in order to investigate the general capability of the reanalyses to resolve events on the mesoscale (e.g. effective resolution). The intercomparison and
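
The spectral-slope check mentioned in this abstract (k^-3 at larger scales, k^-5/3 on the mesoscale) amounts to a log-log linear fit of the power spectrum against wavenumber. A self-contained sketch on a synthetic 1-D signal with a prescribed -5/3 spectrum; this is illustrative only, since real reanalysis spectra are computed from 2-D wind fields:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096
k = np.fft.rfftfreq(n, d=1.0)[1:]        # positive wavenumbers (DC dropped)

# Synthesize a signal whose energy spectrum follows k^(-5/3):
# amplitude ~ k^(-5/6) with random phases, so |F(k)|^2 ~ k^(-5/3).
half_spectrum = np.concatenate(
    [[0.0], k ** (-5 / 6) * np.exp(1j * rng.uniform(0, 2 * np.pi, k.size))]
)
signal = np.fft.irfft(half_spectrum, n)

# Recover the spectral slope with a log-log linear fit
# (DC and Nyquist bins excluded from the fit).
power = np.abs(np.fft.rfft(signal))[1:-1] ** 2
slope = np.polyfit(np.log(k[:-1]), np.log(power), 1)[0]   # close to -5/3
```

A reanalysis whose fitted slope steepens well below -5/3 at high wavenumbers is dissipating mesoscale energy, which is how such fits are used to estimate effective resolution.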

  20. Is ADHD a Risk Factor Independent of Conduct Disorder for Illicit Substance Use? A Meta-Analysis and Meta-Regression Investigation

    ERIC Educational Resources Information Center

    Serra-Pinheiro, Maria Antonia; Coutinho, Evandro S. F.; Souza, Isabella S.; Pinna, Camilla; Fortes, Didia; Araujo, Catia; Szobot, Claudia M.; Rohde, Luis A.; Mattos, Paulo

    2013-01-01

    Objective: To investigate meta-analytically if the association between ADHD and illicit substance use (ISU) is maintained when controlling for conduct disorder/oppositional-defiant disorder (CD/ODD). Method: A systematic literature review was conducted through Medline from 1980 to 2008. Data extracted and selections made by one author were…

  1. A dynamic population model to investigate effects of climate and climate-independent factors on the lifecycle of the tick Amblyomma americanum (Acari: Ixodidae)

    USGS Publications Warehouse

    Ludwig, Antoinette; Ginsberg, Howard; Hickling, Graham J.; Ogden, Nicholas H.

    2015-01-01

    The lone star tick, Amblyomma americanum, is a disease vector of significance for human and animal health throughout much of the eastern United States. To model the potential effects of climate change on this tick, a better understanding is needed of the relative roles of temperature-dependent and temperature-independent (day-length-dependent behavioral or morphogenetic diapause) processes acting on the tick lifecycle. In this study, we explored the roles of these processes by simulating seasonal activity patterns using models with site-specific temperature and day-length-dependent processes. We first modeled the transitions from engorged larvae to feeding nymphs, engorged nymphs to feeding adults, and engorged adult females to feeding larvae. The simulated seasonal patterns were compared against field observations at three locations in the United States. Simulations suggested that 1) during the larva-to-nymph transition, some larvae undergo no diapause while others undergo morphogenetic diapause of engorged larvae; 2) molted adults undergo behavioral diapause during the transition from nymph-to-adult; and 3) there is no diapause during the adult-to-larva transition. A model constructed to simulate the full lifecycle of A. americanum successfully predicted observed tick activity at the three U.S. study locations. Some differences between observed and simulated seasonality patterns were observed, however, identifying the need for research to refine some model parameters. In simulations run using temperature data for Montreal, deterministic die-out of A. americanum populations did not occur, suggesting the possibility that current climate in parts of southern Canada is suitable for survival and reproduction of this tick.

  2. A Dynamic Population Model to Investigate Effects of Climate and Climate-Independent Factors on the Lifecycle of Amblyomma americanum (Acari: Ixodidae).

    PubMed

    Ludwig, Antoinette; Ginsberg, Howard S; Hickling, Graham J; Ogden, Nicholas H

    2016-01-01

    The lone star tick, Amblyomma americanum, is a disease vector of significance for human and animal health throughout much of the eastern United States. To model the potential effects of climate change on this tick, a better understanding is needed of the relative roles of temperature-dependent and temperature-independent (day-length-dependent behavioral or morphogenetic diapause) processes acting on the tick lifecycle. In this study, we explored the roles of these processes by simulating seasonal activity patterns using models with site-specific temperature and day-length-dependent processes. We first modeled the transitions from engorged larvae to feeding nymphs, engorged nymphs to feeding adults, and engorged adult females to feeding larvae. The simulated seasonal patterns were compared against field observations at three locations in the United States. Simulations suggested that 1) during the larva-to-nymph transition, some larvae undergo no diapause while others undergo morphogenetic diapause of engorged larvae; 2) molted adults undergo behavioral diapause during the transition from nymph-to-adult; and 3) there is no diapause during the adult-to-larva transition. A model constructed to simulate the full lifecycle of A. americanum successfully predicted observed tick activity at the three U.S. study locations. Some differences between observed and simulated seasonality patterns were observed, however, identifying the need for research to refine some model parameters. In simulations run using temperature data for Montreal, deterministic die-out of A. americanum populations did not occur, suggesting the possibility that current climate in parts of southern Canada is suitable for survival and reproduction of this tick.

  3. MCNP code

    SciTech Connect

    Cramer, S.N.

    1984-01-01

    The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States which is available in the public domain. The present code is the direct descendant of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each stage of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids.

  4. QR Codes

    ERIC Educational Resources Information Center

    Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien

    2013-01-01

    This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…

  5. Constructions for finite-state codes

    NASA Technical Reports Server (NTRS)

    Pollara, F.; Mceliece, R. J.; Abdel-Ghaffar, K.

    1987-01-01

    A class of codes called finite-state (FS) codes is defined and investigated. These codes, which generalize both block and convolutional codes, are defined by their encoders, which are finite-state machines with parallel inputs and outputs. A family of upper bounds on the free distance of a given FS code is derived from known upper bounds on the minimum distance of block codes. A general construction for FS codes is then given, based on the idea of partitioning a given linear block code into cosets of one of its subcodes, and it is shown that in many cases the FS codes constructed in this way have a d_free which is as large as possible. These codes are found without the need for lengthy computer searches, and have potential applications for future deep-space coding systems. The issue of catastrophic error propagation (CEP) for FS codes is also investigated.
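
The coset-partition idea in this abstract can be made concrete on a toy example: take a small binary linear block code, pick a subcode, and split the code into cosets of that subcode (in the FS-code construction, cosets are then assigned to encoder states/branches). A minimal sketch, not the paper's actual construction:

```python
from itertools import product

def span(generators, n):
    """All codewords of the binary linear code spanned by `generators`."""
    words = set()
    for coeffs in product([0, 1], repeat=len(generators)):
        word = tuple(
            sum(c * g[i] for c, g in zip(coeffs, generators)) % 2 for i in range(n)
        )
        words.add(word)
    return words

n = 3
code = span([(1, 1, 0), (1, 0, 1)], n)    # [3,2] even-weight code, 4 words
subcode = span([(0, 1, 1)], n)            # subcode {000, 011}

# Partition the code into cosets of the subcode: w + subcode for each w.
cosets, seen = [], set()
for w in sorted(code):
    if w not in seen:
        coset = {tuple((a + b) % 2 for a, b in zip(w, s)) for s in subcode}
        cosets.append(coset)
        seen |= coset
```

Here the four codewords fall into two cosets of size two; the cosets are disjoint and together cover the whole code, which is the property the construction relies on.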

  6. Cell-free synthesis of functional human epidermal growth factor receptor: Investigation of ligand-independent dimerization in Sf21 microsomal membranes using non-canonical amino acids

    PubMed Central

    Quast, Robert B.; Ballion, Biljana; Stech, Marlitt; Sonnabend, Andrei; Varga, Balázs R.; Wüstenhagen, Doreen A.; Kele, Péter; Schiller, Stefan M.; Kubick, Stefan

    2016-01-01

    Cell-free protein synthesis systems represent versatile tools for the synthesis and modification of human membrane proteins. In particular, eukaryotic cell-free systems provide a promising platform for their structural and functional characterization. Here, we present the cell-free synthesis of functional human epidermal growth factor receptor and its vIII deletion mutant in a microsome-containing system derived from cultured Sf21 cells. We provide evidence for embedment of cell-free synthesized receptors into microsomal membranes and asparagine-linked glycosylation. Using the cricket paralysis virus internal ribosome entry site and a repetitive synthesis approach enrichment of receptors inside the microsomal fractions was facilitated thereby providing analytical amounts of functional protein. Receptor tyrosine kinase activation was demonstrated by monitoring receptor phosphorylation. Furthermore, an orthogonal cell-free translation system that provides the site-directed incorporation of p-azido-L-phenylalanine is characterized and applied to investigate receptor dimerization in the absence of a ligand by photo-affinity cross-linking. Finally, incorporated azides are used to generate stable covalently linked receptor dimers by strain-promoted cycloaddition using a novel linker system. PMID:27670253

  7. The effects of a conceptual change coupled-inquiry cycle investigation on student understanding of the independence of mass in rolling motion on an incline plane

    NASA Astrophysics Data System (ADS)

    Rowley, Eric Noel

    The Conceptual Change Coupled-Inquiry Cycle is designed to incorporate learning cycle, inquiry, and conceptual change instructional models. The purpose of this study was to examine the impact of the Conceptual Change Coupled-Inquiry Cycle on first-year, high school students' misconceptions of Newton's Laws and incline motion. This was a mixed-method, quasi-experimental study in which student notebook and test data were collected and analyzed using both quantitative and qualitative methods. A Stuart-Maxwell chi-square was used to assess the quantitative significance of changes in student conceptual understanding of incline motion at each phase of the Conceptual Change Coupled-Inquiry Cycle. Qualitative analysis of the notebooks provided important support of the quantitative findings. Results indicate that students report a better understanding of incline motion and Newton's Laws as a result of completing a Conceptual Change Coupled-Inquiry Cycle investigation. Furthermore, quantitative analysis of the notebooks, using the Stuart-Maxwell chi-square test, indicates significant increases in student understanding of Newton's Laws and incline motion at the alpha = 0.05 level. Analysis of student test data was largely inconclusive. This study indicates the Conceptual Change Coupled-Inquiry Cycle helps students better understand incline motion and Newton's Laws. Significant decreases in the number of students reporting misconceptions about incline motion were evident. Evidence suggests the Conceptual Change Coupled-Inquiry Cycle is an effective learning cycle and that it can improve student understanding of science concepts.
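
The Stuart-Maxwell test used above checks marginal homogeneity of a paired k x k table (e.g., each student's conception category before vs. after a phase). A minimal numpy sketch, with hypothetical counts:

```python
import numpy as np

def stuart_maxwell(table):
    """Stuart-Maxwell chi-square statistic for marginal homogeneity of a
    k x k paired contingency table; compare against chi-square with k-1 df."""
    t = np.asarray(table, dtype=float)
    k = t.shape[0]
    # row-minus-column marginal differences, dropping the last category
    d = (t.sum(axis=1) - t.sum(axis=0))[:-1]
    # covariance matrix of those differences under the null hypothesis
    S = np.empty((k - 1, k - 1))
    for i in range(k - 1):
        for j in range(k - 1):
            S[i, j] = (t[i].sum() + t[:, i].sum() - 2 * t[i, i]
                       if i == j else -(t[i, j] + t[j, i]))
    return float(d @ np.linalg.solve(S, d))

# Hypothetical pre/post conception categories for 44 students
pre_post = [[10, 1, 0],
            [5, 10, 1],
            [3, 4, 10]]
```

Here `stuart_maxwell(pre_post)` is about 7.37, exceeding the 5.99 critical value of chi-square with 2 df at alpha = 0.05, so the marginals shifted significantly; a perfectly symmetric table gives a statistic of 0.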

  8. Experimental investigation of neutronic characteristics of the IR-8 reactor to confirm the results of calculations by MCU-PTR code

    SciTech Connect

    Surkov, A. V. Kochkin, V. N.; Pesnya, Yu. E.; Nasonov, V. A.; Vihrov, V. I.; Erak, D. Yu.

    2015-12-15

    A comparison of measured and calculated neutronic characteristics (fast neutron flux and fission rate of 235U) in the core and reflector of the IR-8 reactor is presented. The irradiation devices equipped with neutron activation detectors were prepared. The determination of fast neutron flux was performed using the 54Fe(n, p) and 58Ni(n, p) reactions. The 235U fission rate was measured using uranium dioxide with 10% enrichment in 235U. The determination of specific activities of detectors was carried out by measuring the intensity of characteristic gamma peaks using the ORTEC gamma spectrometer. Neutron fields in the core and reflector of the IR-8 reactor were calculated using the MCU-PTR code.

  9. Seals Code Development Workshop

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C. (Compiler); Liang, Anita D. (Compiler)

    1996-01-01

    The 1995 Seals Workshop industrial code (INDSEAL) release includes ICYL, GCYLT, IFACE, GFACE, SPIRALG, SPIRALI, DYSEAL, and KTK. The scientific code (SCISEAL) release includes conjugate heat transfer and multidomain with rotordynamic capability. Several seals and bearings codes (e.g., HYDROFLEX, HYDROTRAN, HYDROB3D, FLOWCON1, FLOWCON2) are presented and results compared. Current computational and experimental emphasis includes multiple connected cavity flows with goals of reducing parasitic losses and gas ingestion. Labyrinth seals continue to play a significant role in sealing with face, honeycomb, and new sealing concepts under investigation for advanced engine concepts in view of strict environmental constraints. The clean sheet approach to engine design is advocated with program directions and anticipated percentage SFC reductions cited. Future activities center on engine applications with coupled seal/power/secondary flow streams.

  10. Implementation issues in source coding

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Chen, Yun-Chung; Hadenfeldt, A. C.

    1989-01-01

    An edge preserving image coding scheme which can be operated in both a lossy and a lossless manner was developed. The technique is an extension of the lossless encoding algorithm developed for the Mars observer spectral data. It can also be viewed as a modification of the DPCM algorithm. A packet video simulator was also developed from an existing modified packet network simulator. The coding scheme for this system is a modification of the mixture block coding (MBC) scheme described in the last report. Coding algorithms for packet video were also investigated.
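
The DPCM idea underlying the lossless mode can be sketched in a few lines (the previous-sample predictor below is an assumption for illustration; the actual Mars observer algorithm uses its own predictor):

```python
import numpy as np

def dpcm_encode(samples):
    """Lossless DPCM with a previous-sample predictor: transmit the
    prediction residuals (the first sample is predicted as 0)."""
    x = np.asarray(samples, dtype=np.int64)
    return np.diff(x, prepend=0)

def dpcm_decode(residuals):
    """Invert the encoder by accumulating the residuals."""
    return np.cumsum(residuals)
```

For smooth signals the residuals cluster near zero, which is what makes the subsequent entropy coding effective; a lossy variant would quantize the residuals inside the prediction loop.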

  11. The Integrated TIGER Series Codes

    SciTech Connect

    Kensek, Ronald P.; Franke, Brian C.; Laub, Thomas W.

    2006-01-15

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  13. Independent Peer Reviews

    SciTech Connect

    2012-03-16

    Independent Assessments: DOE's Systems Integrator convenes independent technical reviews to gauge progress toward meeting specific technical targets and to provide technical information necessary for key decisions.

  14. Covariance Matrix Evaluations for Independent Mass Fission Yields

    SciTech Connect

    Terranova, N.; Serot, O.; Archier, P.; De Saint Jean, C.

    2015-01-15

    Recent needs for more accurate fission product yields include covariance information to allow improved uncertainty estimations of the parameters used by design codes. The aim of this work is to investigate the possibility to generate more reliable and complete uncertainty information on independent mass fission yields. Mass yields covariances are estimated through a convolution between the multi-Gaussian empirical model based on Brosa's fission modes, which describe the pre-neutron mass yields, and the average prompt neutron multiplicity curve. The covariance generation task has been approached using the Bayesian generalized least squared method through the CONRAD code. Preliminary results on mass yields variance-covariance matrix will be presented and discussed from physical grounds in the case of 235U(nth, f) and 239Pu(nth, f) reactions.

  15. Covariance Matrix Evaluations for Independent Mass Fission Yields

    NASA Astrophysics Data System (ADS)

    Terranova, N.; Serot, O.; Archier, P.; De Saint Jean, C.; Sumini, M.

    2015-01-01

    Recent needs for more accurate fission product yields include covariance information to allow improved uncertainty estimations of the parameters used by design codes. The aim of this work is to investigate the possibility to generate more reliable and complete uncertainty information on independent mass fission yields. Mass yields covariances are estimated through a convolution between the multi-Gaussian empirical model based on Brosa's fission modes, which describe the pre-neutron mass yields, and the average prompt neutron multiplicity curve. The covariance generation task has been approached using the Bayesian generalized least squared method through the CONRAD code. Preliminary results on mass yields variance-covariance matrix will be presented and discussed from physical grounds in the case of 235U(nth, f) and 239Pu(nth, f) reactions.
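
The structure of such a covariance evaluation can be illustrated with a minimal Monte Carlo stand-in: sample mass-yield curves from a multi-Gaussian ("Brosa-mode"-like) model with perturbed mode parameters and take the sample covariance. This is not the CONRAD Bayesian generalized-least-squares method, and the mode parameters below are made up for illustration:

```python
import numpy as np

def sample_yield_curve(masses, modes, rng):
    """One realization of a mass-yield curve from a multi-Gaussian model,
    with mode positions and weights perturbed to mimic parameter uncertainty."""
    y = np.zeros_like(masses, dtype=float)
    for mu, sigma, weight in modes:
        mu_s = rng.normal(mu, 0.5)                 # perturbed mode position
        w_s = rng.normal(weight, 0.02 * weight)    # perturbed mode weight
        y += w_s * np.exp(-0.5 * ((masses - mu_s) / sigma) ** 2)
    return y / y.sum()                             # normalize the yield curve

def yield_covariance(masses, modes, n_samples=2000, seed=0):
    """Sample covariance matrix of the yield curves over parameter draws."""
    rng = np.random.default_rng(seed)
    draws = np.array([sample_yield_curve(masses, modes, rng)
                      for _ in range(n_samples)])
    return np.cov(draws, rowvar=False)
```

Because every sampled curve is normalized to the same total yield, the resulting covariance matrix has rows that sum to (numerically) zero, a standard consistency check for normalized-yield covariances.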

  16. Role of long non-coding RNA HULC in cell proliferation, apoptosis and tumor metastasis of gastric cancer: a clinical and in vitro investigation.

    PubMed

    Zhao, Yan; Guo, Qinhao; Chen, Jiejing; Hu, Jun; Wang, Shuwei; Sun, Yueming

    2014-01-01

    Long non-coding RNAs (lncRNAs) are emerging as key molecules in human cancer. Highly upregulated in liver cancer (HULC), an lncRNA, has recently been revealed to be involved in hepatocellular carcinoma development and progression. It remains unclear, however, whether HULC plays an oncogenic role in human gastric cancer (GC). In the present study, we demonstrated that HULC was significantly overexpressed in GC cell lines and GC tissues compared with normal controls, and this overexpression was correlated with lymph node metastasis, distant metastasis and advanced tumor node metastasis stages. In addition, a receiver operating characteristic (ROC) curve was constructed to evaluate the diagnostic values and the area under the ROC curve of HULC was up to 0.769. To uncover its functional importance, gain- and loss-of-function studies were performed to evaluate the effect of HULC on cell proliferation, apoptosis and invasion in vitro. Overexpression of HULC promoted proliferation and invasion and inhibited cell apoptosis in SGC7901 cells, while knockdown of HULC in SGC7901 cells showed the opposite effect. Mechanistically, we discovered that overexpression of HULC could induce autophagy in SGC7901 cells; more importantly, inhibition of autophagy increased apoptosis in HULC-overexpressing cells. We also determined that silencing of HULC effectively reversed the epithelial-to-mesenchymal transition (EMT) phenotype. In summary, our results suggest that HULC may play an important role in the growth and tumorigenesis of human GC, which provides us with a new biomarker in GC and perhaps a potential target for GC prevention, diagnosis and therapeutic treatment.

  17. A preliminary investigation of Large Eddy Simulation (LES) of the flow around a cylinder at ReD = 3900 using a commercial CFD code

    SciTech Connect

    Paschkewitz, J S

    2006-02-14

    Engineering fluid mechanics simulations at high Reynolds numbers have traditionally been performed using the Reynolds-Averaged Navier-Stokes (RANS) equations and a turbulence model. The RANS methodology has well-documented shortcomings in the modeling of separated or bluff body wake flows that are characterized by unsteady vortex shedding. The resulting turbulence statistics are strongly influenced by the detailed structure and dynamics of the large eddies, which are poorly captured using RANS models (Rodi 1997; Krishnan et al. 2004). The Large Eddy Simulation (LES) methodology offers the potential to more accurately simulate these flows, as it resolves the large-scale unsteady motions and entails modeling of only the smallest-scale turbulence structures. Commercial computational fluid dynamics products are beginning to offer LES capability, allowing practicing engineers an opportunity to apply this turbulence modeling technique to a much wider array of problems than in dedicated research codes. Here, we present a preliminary evaluation of the LES capability in the commercial CFD solver StarCD by simulating the flow around a cylinder at a Reynolds number, based on the cylinder diameter D, of 3900 using the constant coefficient Smagorinsky LES model. The results are compared to both the experimental and computational results provided in Kravchenko & Moin (2000). We find that StarCD provides predictions of lift and drag coefficients that are within 15% of the experimental values. Reasonable agreement is obtained between the time-averaged velocity statistics and the published data. The differences in these metrics may be due to the use of a truncated domain in the spanwise direction and the short time-averaging period used for the statistics presented here. The instantaneous flow field visualizations show a coarser, larger-scale structure than the study of Kravchenko & Moin (2000), which may be a product of the LES implementation or of the domain and resolution used.

  18. Regularized robust coding for face recognition.

    PubMed

    Yang, Meng; Zhang, Lei; Yang, Jian; Zhang, David

    2013-05-01

    Recently the sparse representation based classification (SRC) has been proposed for robust face recognition (FR). In SRC, the testing image is coded as a sparse linear combination of the training samples, and the representation fidelity is measured by the l2-norm or l1-norm of the coding residual. Such a sparse coding model assumes that the coding residual follows a Gaussian or Laplacian distribution, which may not be effective enough to describe the coding residual in practical FR systems. Meanwhile, the sparsity constraint on the coding coefficients makes the computational cost of SRC very high. In this paper, we propose a new face coding model, namely regularized robust coding (RRC), which could robustly regress a given signal with regularized regression coefficients. By assuming that the coding residual and the coding coefficient are respectively independent and identically distributed, the RRC seeks a maximum a posteriori solution of the coding problem. An iteratively reweighted regularized robust coding (IR(3)C) algorithm is proposed to solve the RRC model efficiently. Extensive experiments on representative face databases demonstrate that the RRC is much more effective and efficient than state-of-the-art sparse representation based methods in dealing with face occlusion, corruption, lighting, and expression changes, etc.
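
The iteratively reweighted idea can be sketched generically: down-weight samples with large coding residuals, then re-solve a regularized least-squares fit. The Huber-style weight rule below is a stand-in for illustration, not the paper's exact IR(3)C weight function:

```python
import numpy as np

def irls_coding(D, y, lam=1e-3, delta=1.0, n_iter=30):
    """Iteratively reweighted regularized least-squares sketch of robust
    coding. D: dictionary of training samples (m x n); y: test signal (m,).
    Large residuals get weight delta/|r| (Huber-style), small ones weight 1."""
    n = D.shape[1]
    alpha = np.zeros(n)
    for _ in range(n_iter):
        r = y - D @ alpha                           # coding residual
        w = np.where(np.abs(r) <= delta, 1.0,
                     delta / np.maximum(np.abs(r), 1e-12))
        Dw = D * w[:, None]                         # row-weighted dictionary
        # weighted ridge update: (D'WD + lam*I) alpha = D'Wy
        alpha = np.linalg.solve(D.T @ Dw + lam * np.eye(n), Dw.T @ y)
    return alpha
```

With a gross outlier in `y` (the analogue of an occluded pixel), the reweighted fit stays close to the true coefficients while ordinary least squares is pulled far off.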

  19. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    PubMed

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, the multiple-coding and the single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control.

  20. On the error probability of general tree and trellis codes with applications to sequential decoding

    NASA Technical Reports Server (NTRS)

    Johannesson, R.

    1973-01-01

    An upper bound on the average error probability for maximum-likelihood decoding of the ensemble of random binary tree codes is derived and shown to be independent of the length of the tree. An upper bound on the average error probability for maximum-likelihood decoding of the ensemble of random L-branch binary trellis codes of rate R = 1/n is derived which separates the effects of the tail length T and the memory length M of the code. It is shown that the bound is independent of the length L of the information sequence. This implication is investigated by computer simulations of sequential decoding utilizing the stack algorithm. These simulations confirm the implication and further suggest an empirical formula for the true undetected decoding error probability with sequential decoding.

  1. Homological stabilizer codes

    SciTech Connect

    Anderson, Jonas T.

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. Highlights: We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. We find and classify all 2D homological stabilizer codes. We find optimal codes among the homological stabilizer codes.

  2. Investigating the Influence of the Added Mass Effect to Marine Hydrokinetic Horizontal-Axis Turbines Using a General Dynamic Wake Wind Turbine Code: Preprint

    SciTech Connect

    Maniaci, D. C.; Li, Y.

    2012-04-01

    This paper describes a recent study to investigate the applicability of a horizontal-axis wind turbine (HAWT) structural dynamics and unsteady aerodynamics analysis program (FAST and AeroDyn respectively) to modeling the forces on marine hydrokinetic (MHK) turbines. It summarizes the added mass model that has been added to AeroDyn. The added mass model only includes flow acceleration perpendicular to the rotor disc, and ignores added mass forces caused by blade deflection. A model of the National Renewable Energy Laboratory's (NREL) Unsteady Aerodynamics Experiment (UAE) Phase VI wind turbine was analyzed using FAST and AeroDyn with sea water conditions and the new added mass model. The results of this analysis exhibited a 3.6% change in thrust for a rapid pitch case and a slight change in amplitude and phase of thrust for a case with 30 degrees of yaw.

  3. Investigating the Influence of the Added Mass Effect to Marine Hydrokinetic Horizontal-Axis Turbines Using a General Dynamic Wake Wind Turbine Code

    SciTech Connect

    Maniaci, D. C.; Li, Y.

    2011-10-01

    This paper describes a recent study to investigate the applicability of a horizontal-axis wind turbine (HAWT) structural dynamics and unsteady aerodynamics analysis program (FAST and AeroDyn respectively) to modeling the forces on marine hydrokinetic (MHK) turbines. This paper summarizes the added mass model that has been added to AeroDyn. The added mass model only includes flow acceleration perpendicular to the rotor disc, and ignores added mass forces caused by blade deflection. A model of the National Renewable Energy Laboratory's (NREL) Unsteady Aerodynamics Experiment (UAE) Phase VI wind turbine was analyzed using FAST and AeroDyn with sea water conditions and the new added mass model. The results of this analysis exhibited a 3.6% change in thrust for a rapid pitch case and a slight change in amplitude and phase of thrust for a case with 30 degrees of yaw.

  4. Investigating the impact of parental status and depression symptoms on the early perceptual coding of infant faces: an event-related potential study.

    PubMed

    Noll, Laura K; Mayes, Linda C; Rutherford, Helena J V

    2012-01-01

    Infant faces are highly salient social stimuli that appear to elicit intuitive parenting behaviors in healthy adult women. Behavioral and observational studies indicate that this effect may be modulated by experiences of reproduction, caregiving, and psychiatric symptomatology that affect normative attention and reward processing of infant cues. However, relatively little is known about the neural correlates of these effects. Using the event-related potential (ERP) technique, this study investigated the impact of parental status (mother, non-mother) and depression symptoms on early visual processing of infant faces in a community sample of adult women. Specifically, the P1 and N170 ERP components elicited in response to infant face stimuli were examined. While characteristics of the N170 were not modulated by parental status, a statistically significant positive correlation was observed between depression symptom severity and N170 amplitude. This relationship was not observed for the P1. These results suggest that depression symptoms may modulate early neurophysiological responsiveness to infant cues, even at sub-clinical levels. PMID:22435403

  5. Coding of Neuroinfectious Diseases.

    PubMed

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue. PMID:26633789

  6. Model Children's Code.

    ERIC Educational Resources Information Center

    New Mexico Univ., Albuquerque. American Indian Law Center.

    The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…

  7. To Code or Not To Code?

    ERIC Educational Resources Information Center

    Parkinson, Brian; Sandhu, Parveen; Lacorte, Manel; Gourlay, Lesley

    1998-01-01

    This article considers arguments for and against the use of coding systems in classroom-based language research and touches on some relevant considerations from ethnographic and conversational analysis approaches. The four authors each explain and elaborate on their practical decision to code or not to code events or utterances at a specific point…

  8. Exceptional error minimization in putative primordial genetic codes

    PubMed Central

    2009-01-01

    Background The standard genetic code is redundant and has a highly non-random structure. Codons for the same amino acids typically differ only by the nucleotide in the third position, whereas similar amino acids are encoded, mostly, by codon series that differ by a single base substitution in the third or the first position. As a result, the code is highly albeit not optimally robust to errors of translation, a property that has been interpreted either as a product of selection directed at the minimization of errors or as a non-adaptive by-product of evolution of the code driven by other forces. Results We investigated the error-minimization properties of putative primordial codes that consisted of 16 supercodons, with the third base being completely redundant, using a previously derived cost function and the error minimization percentage as the measure of a code's robustness to mistranslation. It is shown that, when the 16-supercodon table is populated with 10 putative primordial amino acids, inferred from the results of abiotic synthesis experiments and other evidence independent of the code's evolution, and with minimal assumptions used to assign the remaining supercodons, the resulting 2-letter codes are nearly optimal in terms of the error minimization level. Conclusion The results of the computational experiments with putative primordial genetic codes that contained only two meaningful letters in all codons and encoded 10 to 16 amino acids indicate that such codes are likely to have been nearly optimal with respect to the minimization of translation errors. This near-optimality could be the outcome of extensive early selection during the co-evolution of the code with the primordial, error-prone translation system, or a result of a unique, accidental event. Under this hypothesis, the subsequent expansion of the code resulted in a decrease of the error minimization level that became sustainable owing to the evolution of a high-fidelity translation system.

  9. Energy efficient rateless codes for high speed data transfer over free space optical channels

    NASA Astrophysics Data System (ADS)

    Prakash, Geetha; Kulkarni, Muralidhar; Acharya, U. S.

    2015-03-01

    Terrestrial Free Space Optical (FSO) links transmit information by using the atmosphere (free space) as a medium. In this paper, we have investigated the use of Luby Transform (LT) codes as a means to mitigate the effects of data corruption induced by an imperfect channel, which usually takes the form of lost or corrupted packets. LT codes, which are a class of Fountain codes, can be used independently of the channel rate, and as many code words as required can be generated to recover all the message bits irrespective of the channel performance. Achieving error-free high data rates with limited energy resources is possible with FSO systems if error correction codes with minimal power overhead can be used. We also employ a combination of Binary Phase Shift Keying (BPSK), with provision for modification of the threshold, and optimized LT codes with belief propagation decoding. These techniques provide additional protection even under strong turbulence regimes. Automatic Repeat Request (ARQ) is another method of improving link reliability. Performance of ARQ is limited by the number of retransmissions and the corresponding time delay. We prove through theoretical computations and simulations that LT codes consume less energy per bit. We validate the feasibility of using energy-efficient LT codes over ARQ for FSO links to be used in optical wireless sensor networks within eye safety limits.
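
A minimal sketch of the LT mechanism (a uniform degree distribution is used here for brevity; practical LT codes draw degrees from the robust soliton distribution, and this is not the authors' exact construction):

```python
import random

def lt_encode(blocks, n_packets, seed=0):
    """Each output packet XORs a random subset of the k message blocks and
    carries its index set (in practice the receiver re-derives the set from
    a shared seed rather than transmitting it)."""
    rng = random.Random(seed)
    k = len(blocks)
    packets = []
    for _ in range(n_packets):
        degree = rng.randint(1, k)                 # uniform degree, for brevity
        idxs = frozenset(rng.sample(range(k), degree))
        value = 0
        for i in idxs:
            value ^= blocks[i]
        packets.append((idxs, value))
    return packets

def lt_decode(packets, k):
    """Peeling (belief-propagation) decoder: whenever a packet has exactly
    one unknown block, that block is revealed; repeat until stuck or done."""
    packets = [(set(idxs), value) for idxs, value in packets]
    decoded = {}
    progress = True
    while progress and len(decoded) < k:
        progress = False
        for idxs, value in packets:
            unknown = idxs - set(decoded)
            if len(unknown) == 1:
                v = value
                for i in idxs & set(decoded):
                    v ^= decoded[i]
                decoded[unknown.pop()] = v
                progress = True
    return [decoded.get(i) for i in range(k)]
```

Because any sufficiently large set of received packets suffices, lost packets need not be retransmitted, which is exactly the property the paper exploits over ARQ.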

  10. The Proteomic Code: a molecular recognition code for proteins

    PubMed Central

    Biro, Jan C

    2007-01-01

    Background The Proteomic Code is a set of rules by which information in genetic material is transferred into the physico-chemical properties of amino acids. It determines how individual amino acids interact with each other during folding and in specific protein-protein interactions. The Proteomic Code is part of the redundant Genetic Code. Review The 25-year-old history of this concept is reviewed from the first independent suggestions by Biro and Mekler, through the works of Blalock, Root-Bernstein, Siemion, Miller and others, followed by the discovery of a Common Periodic Table of Codons and Nucleic Acids in 2003 and culminating in the recent conceptualization of partial complementary coding of interacting amino acids as well as the theory of the nucleic acid-assisted protein folding. Methods and conclusions A novel cloning method for the design and production of specific, high-affinity-reacting proteins (SHARP) is presented. This method is based on the concept of proteomic codes and is suitable for large-scale, industrial production of specifically interacting peptides. PMID:17999762

  11. Non-White, No More: Effect Coding as an Alternative to Dummy Coding with Implications for Higher Education Researchers

    ERIC Educational Resources Information Center

    Mayhew, Matthew J.; Simonoff, Jeffrey S.

    2015-01-01

    The purpose of this article is to describe effect coding as an alternative quantitative practice for analyzing and interpreting categorical, race-based independent variables in higher education research. Unlike indicator (dummy) codes that imply that one group will be a reference group, effect codes use average responses as a means for…
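
The contrast between the two coding schemes is easy to see in code (the group labels below are hypothetical):

```python
import numpy as np

levels = ["Asian", "Black", "Latino", "White"]   # hypothetical categories
reference = "White"                              # the omitted level

def dummy_row(group):
    """Indicator (dummy) coding: the reference level scores 0 on every
    column, so each coefficient is a contrast against that reference group."""
    return [1 if group == lvl else 0 for lvl in levels if lvl != reference]

def effect_row(group):
    """Effect coding: the reference level scores -1 on every column, so each
    coefficient is a deviation from the unweighted grand mean instead."""
    if group == reference:
        return [-1] * (len(levels) - 1)
    return [1 if group == lvl else 0 for lvl in levels if lvl != reference]

X_dummy = np.array([dummy_row(g) for g in levels])
X_effect = np.array([effect_row(g) for g in levels])
```

In a regression on `X_dummy`, every coefficient compares a group with the reference; on `X_effect`, every coefficient is that group's deviation from the grand mean, so no single group is silently treated as the norm.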

  12. Bare Code Reader

    NASA Astrophysics Data System (ADS)

    Clair, Jean J.

    1980-05-01

    The bar code system will be used in every market and supermarket. The code, which is standardized in the US and Europe (the EAN code), gives information on price, storage, and nature, and allows real-time management of the shop.

  13. CONTAIN independent peer review

    SciTech Connect

    Boyack, B.E.; Corradini, M.L.; Denning, R.S.; Khatib-Rahbar, M.; Loyalka, S.K.; Smith, P.N.

    1995-01-01

    The CONTAIN code was developed by Sandia National Laboratories under the sponsorship of the US Nuclear Regulatory Commission (NRC) to provide integrated analyses of containment phenomena. It is used to predict nuclear reactor containment loads, radiological source terms, and associated physical phenomena for a range of accident conditions encompassing both design-basis and severe accidents. The code's targeted applications include support for containment-related experimental programs, light water and advanced light water reactor plant analysis, and analytical support for resolution of specific technical issues such as direct containment heating. The NRC decided that a broad technical review of the code should be performed by technical experts to determine its overall technical adequacy. For this purpose, a six-member CONTAIN Peer Review Committee was organized and a peer review was conducted. While the review was in progress, the NRC issued a draft "Revised Severe Accident Code Strategy" that incorporated revised design objectives and targeted applications for the CONTAIN code. The Committee continued its effort to develop findings relative to the original NRC statement of design objectives and targeted applications; however, the revised design objectives and targeted applications were considered by the Committee in assigning priorities to its recommendations. The Committee determined some improvements are warranted and provided recommendations in five code-related areas: (1) documentation, (2) user guidance, (3) modeling capability, (4) code assessment, and (5) technical assessment.

  14. AEST: Adaptive Eigenvalue Stability Code

    NASA Astrophysics Data System (ADS)

    Zheng, L.-J.; Kotschenreuther, M.; Waelbroeck, F.; van Dam, J. W.; Berk, H.

    2002-11-01

    An adaptive eigenvalue linear stability code is developed. The aim is, on one hand, to include non-ideal MHD effects in the global MHD stability calculation for both low- and high-n modes and, on the other hand, to resolve the numerical difficulty involving the MHD singularity on the rational surfaces at marginal stability. Our code follows parts of the philosophy of DCON by abandoning relaxation methods based on radial finite element expansion in favor of an efficient shooting procedure with adaptive gridding. The δW criterion is replaced by the shooting procedure and a subsequent matrix eigenvalue problem. Since the technique of expanding a general solution into a summation of independent solutions is employed, the rank of the matrices involved is only a few hundred. This makes it easier to solve the eigenvalue problem with non-ideal MHD effects, such as FLR or even full kinetic effects, as well as plasma rotation, taken into account. To include kinetic effects, the approach of solving for the distribution function as a local eigenvalue ω problem, as in the GS2 code, will be employed in the future. Comparison of the ideal MHD version of the code with DCON, PEST, and GATO will be discussed. The non-ideal MHD version of the code will be employed to study, as an application, transport barrier physics in tokamak discharges.
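
    The shooting approach for eigenvalue problems described above can be illustrated on a toy problem. This sketch is not the AEST code, just a hedged illustration: it finds the lowest eigenvalue of y'' + λy = 0 with y(0) = y(1) = 0 by integrating from one boundary and bisecting on λ until the other boundary condition is satisfied (the exact answer is π²):

```python
def shoot(lam, n=2000):
    """Integrate y'' = -lam*y from x=0 with y(0)=0, y'(0)=1 (RK4 on a
    fixed grid; the real code uses adaptive gridding); return y(1)."""
    def f(y, v):
        return v, -lam * y
    h = 1.0 / n
    y, v = 0.0, 1.0
    for _ in range(n):
        k1y, k1v = f(y, v)
        k2y, k2v = f(y + 0.5 * h * k1y, v + 0.5 * h * k1v)
        k3y, k3v = f(y + 0.5 * h * k2y, v + 0.5 * h * k2v)
        k4y, k4v = f(y + h * k3y, v + h * k3v)
        y += h * (k1y + 2 * k2y + 2 * k3y + k4y) / 6
        v += h * (k1v + 2 * k2v + 2 * k3v + k4v) / 6
    return y

def eigenvalue(lo, hi, tol=1e-10):
    """Bisect on lam until the boundary residual y(1) vanishes;
    the bracket [lo, hi] must produce residuals of opposite sign."""
    flo = shoot(lo)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if shoot(mid) * flo > 0:
            lo, flo = mid, shoot(mid)
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(round(eigenvalue(5.0, 15.0), 4))  # -> 9.8696, i.e. pi**2
```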

  15. Accumulate repeat accumulate codes

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative channel coding scheme called 'Accumulate Repeat Accumulate' (ARA) codes. This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low-Density Parity-Check (LDPC) codes, so belief propagation can be used for iterative decoding of ARA codes on a graph. The encoder for this class can be viewed as a precoded Repeat Accumulate (RA) code or a precoded Irregular Repeat Accumulate (IRA) code, where an accumulator is simply chosen as the precoder. Thus ARA codes have a simple and very fast encoder structure when representing LDPC codes. Based on density evolution for LDPC codes, we show through some examples of ARA codes that for a maximum variable node degree of 5, a minimum bit SNR as low as 0.08 dB from channel capacity for rate 1/2 can be achieved as the block size goes to infinity. Thus, for a fixed low maximum variable node degree, their threshold outperforms not only RA and IRA codes but also the best known LDPC codes with the same maximum node degree. Furthermore, by puncturing the accumulators, any desired high-rate code close to rate 1 can be obtained with thresholds that stay uniformly close to the channel capacity thresholds. Iterative decoding simulation results are provided. ARA codes also have a projected graph, or protograph, representation that allows for high-speed decoder implementation.
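
    The serial encoder chain described above (an accumulator as precoder, followed by repetition, interleaving, and an inner accumulator) can be sketched as a toy example. The repeat factor, interleaver, and block length below are arbitrary illustrative choices, not the paper's actual protograph constructions or puncturing patterns:

```python
import random

def accumulate(bits):
    """Running mod-2 sum (a 1/(1+D) convolution): the 'accumulator'."""
    out, s = [], 0
    for b in bits:
        s ^= b
        out.append(s)
    return out

def ara_encode(info, q=2, seed=0):
    """Toy accumulate-repeat-accumulate chain:
    precode (accumulate) -> repeat q times -> interleave -> accumulate."""
    pre = accumulate(info)                 # accumulator used as precoder
    rep = [b for b in pre for _ in range(q)]
    perm = list(range(len(rep)))
    random.Random(seed).shuffle(perm)      # pseudo-random interleaver
    inter = [rep[i] for i in perm]
    return accumulate(inter)               # inner accumulator

cw = ara_encode([1, 0, 1, 1], q=2)
print(len(cw), cw)  # 8 code bits for 4 info bits (rate 1/2)
```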

  16. Bitplane Image Coding With Parallel Coefficient Processing.

    PubMed

    Auli-Llinas, Francesc; Enfedaque, Pablo; Moure, Juan C; Sanchez, Victor

    2016-01-01

    Image coding systems have traditionally been tailored for multiple instruction, multiple data (MIMD) computing. In general, they partition the (transformed) image into codeblocks that can be coded in the cores of MIMD-based processors. Each core executes a sequential flow of instructions to process the coefficients in the codeblock, independently and asynchronously from the other cores. Bitplane coding is a common strategy to code such data. Most of its mechanisms require sequential processing of the coefficients. Recent years have seen the rise of processing accelerators with enhanced computational performance and power efficiency whose architecture is mainly based on the single instruction, multiple data (SIMD) principle. SIMD computing refers to the execution of the same instruction on multiple data in a lockstep, synchronous way. Unfortunately, current bitplane coding strategies cannot fully profit from such processors due to the inherently sequential nature of the coding task. This paper presents bitplane image coding with parallel coefficient (BPC-PaCo) processing, a coding method that can process many coefficients within a codeblock in parallel and synchronously. To this end, the scanning order, the context formation, the probability model, and the arithmetic coder of the coding engine have been reformulated. The experimental results suggest that the penalty in coding performance of BPC-PaCo with respect to the traditional strategies is almost negligible.
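
    Bitplane coding as described scans the magnitudes of a codeblock's coefficients one bit level at a time, from most to least significant. A minimal sketch of the bitplane decomposition only (illustrative; real coders interleave significance, refinement, and context-modeling passes on top of this):

```python
def bitplanes(mags, nplanes=4):
    """Split coefficient magnitudes into bitplanes, most significant first.
    Within one plane, every coefficient contributes one bit, which is the
    granularity at which SIMD-style lockstep processing could operate."""
    return [[(m >> p) & 1 for m in mags] for p in range(nplanes - 1, -1, -1)]

mags = [5, 3, 12, 0]          # |coefficients| of a tiny codeblock
for p, plane in zip(range(3, -1, -1), bitplanes(mags)):
    print(p, plane)
# 3 [0, 0, 1, 0]
# 2 [1, 0, 1, 0]
# 1 [0, 1, 0, 0]
# 0 [1, 1, 0, 0]
```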

  17. Pulsed Inductive Thruster (PIT): Modeling and Validation Using the MACH2 Code

    NASA Technical Reports Server (NTRS)

    Schneider, Steven (Technical Monitor); Mikellides, Pavlos G.

    2003-01-01

    Numerical modeling of the Pulsed Inductive Thruster with the magnetohydrodynamics code MACH2 aims to provide bilateral validation of the thruster's measured performance and of the code's capability to capture the pertinent physical processes. Computed impulse values for helium and argon propellants demonstrate excellent correlation with the experimental data over a range of energy levels and propellant-mass values. The effects of the vacuum tank wall and of the mass-injection scheme were investigated and shown to produce only trivial changes in the overall performance. An idealized model for these energy levels and propellants indicates that the energy expended in the internal energy modes and plasma dissipation processes is independent of the propellant type, mass, and energy level.

  18. The investigation of tethered satellite system dynamics

    NASA Technical Reports Server (NTRS)

    Lorenzini, E. C.

    1986-01-01

    The analysis of the rotational dynamics of the satellite was focused on the increase of the satellite's rotational amplitude, with respect to the tether, during retrieval. The dependence of the rotational amplitude upon the tether tension variation to the power 1/4 was thoroughly investigated. The damping of rotational oscillations achievable by reel control was also quantified, while an alternative solution that makes use of a lever arm attached to the satellite with a universal joint was proposed. Comparison simulations of retrieval maneuvers between the Smithsonian Astrophysical Observatory and the Martin Marietta (MMA) computer codes were also carried out. The agreement between the two completely independent codes was extremely close, demonstrating the reliability of the models. The slack tether dynamics during reel jams was analytically investigated in order to identify the limits of applicability of the SLACK3 computer code to this particular case. Test runs with SLACK3 were also carried out.

  19. The cosmic code comparison project

    NASA Astrophysics Data System (ADS)

    Heitmann, Katrin; Lukić, Zarija; Fasel, Patricia; Habib, Salman; Warren, Michael S.; White, Martin; Ahrens, James; Ankeny, Lee; Armstrong, Ryan; O'Shea, Brian; Ricker, Paul M.; Springel, Volker; Stadel, Joachim; Trac, Hy

    2008-10-01

    Current and upcoming cosmological observations allow us to probe structures on smaller and smaller scales, entering highly nonlinear regimes. In order to obtain theoretical predictions in these regimes, large cosmological simulations have to be carried out. The promised high accuracy from observations makes the simulation task very demanding: the simulations have to be at least as accurate as the observations. This requirement can only be fulfilled by carrying out an extensive code verification program. The first step of such a program is the comparison of different cosmology codes including gravitational interactions only. In this paper, we extend a recently carried out code comparison project to include five more simulation codes. We restrict our analysis to a small cosmological volume which allows us to investigate properties of halos. For the matter power spectrum and the mass function, the previous results hold, with the codes agreeing at the 10% level over wide dynamic ranges. We extend our analysis to the comparison of halo profiles and investigate the halo count as a function of local density. We introduce and discuss ParaView as a flexible analysis tool for cosmological simulations, the use of which immensely simplifies the code comparison task.

  20. Efficient codes and balanced networks.

    PubMed

    Denève, Sophie; Machens, Christian K

    2016-03-01

    Recent years have seen a growing interest in inhibitory interneurons and their circuits. A striking property of cortical inhibition is how tightly it balances excitation. Inhibitory currents not only match excitatory currents on average, but track them on a millisecond time scale, whether they are caused by external stimuli or spontaneous fluctuations. We review, together with experimental evidence, recent theoretical approaches that investigate the advantages of such tight balance for coding and computation. These studies suggest a possible revision of the dominant view that neurons represent information with firing rates corrupted by Poisson noise. Instead, tight excitatory/inhibitory balance may be a signature of a highly cooperative code, orders of magnitude more precise than a Poisson rate code. Moreover, tight balance may provide a template that allows cortical neurons to construct high-dimensional population codes and learn complex functions of their inputs.

  1. Discussion on LDPC Codes and Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes, and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP) and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  2. Manually operated coded switch

    DOEpatents

    Barnette, Jon H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made.

  3. Parafermion stabilizer codes

    NASA Astrophysics Data System (ADS)

    Güngördü, Utkan; Nepal, Rabindra; Kovalev, Alexey A.

    2014-10-01

    We define and study parafermion stabilizer codes, which can be viewed as generalizations of Kitaev's one-dimensional (1D) model of unpaired Majorana fermions. Parafermion stabilizer codes can protect against low-weight errors acting on a small subset of parafermion modes in analogy to qudit stabilizer codes. Examples of several smallest parafermion stabilizer codes are given. A locality-preserving embedding of qudit operators into parafermion operators is established that allows one to map known qudit stabilizer codes to parafermion codes. We also present a local 2D parafermion construction that combines topological protection of Kitaev's toric code with additional protection relying on parity conservation.

  4. Transionospheric Propagation Code (TIPC)

    SciTech Connect

    Roussel-Dupre, R.; Kelley, T.A.

    1990-10-01

    The Transionospheric Propagation Code is a computer program developed at Los Alamos National Laboratory to perform certain tasks related to the detection of VHF signals following propagation through the ionosphere. The code is written in Fortran 77, runs interactively, and was designed to be as machine independent as possible. A menu format, in which the user is prompted to supply appropriate parameters for a given task, has been adopted for the input, while the output is primarily in the form of graphics. The user has the option of selecting from five basic tasks, namely transionospheric propagation, signal filtering, signal processing, DTOA study, and DTOA uncertainty study. For the first task a specified signal is convolved against the impulse response function of the ionosphere to obtain the transionospheric signal. The user is given a choice of four analytic forms for the input pulse or of supplying a tabular form. The option of adding Gaussian-distributed white noise or spectral noise to the input signal is also provided. The deterministic ionosphere is characterized to first order in terms of a total electron content (TEC) along the propagation path. In addition, a scattering model parameterized in terms of a frequency coherence bandwidth is also available. In the second task, detection is simulated by convolving a given filter response against the transionospheric signal. The user is given a choice of a wideband filter or a narrowband Gaussian filter. It is also possible to input a filter response. The third task provides for quadrature detection, envelope detection, and three different techniques for time-tagging the arrival of the transionospheric signal at specified receivers. The latter algorithms can be used to determine a TEC and thus take out the effects of the ionosphere to first order. Task four allows the user to construct a table of delta-times-of-arrival (DTOAs) vs TECs for a specified pair of receivers.
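
    To first order, the ionospheric group delay underlying these TEC calculations follows the standard dispersive formula dt = 40.3·TEC/(c·f²); this is textbook physics, not code taken from TIPC itself. As an illustration, the sketch below tabulates the delay difference between two VHF carrier frequencies versus TEC (frequencies and TEC values are arbitrary illustrative choices; the actual code additionally models scattering and receiver geometry):

```python
C = 299792458.0          # speed of light, m/s
K = 40.3                 # ionospheric dispersion constant, m^3/s^2

def group_delay(tec, freq):
    """First-order ionospheric group delay in seconds for a given
    TEC (electrons/m^2) and carrier frequency (Hz)."""
    return K * tec / (C * freq ** 2)

# Tabulate the dispersive delay difference between two VHF frequencies.
f1, f2 = 50e6, 75e6
for tec in (1e16, 5e16, 1e17):
    dt = group_delay(tec, f1) - group_delay(tec, f2)
    print(f"TEC={tec:.0e} el/m^2  delay difference={dt * 1e6:.3f} us")
```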

  5. ARA type protograph codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)

    2008-01-01

    An apparatus and method for encoding low-density parity check codes. Together with a repeater, an interleaver and an accumulator, the apparatus comprises a precoder, thus forming accumulate-repeat-accumulate (ARA) codes. Protographs representing various types of ARA codes, including AR3A, AR4A and ARJA codes, are described. High performance is obtained when compared to the performance of current repeat-accumulate (RA) or irregular-repeat-accumulate (IRA) codes.

  6. QR Codes 101

    ERIC Educational Resources Information Center

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  7. Code-Switching: L1-Coded Mediation in a Kindergarten Foreign Language Classroom

    ERIC Educational Resources Information Center

    Lin, Zheng

    2012-01-01

    This paper is based on a qualitative inquiry that investigated the role of teachers' mediation in three different modes of coding in a kindergarten foreign language classroom in China (i.e. L2-coded intralinguistic mediation, L1-coded cross-lingual mediation, and L2-and-L1-mixed mediation). Through an exploratory examination of the varying effects…

  8. Coding Strategies and Implementations of Compressive Sensing

    NASA Astrophysics Data System (ADS)

    Tsai, Tsung-Han

    information from a noisy environment. Using engineering efforts to accomplish the same task usually requires multiple detectors, advanced computational algorithms, or artificial intelligence systems. Compressive acoustic sensing incorporates acoustic metamaterials into compressive sensing theory to emulate the abilities of sound localization and selective attention. This research investigates and optimizes the sensing capacity and the spatial sensitivity of the acoustic sensor. The well-modeled acoustic sensor allows localizing multiple speakers in both stationary and dynamic auditory scenes, and distinguishing mixed conversations from independent sources with a high audio recognition rate.

  9. Fostering Musical Independence

    ERIC Educational Resources Information Center

    Shieh, Eric; Allsup, Randall Everett

    2016-01-01

    Musical independence has always been an essential aim of musical instruction. But this objective can refer to everything from high levels of musical expertise to more student choice in the classroom. While most conceptualizations of musical independence emphasize the demonstration of knowledge and skills within particular music traditions, this…

  10. Independent vs. Laboratory Papers.

    ERIC Educational Resources Information Center

    Wilson, Clint C., II

    1981-01-01

    Comparisons of independent and laboratory newspapers at selected California colleges indicated that (1) the independent newspapers were superior in editorial opinion and leadership characteristics; (2) the laboratory newspapers made better use of photography, art, and graphics; and (3) professional journalists highly rated their laboratory…

  11. American Independence. Fifth Grade.

    ERIC Educational Resources Information Center

    Crosby, Annette

    This fifth grade teaching unit covers early conflicts between the American colonies and Britain, battles of the American Revolutionary War, and the Declaration of Independence. Knowledge goals address the pre-revolutionary acts enforced by the British, the concepts of conflict and independence, and the major events and significant people from the…

  12. NERO- a post-maximum supernova radiation transport code

    NASA Astrophysics Data System (ADS)

    Maurer, I.; Jerkstrand, A.; Mazzali, P. A.; Taubenberger, S.; Hachinger, S.; Kromer, M.; Sim, S.; Hillebrandt, W.

    2011-12-01

    The interpretation of supernova (SN) spectra is essential for deriving SN ejecta properties such as density and composition, which in turn can tell us about their progenitors and the explosion mechanism. A very large number of atomic processes are important for spectrum formation. Several tools for calculating SN spectra exist, but they mainly focus on the very early or late epochs. The intermediate phase, which requires a non-local thermodynamic equilibrium (NLTE) treatment of radiation transport, has rarely been studied. In this paper, we present a new SN radiation transport code, NERO, which can handle those epochs. All atomic processes are treated in full NLTE, under a steady-state assumption. This is a valid approach between roughly 50 and 500 days after the explosion, depending on the SN type. This covers the post-maximum photospheric phase and the early and intermediate nebular phases. As a test, we compare NERO to the radiation transport code of Jerkstrand, Fransson & Kozma and to the nebular code of Mazzali et al. All three codes have been developed independently, and a comparison provides a valuable opportunity to investigate their reliability. Currently, NERO is one-dimensional and can be used for predicting spectra of synthetic explosion models or for deriving SN properties by spectral modelling. To demonstrate this, we study the spectra of the 'normal' Type Ia supernova (SN Ia) 2005cf between 50 and 350 days after the explosion and identify most of the common SN Ia line features at post-maximum epochs.

  13. Decoding and synchronization of error correcting codes

    NASA Astrophysics Data System (ADS)

    Madkour, S. A.

    1983-01-01

    Decoding devices for hard quantization and soft decision error correcting codes are discussed. A Meggitt decoder for Reed-Solomon polynomial codes was implemented and tested. It uses 49 TTL logic ICs, and a maximum binary frequency of 30 Mbit/sec is demonstrated. A soft decision decoding approach was applied to hard decision decoding, using the principles of threshold decoding. Simulation results indicate that the proposed scheme achieves satisfactory performance using only a small number of parity checks. The combined correction of substitution and synchronization errors is also analyzed. The algorithm presented shows the capability of convolutional codes to correct synchronization errors as well as independent additive errors without any additional redundancy.
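
    The Meggitt approach rests on syndrome computation: the remainder of the received polynomial modulo the generator identifies the error pattern. Reed-Solomon codes are nonbinary, so as a hedged stand-in the sketch below illustrates only the syndrome idea, on the binary cyclic (7,4) Hamming code with g(x) = x³ + x + 1:

```python
G = 0b1011  # g(x) = x^3 + x + 1, generator of the cyclic (7,4) Hamming code

def poly_mod(r, g=G, n=7):
    """Remainder of r(x) modulo g(x) over GF(2); bit i is the coeff of x^i."""
    glen = g.bit_length()
    for i in range(n - 1, glen - 2, -1):
        if r >> i & 1:
            r ^= g << (i - glen + 1)
    return r

def correct_single_error(r):
    """Match the syndrome against the syndromes of single-bit errors."""
    s = poly_mod(r)
    if s == 0:
        return r
    for i in range(7):
        if poly_mod(1 << i) == s:
            return r ^ (1 << i)
    return r  # not correctable with a single-bit error pattern

# Any polynomial multiple of g(x) is a codeword of the cyclic code;
# flip one bit of g(x)*x^2 and recover the original word.
cw = 0b1011 << 2
bad = cw ^ (1 << 4)
print(bin(correct_single_error(bad)))  # -> 0b101100
```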

  14. Certifying Auto-Generated Flight Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen

    2008-01-01

    Model-based design and automated code generation are being used increasingly at NASA. Many NASA projects now use MathWorks Simulink and Real-Time Workshop for at least some of their modeling and code development. However, there are substantial obstacles to more widespread adoption of code generators in safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. Moreover, the regeneration of code can require complete recertification, which offsets many of the advantages of using a generator. Indeed, manual review of autocode can be more challenging than for hand-written code. Since the direct V&V of code generators is too laborious and complicated due to their complex (and often proprietary) nature, we have developed a generator plug-in to support the certification of the auto-generated code. Specifically, the AutoCert tool supports certification by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews. The generated documentation also contains substantial tracing information, allowing users to trace between model, code, documentation, and V&V artifacts. This enables missions to obtain assurance about the safety and reliability of the code without excessive manual V&V effort and, as a consequence, eases the acceptance of code generators in safety-critical contexts. The generation of explicit certificates and textual reports is particularly well-suited to supporting independent V&V. The primary contribution of this approach is the combination of human-friendly documentation with formal analysis. The key technical idea is to exploit the idiomatic nature of auto-generated code in order to automatically infer logical annotations. 
The annotation inference algorithm…

  15. Subsystem codes with spatially local generators

    SciTech Connect

    Bravyi, Sergey

    2011-01-15

    We study subsystem codes whose gauge group has local generators in two-dimensional (2D) geometry. It is shown that there exists a family of such codes defined on lattices of size LxL with the number of logical qubits k and the minimum distance d both proportional to L. The gauge group of these codes involves only two-qubit generators of type XX and ZZ coupling nearest-neighbor qubits (and some auxiliary one-qubit generators). Our proof is not constructive as it relies on a certain version of the Gilbert-Varshamov bound for classical codes. Along the way, we introduce and study properties of generalized Bacon-Shor codes that might be of independent interest. Secondly, we prove that any 2D subsystem [n,k,d] code with spatially local generators obeys the upper bounds kd = O(n) and d^2 = O(n). The analogous upper bound proved recently for 2D stabilizer codes is kd^2 = O(n). Our results thus demonstrate that subsystem codes can be more powerful than stabilizer codes under the spatial locality constraint.

  16. Nonbinary Quantum Convolutional Codes Derived from Negacyclic Codes

    NASA Astrophysics Data System (ADS)

    Chen, Jianzhang; Li, Jianping; Yang, Fan; Huang, Yuanyuan

    2015-01-01

    In this paper, some families of nonbinary quantum convolutional codes are constructed by using negacyclic codes. These nonbinary quantum convolutional codes are different from quantum convolutional codes in the literature. Moreover, we construct a family of optimal quantum convolutional codes.

  17. Coded continuous wave meteor radar

    NASA Astrophysics Data System (ADS)

    Vierinen, Juha; Chau, Jorge L.; Pfeffer, Nico; Clahsen, Matthias; Stober, Gunter

    2016-03-01

    The concept of a coded continuous wave specular meteor radar (SMR) is described. The radar uses a continuously transmitted pseudorandom phase-modulated waveform, which has several advantages compared to conventional pulsed SMRs. The coding avoids range and Doppler aliasing, which are in some cases problematic with pulsed radars. Continuous transmissions maximize pulse compression gain, allowing operation at lower peak power than a pulsed system. With continuous coding, the temporal and spectral resolution are not dependent on the transmit waveform and can be changed fairly flexibly after performing a measurement. The low signal-to-noise ratio before pulse compression, combined with independent pseudorandom transmit waveforms, allows multiple geographically separated transmitters to be used in the same frequency band simultaneously without significantly interfering with each other. Because the same frequency band can be used by multiple transmitters, the same interferometric receiver antennas can be used to receive multiple transmitters at the same time. The principles of the signal processing are discussed, along with several practical ways to increase computation speed and how to optimally detect meteor echoes. Measurements from a campaign performed with a coded continuous wave SMR are shown and compared with two standard pulsed SMR measurements. The type of meteor radar described in this paper would be suited for use in a large-scale multi-static network of meteor radar transmitters and receivers. Such a system would be useful for increasing the number of meteor detections to obtain improved meteor radar data products.
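
    The pulse-compression principle described above, correlating the received samples against the known pseudorandom phase code to recover a weak delayed echo, can be sketched as follows. All parameters are illustrative; a real SMR works with complex baseband samples, Doppler processing, and much longer codes:

```python
import random

def prn_sequence(n, seed=1):
    """Pseudorandom +/-1 phase code (a stand-in for the transmit waveform)."""
    rng = random.Random(seed)
    return [rng.choice((-1, 1)) for _ in range(n)]

def cross_correlate(rx, code):
    """Pulse compression: correlate received samples with the known code."""
    n = len(code)
    return [sum(rx[lag + i] * code[i] for i in range(n))
            for lag in range(len(rx) - n + 1)]

code = prn_sequence(512)
delay = 37
# A weak echo of the code arriving 37 samples late, amplitude 0.3.
rx = [0.0] * delay + [0.3 * c for c in code] + [0.0] * 100
peak = cross_correlate(rx, code)
print(peak.index(max(peak)))  # recovers the injected delay of 37 samples
```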

  18. Groundwater flow code verification "benchmarking" activity (COVE-2A): Analysis of participants' work

    SciTech Connect

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project.

  19. Asymmetric quantum convolutional codes

    NASA Astrophysics Data System (ADS)

    La Guardia, Giuliano G.

    2016-01-01

    In this paper, we construct the first families of asymmetric quantum convolutional codes (AQCCs). These new AQCCs are constructed by means of the CSS-type construction applied to suitable families of classical convolutional codes, which are also constructed here. The new codes have non-catastrophic generator matrices, and they have great asymmetry. Since our constructions are performed algebraically, i.e. we develop general algebraic methods and properties to perform the constructions, it is possible to derive several families of such codes and not only codes with specific parameters. Additionally, several different types of such codes are obtained.

  20. Independent Replication and Meta-Analysis for Endometriosis Risk Loci.

    PubMed

    Sapkota, Yadav; Fassbender, Amelie; Bowdler, Lisa; Fung, Jenny N; Peterse, Daniëlle; O, Dorien; Montgomery, Grant W; Nyholt, Dale R; D'Hooghe, Thomas M

    2015-10-01

    Endometriosis is a complex disease that affects 6-10% of women in their reproductive years and 20-50% of women with infertility. Genome-wide and candidate-gene association studies for endometriosis have identified 10 independent risk loci, and of these, nine (rs7521902, rs13394619, rs4141819, rs6542095, rs1519761, rs7739264, rs12700667, rs1537377, and rs10859871) are polymorphic in European populations. Here we investigate the replication of nine SNP loci in 998 laparoscopically and histologically confirmed endometriosis cases and 783 disease-free controls from Belgium. SNPs rs7521902, rs13394619, and rs6542095 show nominally significant (p < .05) associations with endometriosis, while the directions of effect for seven SNPs are consistent with the original reports. Association of rs6542095 at the IL1A locus with 'All' (p = .066) and 'Grade_B' (p = .01) endometriosis is noteworthy because this is the first successful replication in an independent population. Meta-analysis with the published results yields genome-wide significant evidence for rs7521902, rs13394619, rs6542095, rs12700667, rs7739264, and rs1537377. Notably, three coding variants in GREB1 (near rs13394619) and CDKN2B-AS1 (near rs1537377) also showed nominally significant associations with endometriosis. Overall, this study provides important replication in a uniquely characterized independent population, and indicates that the majority of the original genome-wide association findings are not due to chance alone.

  1. Data Machine Independence

    1994-12-30

    Data-machine independence achieved by using four technologies (ASN.1, XDR, SDS, and ZEBRA) has been evaluated by encoding two different applications in each of the above and comparing the results against the standard programming method using C.

  2. Media independent interface

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The work done on the Media Independent Interface (MII) Interface Control Document (ICD) program is described, and recommendations based on it are made. Explanations and rationale for the content of the ICD itself are presented.

  3. Steroid-independent male sexual behavior in B6D2F2 male mice.

    PubMed

    McInnis, Christine M; Venu, Samitha; Park, Jin Ho

    2016-09-01

    It is well established that male sexual behavior (MSB) is regulated by gonadal steroids; however, individual differences in MSB, independent of gonadal steroids, are prevalent across a wide range of species, and further investigation is necessary to advance our understanding of steroid-independent MSB. Studies utilizing B6D2F1 hybrid male mice in which a significant proportion retain MSB after long-term orchidectomy, identified as steroid-independent-maters (SI-maters), have begun to unravel the genetic underpinnings of steroid-independent MSB. A recent study demonstrated that steroid-independent MSB is a heritable behavioral phenotype that is mainly passed down from B6D2F1 hybrid SI-maters when crossed with C57BL6J female mice. To begin to uncover whether the strain of the dam plays a role in the inheritance of steroid-independent MSB, B6D2F1 hybrid females were crossed with B6D2F1 hybrid males. While the present study confirms the finding that steroid-independent MSB is a heritable behavioral phenotype and that SI-mater sires are more likely to pass down some components of MSB than SI-non-maters to their offspring, it also reveals that the B6D2F2 male offspring that were identified as SI-maters that displayed the full repertoire of steroid-independent MSB had the same probability of being sired from either a B6D2F1 SI-mater or SI-non-mater. These results, in conjunction with previous findings, indicate that the specific chromosomal loci pattern that codes for steroid-independent MSB in the B6D2F2 male offspring may result regardless of whether the father was a SI-mater or SI-non-mater, and that the maternal strain may be an important factor in the inheritance of steroid-independent MSB. PMID:27476435

  4. Reviewing the Challenges and Opportunities Presented by Code Switching and Mixing in Bangla

    ERIC Educational Resources Information Center

    Hasan, Md. Kamrul; Akhand, Mohd. Moniruzzaman

    2015-01-01

    This paper investigates the issues related to code-switching/code-mixing in an ESL context. Some preliminary data on Bangla-English code-switching/code-mixing has been analyzed in order to determine which structural pattern of code-switching/code-mixing is predominant in different social strata. This study also explores the relationship of…

  5. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-02-20

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  6. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-01-01

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  7. QR Code Mania!

    ERIC Educational Resources Information Center

    Shumack, Kellie A.; Reilly, Erin; Chamberlain, Nik

    2013-01-01

    space, has error-correction capacity, and can be read from any direction. These codes are used in manufacturing, shipping, and marketing, as well as in education. QR codes can be created to produce…

  8. EMF wire code research

    SciTech Connect

    Jones, T.

    1993-11-01

    This paper examines the results of previous wire code research to determine the relationship between wire codes, electromagnetic fields, and childhood cancer. The paper suggests that, in the original Savitz study, biases toward producing a false positive association between high wire codes and childhood cancer were created by the selection procedure.

  9. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  10. Progress in The Semantic Analysis of Scientific Code

    NASA Technical Reports Server (NTRS)

    Stewart, Mark

    2000-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.

  11. Coding for Electronic Mail

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.

  12. XSOR codes users manual

    SciTech Connect

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named "XSOR". The purpose of XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.
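    The uncertainty-propagation idea described above (sampling input-parameter distributions through a fast parametric surrogate to estimate the uncertainty of source terms) can be sketched generically. The parameter names, ranges, and release-fraction formula below are illustrative assumptions, not the actual XSOR models:

    ```python
    import random
    import statistics

    def release_fraction(core_damage_fraction, scrubbing_factor):
        """Hypothetical parametric source-term model: the release fraction
        grows with core damage and is attenuated by scrubbing."""
        return core_damage_fraction * (1.0 - scrubbing_factor)

    def sample_source_term(n_samples=10000, seed=42):
        """Propagate input-parameter uncertainty by Monte Carlo sampling,
        as a fast parametric surrogate code would."""
        rng = random.Random(seed)
        samples = []
        for _ in range(n_samples):
            damage = rng.uniform(0.2, 0.8)   # assumed uncertainty range
            scrub = rng.uniform(0.5, 0.99)   # assumed uncertainty range
            samples.append(release_fraction(damage, scrub))
        return statistics.mean(samples), statistics.stdev(samples)

    mean, sd = sample_source_term()
    print(f"release fraction: mean={mean:.3f}, sd={sd:.3f}")
    ```

    Because the surrogate is cheap to evaluate, thousands of samples can be drawn in place of a single run of a detailed mechanistic code.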

  13. DLLExternalCode

    SciTech Connect

    Greg Flach, Frank Smith

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
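    The file-based coupling pattern described (write an input file for the external application, run it, read its outputs back) is generic and can be sketched in a few lines. The file formats and the stand-in command below are hypothetical; the real DLL drives whatever the instructions file specifies:

    ```python
    import os
    import subprocess
    import sys
    import tempfile

    def run_external_code(inputs, command=None):
        """Write inputs to a file, run an external application on it, and
        read its outputs back -- the pattern the linking DLL implements.
        `command` is a stand-in; by default a child process just copies
        inputs to outputs so the sketch is self-contained."""
        workdir = tempfile.mkdtemp()
        in_path = os.path.join(workdir, "model.in")
        out_path = os.path.join(workdir, "model.out")
        # 1. Create the input file from the list of code inputs.
        with open(in_path, "w") as f:
            for value in inputs:
                f.write(f"{value}\n")
        # 2. Run the external code (here: a trivial copy in a subprocess).
        if command is None:
            command = [sys.executable, "-c",
                       f"import shutil; shutil.copy({in_path!r}, {out_path!r})"]
        subprocess.run(command, check=True)
        # 3. Read the outputs created by the external application.
        with open(out_path) as f:
            return [float(line) for line in f]

    print(run_external_code([1.0, 2.5]))
    ```

    Substituting a real solver executable for the default `command`, and real input/output parsing for the copy step, recovers the GoldSim-to-external-code round trip.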

  15. Parafermion stabilizer codes

    NASA Astrophysics Data System (ADS)

    Gungordu, Utkan; Nepal, Rabindra; Kovalev, Alexey

    2015-03-01

    We define and study parafermion stabilizer codes [Phys. Rev. A 90, 042326 (2014)] which can be viewed as generalizations of Kitaev's one dimensional model of unpaired Majorana fermions. Parafermion stabilizer codes can protect against low-weight errors acting on a small subset of parafermion modes in analogy to qudit stabilizer codes. Examples of several smallest parafermion stabilizer codes are given. Our results show that parafermions can achieve a better encoding rate than Majorana fermions. A locality preserving embedding of qudit operators into parafermion operators is established which allows one to map known qudit stabilizer codes to parafermion codes. We also present a local 2D parafermion construction that combines topological protection of Kitaev's toric code with additional protection relying on parity conservation. This work was supported in part by the NSF under Grants No. Phy-1415600 and No. NSF-EPSCoR 1004094.

  16. Applications of Coding in Network Communications

    ERIC Educational Resources Information Center

    Chang, Christopher SungWook

    2012-01-01

    This thesis uses the tool of network coding to investigate fast peer-to-peer file distribution, anonymous communication, robust network construction under uncertainty, and prioritized transmission. In a peer-to-peer file distribution system, we use a linear optimization approach to show that the network coding framework significantly simplifies…

  17. Source Code Plagiarism--A Student Perspective

    ERIC Educational Resources Information Center

    Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.

    2011-01-01

    This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…

  18. Independent NOAA considered

    NASA Astrophysics Data System (ADS)

    Richman, Barbara T.

    A proposal to pull the National Oceanic and Atmospheric Administration (NOAA) out of the Department of Commerce and make it an independent agency was the subject of a recent congressional hearing. Supporters within the science community and in Congress said that an independent NOAA will benefit by being more visible and by not being tied to a cabinet-level department whose main concerns lie elsewhere. The proposal's critics, however, cautioned that making NOAA independent could make it even more vulnerable to the budget axe and would sever the agency's direct access to the President. The separation of NOAA from Commerce was contained in a June 1 proposal by President Ronald Reagan that also called for all federal trade functions under the Department of Commerce to be reorganized into a new Department of International Trade and Industry (DITI).

  19. Independent technical review, handbook

    SciTech Connect

    Not Available

    1994-02-01

    Purpose: Provide an independent engineering review of the major projects being funded by the Department of Energy, Office of Environmental Restoration and Waste Management. The independent engineering review will address whether the engineering practice is sufficiently developed for a major project to be executed without significant technical problems. The independent review will focus on questions related to: (1) Adequacy of development of the technical base of understanding; (2) Status of development and availability of technology among the various alternatives; (3) Status and availability of the industrial infrastructure to support project design, equipment fabrication, facility construction, and process and program/project operation; (4) Adequacy of the design effort to provide a sound foundation to support execution of the project; (5) Ability of the organization to fully integrate the system, and direct, manage, and control the execution of a complex major project.

  20. Astronomy education and the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Nemiroff, Robert J.

    2016-01-01

    The Astrophysics Source Code Library (ASCL) is an online registry of source codes used in refereed astrophysics research. It currently lists nearly 1,200 codes and covers all aspects of computational astrophysics. How can this resource be of use to educators and to the graduate students they mentor? The ASCL serves as a discovery tool for codes that can be used for one's own research. Graduate students can also investigate existing codes to see how common astronomical problems are approached numerically in practice, and use these codes as benchmarks for their own solutions to these problems. Further, they can deepen their knowledge of software practices and techniques through examination of others' codes.

  1. On the role of code comparisons in verification and validation.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2003-08-01

    This report presents a perspective on the role of code comparison activities in verification and validation. We formally define the act of code comparison as the Code Comparison Principle (CCP) and investigate its application in both verification and validation. One of our primary conclusions is that the use of code comparisons for validation is improper and dangerous. We also conclude that while code comparisons may be argued to provide a beneficial component in code verification activities, there are higher quality code verification tasks that should take precedence. Finally, we provide a process for application of the CCP that we believe is minimal for achieving benefit in verification processes.

  2. In Vivo Imaging Reveals Composite Coding for Diagonal Motion in the Drosophila Visual System

    PubMed Central

    Zhou, Wei; Chang, Jin

    2016-01-01

    Understanding information coding is important for resolving the functions of visual neural circuits. The motion vision system is a classic model for studying information coding as it contains a concise and complete information-processing circuit. In Drosophila, the axon terminals of motion-detection neurons (T4 and T5) project to the lobula plate, which comprises four regions that respond to the four cardinal directions of motion. The lobula plate thus represents a topographic map on a transverse plane. This enables us to study the coding of diagonal motion by investigating its response pattern. By using in vivo two-photon calcium imaging, we found that the axon terminals of T4 and T5 cells in the lobula plate were activated during diagonal motion. Further experiments showed that, unlike the responses to the cardinal directions of motion, the response to diagonal motion is distributed over two regions: a diagonal-motion-selective response region and a non-selective response region, which overlap with the response regions of the two vector-correlated cardinal directions of motion. Interestingly, the sizes of the non-selective response regions are linearly correlated with the angle of the diagonal motion. These results revealed that the Drosophila visual system employs a composite coding for diagonal motion that includes both independent coding and vector decomposition coding. PMID:27695103

  3. Molecular cloning of canine co-chaperone small glutamine-rich tetratricopeptide repeat-containing protein α (SGTA) and investigation of its ability to suppress androgen receptor signalling in androgen-independent prostate cancer.

    PubMed

    Kato, Yuiko; Ochiai, Kazuhiko; Michishita, Masaki; Azakami, Daigo; Nakahira, Rei; Morimatsu, Masami; Ishiguro-Oonuma, Toshina; Yoshikawa, Yasunaga; Kobayashi, Masato; Bonkobara, Makoto; Kobayashi, Masanori; Takahashi, Kimimasa; Watanabe, Masami; Omi, Toshinori

    2015-11-01

    Although the morbidity of canine prostate cancer is low, the majority of cases present with resistance to androgen therapy and poor clinical outcomes. These pathological conditions are similar to the signs of the terminal stage of human androgen-independent prostate cancer. The co-chaperone small glutamine-rich tetratricopeptide repeat-containing protein α (SGTA) is known to be overexpressed in human androgen-independent prostate cancer. However, there is little information about the structure and function of canine SGTA. In this study, canine SGTA was cloned and analysed for its ability to suppress androgen receptor signalling. The full-length open reading frame (ORF) of the canine SGTA gene was amplified by RT-PCR using primers designed from canine-expressed sequence tags that were homologous to human SGTA. The canine SGTA ORF has high homology with the corresponding human (89%) and mouse (81%) sequences. SGTA dimerisation region and tetratricopeptide repeat (TPR) domains are conserved across the three species. The ability of canine SGTA to undergo homodimerisation was demonstrated by a mammalian two-hybrid system and a pull-down assay. The negative impact of canine SGTA on androgen receptor (AR) signalling was demonstrated using a reporter assay in androgen-independent human prostate cancer cell lines. Pathological analysis showed overexpression of SGTA in canine prostate cancer, but not in hyperplasia. A reporter assay in prostate cells demonstrated suppression of AR signalling by canine SGTA. Altogether, these results suggest that canine SGTA may play an important role in the acquisition of androgen independence by canine prostate cancer cells.

  4. Non-Coding RNAs in Cardiac Aging.

    PubMed

    Wang, Hui; Bei, Yihua; Shi, Jing; Xiao, Junjie; Kong, Xiangqing

    2015-01-01

    Aging has a remarkable impact on the function of the heart, and is independently associated with increased risk for cardiovascular diseases. Cardiac aging is an intrinsic physiological process that results in impaired cardiac function, along with numerous cellular and molecular changes. Non-coding RNAs include small transcripts, such as microRNAs, and a wide range of long non-coding RNAs (lncRNAs). Emerging evidence has revealed that non-coding RNAs act as powerful and dynamic modifiers of cardiac aging. This review aims to provide a general overview of non-coding RNAs implicated in cardiac aging, and the underlying mechanisms involved in maintaining homeostasis and retarding aging.

  5. Independence and Survival.

    ERIC Educational Resources Information Center

    James, H. Thomas

    Independent schools that are of viable size, well managed, and strategically located to meet competition will survive and prosper past the current financial crisis. We live in a complex technological society with insatiable demands for knowledgeable people to keep it running. The future will be marked by the orderly selection of qualified people,…

  6. Independence, Disengagement, and Discipline

    ERIC Educational Resources Information Center

    Rubin, Ron

    2012-01-01

    School disengagement is linked to a lack of opportunities for students to fulfill their needs for independence and self-determination. Young people have little say about what, when, where, and how they will learn, the criteria used to assess their success, and the content of school and classroom rules. Traditional behavior management discourages…

  7. Caring about Independent Lives

    ERIC Educational Resources Information Center

    Christensen, Karen

    2010-01-01

    With the rhetoric of independence, new cash for care systems were introduced in many developed welfare states at the end of the 20th century. These systems allow local authorities to pay people who are eligible for community care services directly, to enable them to employ their own careworkers. Despite the obvious importance of the careworker's…

  8. Postcard from Independence, Mo.

    ERIC Educational Resources Information Center

    Archer, Jeff

    2004-01-01

    This article reports results showing that the Independence, Missouri school district failed to meet almost every one of its improvement goals under the No Child Left Behind Act. The state accreditation system stresses improvement over past scores, while the federal law demands specified amounts of annual progress toward the ultimate goal of 100…

  9. Independent School Governance.

    ERIC Educational Resources Information Center

    Beavis, Allan K.

    Findings of a study that examined the role of the governing body in the independent school's self-renewing processes are presented in this paper. From the holistic paradigm, the school is viewed as a self-renewing system that is able to maintain its identity despite environmental changes through existing structures that define and create…

  10. cncRNAs: Bi-functional RNAs with protein coding and non-coding functions

    PubMed Central

    Kumari, Pooja; Sampath, Karuna

    2015-01-01

    For many decades, the major function of mRNA was thought to be to provide protein-coding information embedded in the genome. The advent of high-throughput sequencing has led to the discovery of pervasive transcription of eukaryotic genomes and opened the world of RNA-mediated gene regulation. Many regulatory RNAs have been found to be incapable of protein coding and are hence termed as non-coding RNAs (ncRNAs). However, studies in recent years have shown that several previously annotated non-coding RNAs have the potential to encode proteins, and conversely, some coding RNAs have regulatory functions independent of the protein they encode. Such bi-functional RNAs, with both protein coding and non-coding functions, which we term as ‘cncRNAs’, have emerged as new players in cellular systems. Here, we describe the functions of some cncRNAs identified from bacteria to humans. Because the functions of many RNAs across genomes remain unclear, we propose that RNAs be classified as coding, non-coding or both only after careful analysis of their functions. PMID:26498036

  11. Industrial Code Development

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1991-01-01

    The industrial codes will consist of modules of 2-D and simplified 2-D or 1-D codes, intended for expeditious parametric studies, analysis, and design of a wide variety of seals. Integration into a unified system is accomplished by the industrial Knowledge Based System (KBS), which will also provide user friendly interaction, contact sensitive and hypertext help, design guidance, and an expandable database. The types of analysis to be included with the industrial codes are interfacial performance (leakage, load, stiffness, friction losses, etc.), thermoelastic distortions, and dynamic response to rotor excursions. The first three codes to be completed, which are presently being incorporated into the KBS, include the incompressible cylindrical code, ICYL, and the compressible cylindrical code, GCYL.

  12. Updating the Read Codes

    PubMed Central

    Robinson, David; Comp, Dip; Schulz, Erich; Brown, Philip; Price, Colin

    1997-01-01

    Abstract The Read Codes are a hierarchically-arranged controlled clinical vocabulary introduced in the early 1980s and now consisting of three maintained versions of differing complexity. The code sets are dynamic, and are updated quarterly in response to requests from users including clinicians in both primary and secondary care, software suppliers, and advice from a network of specialist healthcare professionals. The codes' continual evolution of content, both across and within versions, highlights tensions between different users and uses of coded clinical data. Internal processes, external interactions and new structural features implemented by the NHS Centre for Coding and Classification (NHSCCC) for user interactive maintenance of the Read Codes are described, and over 2000 items of user feedback episodes received over a 15-month period are analysed. PMID:9391934

  13. Mechanical code comparator

    DOEpatents

    Peter, Frank J.; Dalton, Larry J.; Plummer, David W.

    2002-01-01

    A new class of mechanical code comparators is described which have broad potential for application in safety, surety, and security applications. These devices can be implemented as micro-scale electromechanical systems that isolate a secure or otherwise controlled device until an access code is entered. This access code is converted into a series of mechanical inputs to the mechanical code comparator, which compares the access code to a pre-input combination, entered previously into the mechanical code comparator by an operator at the system security control point. These devices provide extremely high levels of robust security. Being totally mechanical in operation, an access control system properly based on such devices cannot be circumvented by software attack alone.

  14. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  15. Industrial Computer Codes

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1996-01-01

    This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), KTK (knife to knife) Labyrinth Seal Code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia, but maintains the narrow groove theory. KTK labyrinth seal code handles straight or stepped seals. And DYSEAL provides dynamics for the seal geometry.

  16. Doubled Color Codes

    NASA Astrophysics Data System (ADS)

    Bravyi, Sergey

    Combining protection from noise and computational universality is one of the biggest challenges in fault-tolerant quantum computing. Topological stabilizer codes such as the 2D surface code can tolerate a high level of noise but implementing logical gates, especially non-Clifford ones, requires a prohibitively large overhead due to the need for state distillation. In this talk I will describe a new family of 2D quantum error correcting codes that enable a transversal implementation of all logical gates required for universal quantum computing. Transversal logical gates (TLG) are encoded operations that can be realized by applying some single-qubit rotation to each physical qubit. TLG are highly desirable since they introduce no overhead and do not spread errors. It has been known before that a quantum code can have only a finite number of TLGs, which rules out computational universality. Our scheme circumvents this no-go result by combining TLGs of two different quantum codes using the gauge-fixing method pioneered by Paetznick and Reichardt. The first code, closely related to the 2D color code, enables a transversal implementation of all single-qubit Clifford gates such as the Hadamard gate and the π / 2 phase shift. The second code that we call a doubled color code provides a transversal T-gate, where T is the π / 4 phase shift. The Clifford+T gate set is known to be computationally universal. The two codes can be laid out on the honeycomb lattice with two qubits per site such that the code conversion requires parity measurements for six-qubit Pauli operators supported on faces of the lattice. I will also describe numerical simulations of logical Clifford+T circuits encoded by the distance-3 doubled color code. Based on a joint work with Andrew Cross.

  17. FAA Smoke Transport Code

    SciTech Connect

    Domino, Stefan; Luketa-Hanlin, Anay; Gallegos, Carlos

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  18. Bar Code Labels

    NASA Technical Reports Server (NTRS)

    1988-01-01

    American Bar Codes, Inc. developed special bar code labels for inventory control of space shuttle parts and other space system components. ABC labels are made in a company-developed anodized aluminum process and consecutively marked with bar code symbology and human readable numbers. They offer extreme abrasion resistance and indefinite resistance to ultraviolet radiation, capable of withstanding 700 degree temperatures without deterioration and up to 1400 degrees with special designs. They offer high resistance to salt spray, cleaning fluids and mild acids. ABC is now producing these bar code labels commercially for industrial customers who also need labels to resist harsh environments.

  19. Hardware independence checkout software

    NASA Technical Reports Server (NTRS)

    Cameron, Barry W.; Helbig, H. R.

    1990-01-01

    ACSI has developed a program utilizing CLIPS to assess compliance with various programming standards. Essentially, the program parses C code to extract the names of all function calls. These are asserted as CLIPS facts which also include information about line numbers, source file names, and called functions. Rules have been devised to identify called functions that have not been defined in any of the parsed source. These are compared against lists of standards (represented as facts) using rules that check intersections and/or unions of these lists. By piping the output into other processes, the source is appropriately commented by generating and executing parsed scripts.
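    The core check described (extract called-function names from C source, find the ones not defined in the parsed source, and intersect them with standard lists) can be sketched outside CLIPS. The regex, keyword list, and sample lists below are illustrative assumptions, not ACSI's actual rules:

    ```python
    import re

    # Crude extraction of function-call names from C source: an identifier
    # followed by '(', excluding control-flow keywords. A real parser would
    # also skip string literals, comments, and declarations.
    CALL_RE = re.compile(r"\b([A-Za-z_]\w*)\s*\(")
    KEYWORDS = {"if", "for", "while", "switch", "return", "sizeof"}

    def called_functions(c_source):
        """Names that appear in call position in the given C source."""
        return set(CALL_RE.findall(c_source)) - KEYWORDS

    def undefined_calls(c_source, defined, standard):
        """Split calls with no definition in the parsed source into those
        absent from and those present in a standard's function list
        (cf. the intersection/union checks in the rules)."""
        calls = called_functions(c_source) - set(defined)
        return calls - set(standard), calls & set(standard)

    src = 'int main(void) { printf("hi"); frobnicate(3); return 0; }'
    nonstandard, standard = undefined_calls(
        src, defined={"main"}, standard={"printf", "malloc"})
    print("non-standard:", nonstandard)  # flags frobnicate
    ```

    Asserting each extracted name as a fact and letting rules perform the set operations, as the CLIPS version does, separates the extraction step from the compliance policy.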

  20. Agent independent task planning

    NASA Technical Reports Server (NTRS)

    Davis, William S.

    1990-01-01

    Agent-Independent Planning is a technique that allows the construction of activity plans without regard to the agent that will perform them. Once generated, a plan is then validated and translated into instructions for a particular agent, whether a robot, crewmember, or software-based control system. Because Space Station Freedom (SSF) is planned for orbital operations for approximately thirty years, it will almost certainly experience numerous enhancements and upgrades, including upgrades in robotic manipulators. Agent-Independent Planning provides the capability to construct plans for SSF operations, independent of specific robotic systems, by combining techniques of object oriented modeling, nonlinear planning and temporal logic. Since a plan is validated using the physical and functional models of a particular agent, new robotic systems can be developed and integrated with existing operations in a robust manner. This technique also provides the capability to generate plans for crewmembers with varying skill levels, and later apply these same plans to more sophisticated robotic manipulators made available by evolutions in technology.

  1. International exploration by independents

    SciTech Connect

    Bertagne, R.G.

    1991-03-01

    Recent industry trends indicate that the smaller US independents are looking at foreign exploration opportunities as one of the alternatives for growth in the new age of exploration. It is usually accepted that foreign finding costs per barrel are substantially lower than domestic because of the large reserve potential of international plays. Getting involved overseas requires, however, an adaptation to different cultural, financial, legal, operational, and political conditions. Generally, foreign exploration proceeds at a slower pace than domestic because concessions are granted by the government, or are explored in partnership with the national oil company. First, a mid- to long-term strategy, tailored to the goals and the financial capabilities of the company, must be prepared; it must be followed by an ongoing evaluation of quality prospects in various sedimentary basins, and by careful planning and conduct of the operations. Successful overseas exploration also requires the presence on the team of a minimum number of explorationists and engineers thoroughly familiar with the various exploratory and operational aspects of foreign work, who have had a considerable amount of on-site experience in various geographical and climatic environments. Independents that are best suited for foreign expansion are those that have been financially successful domestically and have a good discovery track record. When properly approached, foreign exploration is well within the reach of smaller US independents and presents essentially no greater risk than domestic exploration; the reward, however, can be much larger and can catapult the company into the big leagues.

  2. International exploration by independents

    SciTech Connect

    Bertragne, R.G.

    1992-04-01

    Recent industry trends indicate that the smaller U.S. independents are looking at foreign exploration opportunities as one of the alternatives for growth in the new age of exploration. Foreign finding costs per barrel usually are accepted to be substantially lower than domestic costs because of the large reserve potential of international plays. To get involved in overseas exploration, however, requires the explorationist to adapt to different cultural, financial, legal, operational, and political conditions. Generally, foreign exploration proceeds at a slower pace than domestic exploration because concessions are granted by a country's government, or are explored in partnership with a national oil company. First, the explorationist must prepare a mid- to long-term strategy, tailored to the goals and the financial capabilities of the company; next, is an ongoing evaluation of quality prospects in various sedimentary basins, and careful planning and conduct of the operations. To successfully explore overseas also requires the presence of a minimum number of explorationists and engineers thoroughly familiar with the various exploratory and operational aspects of foreign work. Ideally, these team members will have had a considerable amount of on-site experience in various countries and climates. Independents best suited for foreign expansion are those who have been financially successful in domestic exploration. When properly approached, foreign exploration is well within the reach of smaller U.S. independents, and presents essentially no greater risk than domestic exploration; however, the reward can be much larger and can catapult the company into the 'big leagues.'

  3. Independence and symbolic independence of nonstationary heartbeat series during atrial fibrillation

    NASA Astrophysics Data System (ADS)

    Cammarota, Camillo; Rogora, Enrico

    2005-08-01

    Heartbeat intervals during atrial fibrillation are commonly believed to form a series of almost independent variables. The series extracted from 24 h Holter recordings show a nonstationary behavior. Because of nonstationarity it is difficult to give a quantitative measure of independence. In this paper, we use and compare two methods for this. The first is a classical method which models a nonstationary series using a linear Gaussian state space model. In this framework, the independence is tested on the stationary sequence of the residuals. The second method codes data into permutations and tests the uniformity of their distribution. This test assumes as null hypothesis a weaker form of independence which we call symbolic independence. We discuss some advantages of symbolic independence in the context of heartbeat series. We analyze the time series of heartbeat intervals from 24 h Holter recordings of nine subjects with chronic atrial fibrillation and find that the detrended series is a zero or one memory process for 83% of regular segments and is symbolically independent for 93% of segments.
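    The second method is straightforward to sketch. The following is a minimal illustration (not the authors' exact procedure): a series is coded into ordinal permutation patterns of length 3, and a chi-square statistic measures how far the pattern distribution departs from the uniform distribution implied by symbolic independence.

```python
from collections import Counter
from itertools import permutations

def pattern(window):
    """Rank-order pattern of a window, e.g. (0.1, 0.5, 0.3) -> (0, 2, 1)."""
    return tuple(sorted(range(len(window)), key=window.__getitem__))

def pattern_counts(series, m=3):
    """Count ordinal patterns over all sliding windows of length m."""
    wins = (tuple(series[i:i + m]) for i in range(len(series) - m + 1))
    return Counter(pattern(w) for w in wins)

def chi_square_uniformity(series, m=3):
    """Chi-square statistic against the uniform distribution on m! patterns."""
    counts = pattern_counts(series, m)
    n = sum(counts.values())
    expected = n / len(list(permutations(range(m))))
    return sum((counts.get(p, 0) - expected) ** 2 / expected
               for p in permutations(range(m)))
```

    A strongly trending series concentrates on one pattern and yields a large statistic, while a symbolically independent series spreads its mass nearly uniformly over all six patterns.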

  4. Bit-wise arithmetic coding for data compression

    NASA Technical Reports Server (NTRS)

    Kiely, A. B.

    1994-01-01

    This article examines the problem of compressing a uniformly quantized independent and identically distributed (IID) source. We present a new compression technique, bit-wise arithmetic coding, that assigns fixed-length codewords to the quantizer output and uses arithmetic coding to compress the codewords, treating the codeword bits as independent. We examine the performance of this method and evaluate the overhead required when used block-adaptively. Simulation results are presented for Gaussian and Laplacian sources. This new technique could be used as the entropy coder in a transform or subband coding system.
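    The rate such a scheme targets is easy to sketch. As an illustration of the idea (not the article's actual coder): when each codeword bit position is modeled independently, an ideal arithmetic coder spends about the binary entropy of each position per symbol, so the achievable rate is the sum of per-position bit entropies.

```python
from math import log2

def to_bits(symbol, width):
    """Fixed-length binary codeword for a quantizer output, MSB first."""
    return [(symbol >> (width - 1 - i)) & 1 for i in range(width)]

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bitwise_rate(symbols, width):
    """Ideal bits/symbol when each codeword bit is coded independently."""
    rows = [to_bits(s, width) for s in symbols]
    n = len(rows)
    return sum(binary_entropy(sum(r[pos] for r in rows) / n)
               for pos in range(width))
```

    For example, a 2-bit quantizer whose outputs are only 0 and 1 (equally likely) has one constant bit position and one maximally uncertain one, giving an ideal rate of 1.0 bit per symbol instead of the raw 2.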

  5. Research on universal combinatorial coding.

    PubMed

    Lu, Jun; Zhang, Zhuo; Mo, Juan

    2014-01-01

    The concept of universal combinatorial coding is proposed. Relations exist, to varying degrees, among many coding methods, which suggests that a universal coding method objectively exists and can serve as a bridge connecting many of them. Universal combinatorial coding is lossless and is based on combinatorics theory; its combinational and exhaustive properties relate it closely to existing coding methods. Universal combinatorial coding does not depend on the probability statistics of the information source, and it has characteristics spanning all three branches of coding. The paper analyzes the relationship between universal combinatorial coding and a variety of coding methods and investigates several application technologies of this coding method. In addition, the efficiency of universal combinatorial coding is analyzed theoretically. The multiple characteristics and applications of universal combinatorial coding are unique among existing coding methods. Universal combinatorial coding has both theoretical research and practical application value. PMID:24772019
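    The abstract does not give the construction itself; as a concrete example of a lossless, combinatorics-based code (classical enumerative coding, shown for illustration and not necessarily the authors' scheme), a binary sequence of known weight can be represented by its lexicographic rank among all sequences of that weight:

```python
from math import comb

def enum_encode(bits):
    """Lexicographic rank of `bits` among sequences of the same length/weight."""
    rank, k = 0, sum(bits)
    n = len(bits)
    for i, b in enumerate(bits):
        if b == 1:
            # skip every sequence that has a 0 here with the same weight left
            rank += comb(n - i - 1, k)
            k -= 1
    return rank

def enum_decode(rank, n, k):
    """Invert enum_encode given the length n and the number of ones k."""
    bits = []
    for i in range(n):
        c = comb(n - i - 1, k)
        if rank >= c:
            bits.append(1)
            rank -= c
            k -= 1
        else:
            bits.append(0)
    return bits
```

    The rank fits in about log2(C(n, k)) bits, which is the entropy-optimal cost once the weight is known; only combinatorial counting, not source statistics, is used.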

  7. Utilizing GPUs to Accelerate Turbomachinery CFD Codes

    NASA Technical Reports Server (NTRS)

    MacCalla, Weylin; Kulkarni, Sameer

    2016-01-01

    GPU computing has established itself as a way to accelerate parallel codes in the high performance computing world. This work focuses on speeding up APNASA, a legacy CFD code used at NASA Glenn Research Center, while also drawing conclusions about the nature of GPU computing and the requirements to make GPGPU worthwhile on legacy codes. Rewriting and restructuring of the source code was avoided to limit the introduction of new bugs. The code was profiled and investigated for parallelization potential, then OpenACC directives were used to indicate parallel parts of the code. The use of OpenACC directives was not able to reduce the runtime of APNASA on either the NVIDIA Tesla discrete graphics card, or the AMD accelerated processing unit. Additionally, it was found that in order to justify the use of GPGPU, the amount of parallel work being done within a kernel would have to greatly exceed the work being done by any one portion of the APNASA code. It was determined that in order for an application like APNASA to be accelerated on the GPU, it should not be modular in nature, and the parallel portions of the code must contain a large portion of the code's computation time.

  8. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    PubMed Central

    Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance; however, it also brings extremely high computational complexity. This paper presents innovative work on improving the coding tree to further reduce encoding time. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. First, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content change (CC). Second, a CU coding tree probability model is proposed for modeling and predicting CU distribution. Finally, a CU coding tree probability update is proposed, aiming to address the probabilistic model distortion caused by CC. Experimental results show that the proposed low complexity CU coding tree mechanism significantly reduces encoding time by 27% for lossy coding and 42% for visually lossless coding and lossless coding. The proposed mechanism thus improves coding performance under various application conditions. PMID:26999741
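    A deliberately simplified sketch of the underlying idea (all names and numbers hypothetical; not the paper's actual model): an online probability estimate of CU splitting decides when the exhaustive split search can be skipped, and is re-initialized when content change has distorted the learned statistics.

```python
class CUSplitModel:
    """Toy online estimate of the probability that a CU will be split."""

    def __init__(self, alpha=0.05, p0=0.5):
        self.alpha = alpha    # exponential update rate
        self.p_split = p0     # current split-probability estimate

    def update(self, was_split):
        """Move the estimate toward the latest observed split decision."""
        self.p_split += self.alpha * ((1.0 if was_split else 0.0) - self.p_split)

    def should_try_split(self, threshold=0.1):
        """Run the costly split search only when a split is plausible."""
        return self.p_split >= threshold

    def reset_on_scene_change(self, p0=0.5):
        # content change distorts the learned statistics; re-initialize
        self.p_split = p0
```

    After a long run of unsplit CUs, the model skips the split search; a detected content change resets it so the distorted statistics are not reused.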

  9. Code of Ethics

    ERIC Educational Resources Information Center

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  10. Lichenase and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2000-08-15

    The present invention provides a fungal lichenase, i.e., an endo-1,3-1,4-.beta.-D-glucanohydrolase, its coding sequence, recombinant DNA molecules comprising the lichenase coding sequences, recombinant host cells and methods for producing same. The present lichenase is from Orpinomyces PC-2.

  11. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts. 3. Development of a code generator for performance prediction. 4. Automated partitioning. 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  12. Azerbaijani-Russian Code-Switching and Code-Mixing: Form, Function, and Identity

    ERIC Educational Resources Information Center

    Zuercher, Kenneth

    2009-01-01

    From incorporation into the Russian Empire in 1828, through the collapse of the U.S.S.R. in 1991 governmental language policies and other socio/political forces influenced the Turkic population of the Republic of Azerbaijan to speak Russian. Even with changes since independence Russian use--including various kinds of code-switching and…

  13. P-code enhanced method for processing encrypted GPS signals without knowledge of the encryption code

    NASA Technical Reports Server (NTRS)

    Meehan, Thomas K. (Inventor); Thomas, Jr., Jess Brooks (Inventor); Young, Lawrence E. (Inventor)

    2000-01-01

    In the preferred embodiment, an encrypted GPS signal is down-converted from RF to baseband to generate two quadrature components for each RF signal (L1 and L2). Separately and independently for each RF signal and each quadrature component, the four down-converted signals are counter-rotated with a respective model phase, correlated with a respective model P code, and then successively summed and dumped over presum intervals substantially coincident with chips of the respective encryption code. Without knowledge of the encryption-code signs, the effect of encryption-code sign flips is then substantially reduced by selected combinations of the resulting presums between associated quadrature components for each RF signal, separately and independently for the L1 and L2 signals. The resulting combined presums are then summed and dumped over longer intervals and further processed to extract amplitude, phase and delay for each RF signal. Precision of the resulting phase and delay values is approximately four times better than that obtained from straight cross-correlation of L1 and L2. This improved method provides the following options: separate and independent tracking of the L1-Y and L2-Y channels; separate and independent measurement of amplitude, phase and delay for the L1-Y channel; and removal of the half-cycle ambiguity in L1-Y and L2-Y carrier phase.
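    The presum-and-combine idea can be illustrated schematically (a much simplified sketch with synthetic data, not the patented signal processing): each channel is counter-rotated by a model phase and correlated with a model P code over one encryption chip, and because the unknown encryption sign w = +/-1 is common to L1 and L2, it cancels when the two presums are multiplied.

```python
import cmath

def presum(samples, model_phase, model_pcode):
    """Counter-rotate, wipe the model P code, and sum over one W chip."""
    return sum(s * cmath.exp(-1j * ph) * c
               for s, ph, c in zip(samples, model_phase, model_pcode))

def combine(presum_l1, presum_l2):
    """Product whose sign no longer depends on the unknown encryption bit."""
    return presum_l1 * presum_l2.conjugate()

# simulate one encryption chip on both channels with the same unknown sign w
w, n = -1, 20
phase = [0.3 * k for k in range(n)]
code = [1 if k % 3 else -1 for k in range(n)]
l1 = [w * cmath.exp(1j * ph) * c for ph, c in zip(phase, code)]
l2 = [w * cmath.exp(1j * ph) * c for ph, c in zip(phase, code)]
p1, p2 = presum(l1, phase, code), presum(l2, phase, code)
```

    Each presum carries the unknown factor w, but their product is w-squared times a positive correlation peak, so the combination no longer depends on the encryption sign.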

  14. Combustion chamber analysis code

    NASA Astrophysics Data System (ADS)

    Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.

    1993-05-01

    A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.

  15. Combustion chamber analysis code

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.

    1993-01-01

    A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.

  16. Energy Conservation Code Decoded

    SciTech Connect

    Cole, Pam C.; Taylor, Zachary T.

    2006-09-01

    Designing an energy-efficient, affordable, and comfortable home is a lot easier thanks to a slimmer, easier-to-read booklet, the 2006 International Energy Conservation Code (IECC), published in March 2006. States, counties, and cities have begun reviewing the new code as a potential upgrade to their existing codes. Maintained under the public consensus process of the International Code Council, the IECC is designed to do just what its title says: promote the design and construction of energy-efficient homes and commercial buildings. Homes in this case means traditional single-family homes, duplexes, condominiums, and apartment buildings having three or fewer stories. The U.S. Department of Energy, which played a key role in proposing the changes that resulted in the new code, is offering a free training course that covers the residential provisions of the 2006 IECC.

  17. Evolving genetic code

    PubMed Central

    OHAMA, Takeshi; INAGAKI, Yuji; BESSHO, Yoshitaka; OSAWA, Syozo

    2008-01-01

    In 1985, we reported that a bacterium, Mycoplasma capricolum, used a deviant genetic code, namely UGA, a “universal” stop codon, was read as tryptophan. This finding, together with the deviant nuclear genetic codes in not a few organisms and a number of mitochondria, shows that the genetic code is not universal, and is in a state of evolution. To account for the changes in codon meanings, we proposed the codon capture theory stating that all the code changes are non-disruptive without accompanied changes of amino acid sequences of proteins. Supporting evidence for the theory is presented in this review. A possible evolutionary process from the ancient to the present-day genetic code is also discussed. PMID:18941287

  18. Cary Potter on Independent Education

    ERIC Educational Resources Information Center

    Potter, Cary

    1978-01-01

    Cary Potter was President of the National Association of Independent Schools from 1964-1978. As he leaves NAIS he gives his views on education, on independence, on the independent school, on public responsibility, on choice in a free society, on educational change, and on the need for collective action by independent schools. (Author/RK)

  19. Myth or Truth: Independence Day.

    ERIC Educational Resources Information Center

    Gardner, Traci

    Most Americans think of the Fourth of July as Independence Day, but is it really the day the U.S. declared and celebrated independence? By exploring myths and truths surrounding Independence Day, this lesson asks students to think critically about commonly believed stories regarding the beginning of the Revolutionary War and the Independence Day…

  20. Quantum convolutional codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Yan, Tingsu; Huang, Xinmei; Tang, Yuansheng

    2014-12-01

    In this paper, three families of quantum convolutional codes are constructed. The first one and the second one can be regarded as a generalization of Theorems 3, 4, 7 and 8 [J. Chen, J. Li, F. Yang and Y. Huang, Int. J. Theor. Phys., doi:10.1007/s10773-014-2214-6 (2014)], in the sense that we drop the constraint q ≡ 1 (mod 4). Furthermore, the second one and the third one attain the quantum generalized Singleton bound.

  1. Value of Laboratory Experiments for Code Validations

    SciTech Connect

    Wawersik, W.R.

    1998-12-14

    Numerical codes have become indispensable for designing underground structures and interpreting the behavior of geologic systems. Because of the complexities of geologic systems, however, code calculations often are associated with large quantitative uncertainties. This paper presents three examples to demonstrate the value of laboratory (or bench-scale) experiments for evaluating the predictive capabilities of such codes, with five major conclusions: Laboratory or bench-scale experiments are a very cost-effective, controlled means of evaluating and validating numerical codes, not instead of but before, or at least concurrent with, the implementation of in situ studies. The design of good laboratory validation tests must identify what aspects of a code are to be scrutinized in order to optimize the size, geometry, boundary conditions, and duration of the experiments. The design of good laboratory validation tests must also involve good, and sometimes difficult, numerical analyses and sensitivity studies. Good validation experiments will generate independent data sets to identify the combined effect of constitutive models, model generalizations, material parameters, and numerical algorithms. Successful validations of numerical codes mandate a close collaboration between experimentalists and analysts, drawing from the full gamut of observations, measurements, and mathematical results.

  2. Distributed transform coding via source-splitting

    NASA Astrophysics Data System (ADS)

    Yahampath, Pradeepa

    2012-12-01

    Transform coding (TC) is one of the best known practical methods for quantizing high-dimensional vectors. In this article, a practical approach to distributed TC of jointly Gaussian vectors is presented. This approach, referred to as source-split distributed transform coding (SP-DTC), can be used to easily implement two-terminal transform codes for any given rate-pair. The main idea is to apply source-splitting using orthogonal-transforms, so that only Wyner-Ziv (WZ) quantizers are required for compression of transform coefficients. This approach, however, requires optimizing the bit allocation among dependent sets of WZ quantizers. In order to solve this problem, a low-complexity tree-search algorithm based on analytical models for transform coefficient quantization is developed. A rate-distortion (RD) analysis of SP-DTCs for jointly Gaussian sources is presented, which indicates that these codes can significantly outperform the practical alternative of independent TC of each source, whenever there is a strong correlation between the sources. For practical implementation of SP-DTCs, the idea of using conditional entropy constrained (CEC) quantizers followed by Slepian-Wolf coding is explored. Experimental results obtained with SP-DTC designs based on both CEC scalar quantizers and CEC trellis-coded quantizers demonstrate that actual implementations of SP-DTCs can achieve RD performance close to the analytically predicted limits.

  3. Arithmetic coding as a non-linear dynamical system

    NASA Astrophysics Data System (ADS)

    Nagaraj, Nithin; Vaidya, Prabhakar G.; Bhat, Kishor G.

    2009-04-01

    In order to perform source coding (data compression), we treat messages emitted by independent and identically distributed sources as imprecise measurements (symbolic sequence) of a chaotic, ergodic, Lebesgue measure preserving, non-linear dynamical system known as Generalized Luröth Series (GLS). GLS achieves Shannon's entropy bound and turns out to be a generalization of arithmetic coding, a popular source coding algorithm, used in international compression standards such as JPEG2000 and H.264. We further generalize GLS to piecewise non-linear maps (Skewed-nGLS). We motivate the use of Skewed-nGLS as a framework for joint source coding and encryption.
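    The correspondence can be shown in a few lines: encoding a binary i.i.d. source is interval refinement (arithmetic coding), and decoding is iteration of the skewed binary (GLS) map, whose itinerary reproduces the symbol sequence. A minimal sketch for a binary source with P(0) = p:

```python
def arithmetic_encode(symbols, p):
    """Return a point inside the interval assigned to `symbols` (0 has mass p)."""
    lo, hi = 0.0, 1.0
    for s in symbols:
        mid = lo + p * (hi - lo)
        lo, hi = (lo, mid) if s == 0 else (mid, hi)
    return (lo + hi) / 2

def gls_decode(x, p, n):
    """Read n symbols as the itinerary of x under the skewed binary map
    T(x) = x/p for x < p, (x - p)/(1 - p) otherwise."""
    out = []
    for _ in range(n):
        if x < p:
            out.append(0)
            x = x / p
        else:
            out.append(1)
            x = (x - p) / (1 - p)
    return out
```

    Each decoding step applies exactly the inverse of one encoding refinement, which is the sense in which arithmetic coding is the GLS dynamical system run backwards.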

  4. Parallelizing the XSTAR Photoionization Code

    NASA Astrophysics Data System (ADS)

    Noble, M. S.; Ji, L.; Young, A.; Lee, J. C.

    2009-09-01

    We describe two means by which XSTAR, a code which computes physical conditions and emission spectra of photoionized gases, has been parallelized. The first is pvmxstar, a wrapper which can be used in place of the serial xstar2xspec script to foster concurrent execution of the XSTAR command line application on independent sets of parameters. The second is pmodel, a plugin for the Interactive Spectral Interpretation System (ISIS) which allows arbitrary components of a broad range of astrophysical models to be distributed across processors during fitting and confidence limits calculations, by scientists with little training in parallel programming. Plugging the XSTAR family of analytic models into pmodel enables multiple ionization states (e.g., of a complex absorber/emitter) to be computed simultaneously, alleviating the often prohibitive expense of the traditional serial approach. Initial performance results indicate that these methods substantially enlarge the problem space to which XSTAR may be applied within practical timeframes.
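    The pvmxstar strategy of running XSTAR concurrently on independent parameter sets is, in essence, an embarrassingly parallel map over a parameter grid. A schematic sketch (the stub function and parameter names are hypothetical; the real wrapper launches independent XSTAR processes rather than calling a Python function):

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product

def run_xstar(params):
    """Stub standing in for one independent XSTAR run on one parameter set."""
    density, ionization = params
    return {"density": density, "ionization": ionization,
            "spectrum_id": hash(params) % 1000}

def run_grid(densities, ionizations, workers=4):
    """Farm the whole parameter grid out to a pool of concurrent workers."""
    grid = list(product(densities, ionizations))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(run_xstar, grid))

results = run_grid([1e10, 1e12], [0.1, 1.0, 10.0])
```

    Because each run depends only on its own parameters, no coordination beyond the final collection of results is needed, which is why the serial xstar2xspec loop parallelizes so cleanly.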

  5. Pyramid image codes

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1990-01-01

    All vision systems, both human and machine, transform the spatial image into a coded representation. Particular codes may be optimized for efficiency or to extract useful image features. Researchers explored image codes based on primary visual cortex in man and other primates. Understanding these codes will advance the art in image coding, autonomous vision, and computational human factors. In cortex, imagery is coded by features that vary in size, orientation, and position. Researchers have devised a mathematical model of this transformation, called the Hexagonal oriented Orthogonal quadrature Pyramid (HOP). In a pyramid code, features are segregated by size into layers, with fewer features in the layers devoted to large features. Pyramid schemes provide scale invariance, and are useful for coarse-to-fine searching and for progressive transmission of images. The HOP Pyramid is novel in three respects: (1) it uses a hexagonal pixel lattice, (2) it uses oriented features, and (3) it accurately models most of the prominent aspects of primary visual cortex. The transform uses seven basic features (kernels), which may be regarded as three oriented edges, three oriented bars, and one non-oriented blob. Application of these kernels to non-overlapping seven-pixel neighborhoods yields six oriented, high-pass pyramid layers, and one low-pass (blob) layer.

  6. Report number codes

    SciTech Connect

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute standard Z39.23-1983, Standard Technical Report Number (STRN): Format and Creation. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: the report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.
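    Splitting an STRN into the two parts described above can be sketched as follows (the pattern is a simplification of the Z39.23 format, assuming the sequential number is the trailing digit group; real STRNs can be more varied):

```python
import re

# Report code (issuer/program/document type) followed by a sequential number.
STRN_RE = re.compile(r"^(?P<code>.+?)-?(?P<seq>\d+)$")

def split_strn(strn):
    """Split an STRN into (report code, sequential number)."""
    m = STRN_RE.match(strn)
    if not m:
        raise ValueError(f"not a recognizable STRN: {strn!r}")
    return m.group("code"), m.group("seq")
```

    For example, "LA-UR-93-1234" splits into the report code "LA-UR-93" and the sequential number "1234".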

  7. "Hour of Code": Can It Change Students' Attitudes toward Programming?

    ERIC Educational Resources Information Center

    Du, Jie; Wimmer, Hayden; Rada, Roy

    2016-01-01

    The Hour of Code is a one-hour introduction to computer science organized by Code.org, a non-profit dedicated to expanding participation in computer science. This study investigated the impact of the Hour of Code on students' attitudes towards computer programming and their knowledge of programming. A sample of undergraduate students from two…

  8. Naming problems do not reflect a second independent core deficit in dyslexia: double deficits explored.

    PubMed

    Vaessen, Anniek; Gerretsen, Patty; Blomert, Leo

    2009-06-01

    The double deficit hypothesis states that naming speed problems represent a second core deficit in dyslexia independent from a phonological deficit. The current study investigated the main assumptions of this hypothesis in a large sample of well-diagnosed dyslexics. The three main findings were that (a) naming speed was consistently related only to reading speed; (b) phonological processing speed and naming speed loaded on the same factor, and this factor contributed strongly to reading speed; and (c) although general processing speed was involved in speeded naming of visual items, it did not explain the relationship between naming speed and reading speed. The results do not provide support for the existence of a second independent core naming deficit in dyslexia and indicate that speeded naming tasks are mainly phonological processing speed tasks with an important addition: fast cross-modal matching of visual symbols and phonological codes.

  9. Independent component representations for face recognition

    NASA Astrophysics Data System (ADS)

    Stewart Bartlett, Marian; Lades, Martin H.; Sejnowski, Terrence J.

    1998-07-01

    In a task such as face recognition, much of the important information may be contained in the high-order relationships among the image pixels. A number of face recognition algorithms employ principal component analysis (PCA), which is based on the second-order statistics of the image set, and does not address high-order statistical dependencies such as the relationships among three or more pixels. Independent component analysis (ICA) is a generalization of PCA which separates the high-order moments of the input in addition to the second-order moments. ICA was performed on a set of face images by an unsupervised learning algorithm derived from the principle of optimal information transfer through sigmoidal neurons. The algorithm maximizes the mutual information between the input and the output, which produces statistically independent outputs under certain conditions. ICA was performed on the face images under two different architectures. The first architecture provided a statistically independent basis set for the face images that can be viewed as a set of independent facial features. The second architecture provided a factorial code, in which the probability of any combination of features can be obtained from the product of their individual probabilities. Both ICA representations were superior to representations based on principal components analysis for recognizing faces across sessions and changes in expression.

  10. 3D neutronics codes coupled with thermal-hydraulic system codes for PWR, BWR and VVER reactors

    SciTech Connect

    Langenbuch, S.; Velkov, K.; Lizorkin, M.

    1997-07-01

    This paper describes the objectives of code development for coupling 3D neutronics codes with thermal-hydraulic system codes. The present status of coupling ATHLET with three 3D neutronics codes for VVER and LWR reactors is presented. After describing the basic features of the 3D neutronics codes BIPR-8 from the Kurchatov Institute, DYN3D from Research Center Rossendorf and QUABOX/CUBBOX from GRS, first applications of the coupled codes to different transient and accident scenarios are presented. The need for further investigation is discussed.

  11. Information coding in artificial olfaction multisensor arrays.

    PubMed

    Albert, Keith J; Walt, David R

    2003-08-15

    High-density sensor arrays were prepared with microbead vapor sensors to explore and compare the information coded in sensor response profiles following odor stimulus. The coded information in the sensor-odor response profiles, which is used for odor discrimination purposes, was extracted from the microsensor arrays via two different approaches. In the first approach, the responses from individual microsensors were separated (decoded array) and independently processed. In the second approach, response profiles from all microsensors within the entire array, i.e., the sensor ensemble, were combined to create one response per odor stimulus (nondecoded array). Although the amount of response data is markedly reduced in the second approach, the system shows comparable odor discrimination rates for the two signal extraction methods. The ensemble approach streamlines system resources without decreasing system performance. These signal compression approaches may simulate or parallel information coding in the mammalian olfactory system. PMID:14632130

  12. Frame independent cosmological perturbations

    SciTech Connect

    Prokopec, Tomislav; Weenink, Jan E-mail: j.g.weenink@uu.nl

    2013-09-01

    We compute the third order gauge invariant action for scalar-graviton interactions in the Jordan frame. We demonstrate that the gauge invariant action for scalar and tensor perturbations on one physical hypersurface only differs from that on another physical hypersurface via terms proportional to the equation of motion and boundary terms, such that the evolution of non-Gaussianity may be called unique. Moreover, we demonstrate that the gauge invariant curvature perturbation and graviton on uniform field hypersurfaces in the Jordan frame are equal to their counterparts in the Einstein frame. These frame independent perturbations are therefore particularly useful in relating results in different frames at the perturbative level. On the other hand, the field perturbation and graviton on uniform curvature hypersurfaces in the Jordan and Einstein frame are non-linearly related, as are their corresponding actions and n-point functions.

  13. Honor Codes and Other Contextual Influences on Academic Integrity: A Replication and Extension to Modified Honor Code Settings.

    ERIC Educational Resources Information Center

    McCabe, Donald L.; Trevino, Linda Klebe; Butterfield, Kenneth D.

    2002-01-01

    Investigated the influence of modified honor codes, an alternative to traditional codes that is gaining popularity on larger campuses. Also tested the model of student academic dishonesty previously suggested by McCabe and Trevino. Found that modified honor codes are associated with lower levels of student dishonesty and that the McCabe Trevino…

  14. Parallel CARLOS-3D code development

    SciTech Connect

    Putnam, J.M.; Kotulski, J.D.

    1996-02-01

    CARLOS-3D is a three-dimensional scattering code which was developed under the sponsorship of the Electromagnetic Code Consortium, and is currently used by over 80 aerospace companies and government agencies. The code has been extensively validated and runs on both serial workstations and parallel supercomputers such as the Intel Paragon. CARLOS-3D is a three-dimensional surface integral equation scattering code based on a Galerkin method of moments formulation employing Rao-Wilton-Glisson roof-top basis functions for triangular faceted surfaces. Fully arbitrary 3D geometries composed of multiple conducting and homogeneous bulk dielectric materials can be modeled. This presentation describes some of the extensions to the CARLOS-3D code, and how the operator structure of the code facilitated these improvements. Body of revolution (BOR) and two-dimensional geometries were incorporated by simply including new input routines, and the appropriate Galerkin matrix operator routines. Some additional modifications were required in the combined field integral equation matrix generation routine due to the symmetric nature of the BOR and 2D operators. Quadrilateral patched surfaces with linear roof-top basis functions were also implemented in the same manner. Quadrilateral facets and triangular facets can be used in combination to more efficiently model geometries with both large smooth surfaces and surfaces with fine detail such as gaps and cracks. Since the parallel implementation in CARLOS-3D is at a high level, these changes were independent of the computer platform being used. This approach minimizes code maintenance while providing capabilities with little additional effort. Results are presented showing the performance and accuracy of the code for some large scattering problems. Comparisons between triangular faceted and quadrilateral faceted geometry representations will be shown for some complex scatterers.

  15. Compressible Astrophysics Simulation Code

    SciTech Connect

    Howell, L.; Singer, M.

    2007-07-18

    This is an astrophysics simulation code involving a radiation diffusion module developed at LLNL coupled to compressible hydrodynamics and adaptive mesh infrastructure developed at LBNL. One intended application is to neutrino diffusion in core collapse supernovae.

  16. Optimal source codes for geometrically distributed integer alphabets

    NASA Technical Reports Server (NTRS)

    Gallager, R. G.; Van Voorhis, D. C.

    1975-01-01

    An approach is shown for using the Huffman algorithm indirectly to prove the optimality of a code for an infinite alphabet if an estimate concerning the nature of the code can be made. Attention is given to nonnegative integers with a geometric probability assignment. The particular distribution considered arises in run-length coding and in encoding protocol information in data networks. Questions of redundancy of the optimal code are also investigated.
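    As an illustration of the result summarized above: for a geometric distribution P(n) = (1-p)p^n over the nonnegative integers, the optimal prefix code is a Golomb code, with parameter m chosen as the smallest integer satisfying p^m + p^(m+1) <= 1. A minimal sketch of Golomb encoding and decoding (function names are ours, not from the paper):

```python
import math

def golomb_encode(n, m):
    """Golomb-encode nonnegative integer n with parameter m >= 1."""
    q, r = divmod(n, m)
    bits = "1" * q + "0"                  # quotient in unary, 0-terminated
    if m == 1:
        return bits                       # pure unary code: no remainder bits
    b = math.ceil(math.log2(m))
    cutoff = (1 << b) - m                 # truncated binary for the remainder
    if r < cutoff:
        bits += format(r, "b").zfill(b - 1)
    else:
        bits += format(r + cutoff, "b").zfill(b)
    return bits

def golomb_decode(bits, m):
    """Decode a single Golomb codeword back to its integer value."""
    q = 0
    while bits[q] == "1":
        q += 1
    i = q + 1                             # skip the terminating 0
    if m == 1:
        return q
    b = math.ceil(math.log2(m))
    cutoff = (1 << b) - m
    x = int(bits[i:i + b - 1], 2) if b > 1 else 0
    if x < cutoff:
        r = x
    else:
        r = int(bits[i:i + b], 2) - cutoff
    return q * m + r
```

    When m is a power of two this reduces to a Rice code: golomb_encode(9, 4) gives "11001" (unary quotient "110", two remainder bits "01").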

  17. Seals Flow Code Development

    NASA Technical Reports Server (NTRS)

    1991-01-01

    In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.

  18. Applications of numerical codes to space plasma problems

    NASA Technical Reports Server (NTRS)

    Northrop, T. G.; Birmingham, T. J.; Jones, F. C.; Wu, C. S.

    1975-01-01

    Solar wind, earth's bowshock, and magnetospheric convection and substorms were investigated. Topics discussed include computational physics, multifluid codes, ionospheric irregularities, and modeling laser plasmas.

  19. Robust Nonlinear Neural Codes

    NASA Astrophysics Data System (ADS)

    Yang, Qianli; Pitkow, Xaq

    2015-03-01

    Most interesting natural sensory stimuli are encoded in the brain in a form that can only be decoded nonlinearly. But despite being a core function of the brain, nonlinear population codes are rarely studied and poorly understood. Interestingly, the few existing models of nonlinear codes are inconsistent with known architectural features of the brain. In particular, these codes have information content that scales with the size of the cortical population, even if that violates the data processing inequality by exceeding the amount of information entering the sensory system. Here we provide a valid theory of nonlinear population codes by generalizing recent work on information-limiting correlations in linear population codes. Although these generalized, nonlinear information-limiting correlations bound the performance of any decoder, they also make decoding more robust to suboptimal computation, allowing many suboptimal decoders to achieve nearly the same efficiency as an optimal decoder. Although these correlations are extremely difficult to measure directly, particularly for nonlinear codes, we provide a simple, practical test by which one can use choice-related activity in small populations of neurons to determine whether decoding is suboptimal or optimal and limited by correlated noise. We conclude by describing an example computation in the vestibular system where this theory applies. QY and XP were supported by a grant from the McNair Foundation.

  20. KENO-V code

    SciTech Connect

    Cramer, S.N.

    1984-01-01

    The KENO-V code is the current release of the Oak Ridge multigroup Monte Carlo criticality code development. The original KENO, with 16-group Hansen-Roach cross sections and P1 scattering, was one of the first multigroup Monte Carlo codes, and it and its successors have always been a much-used research tool for criticality studies. KENO-V is able to accept large neutron cross section libraries (a 218-group set is distributed with the code) and has a general PN scattering capability. A supergroup feature allows execution of large problems on small computers, but at the expense of increased calculation time and system input/output operations. This supergroup feature is activated automatically by the code in a manner which utilizes as much computer memory as is available. The primary purpose of KENO-V is to calculate the system k-eff, from small bare critical assemblies to large reflected arrays of differing fissile and moderator elements. In this respect KENO-V neither has nor requires the many options and sophisticated biasing techniques of general Monte Carlo codes.

  1. Multi-level bandwidth efficient block modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    1989-01-01

    The multilevel technique is investigated for combining block coding and modulation. There are four parts. In the first part, a formulation is presented for signal sets on which modulation codes are to be constructed. Distance measures on a signal set are defined and their properties are developed. In the second part, a general formulation is presented for multilevel modulation codes in terms of component codes with appropriate Euclidean distances. The distance properties, Euclidean weight distribution and linear structure of multilevel modulation codes are investigated. In the third part, several specific methods for constructing multilevel block modulation codes with interdependency among component codes are proposed. Given a multilevel block modulation code C with no interdependency among the binary component codes, the proposed methods give a multilevel block modulation code C′ which has the same rate as C, a minimum squared Euclidean distance not less than that of code C, a trellis diagram with the same number of states as that of C, and a smaller number of nearest-neighbor codewords than that of C. In the last part, error performance of block modulation codes is analyzed for an AWGN channel based on soft-decision maximum likelihood decoding. Error probabilities of some specific codes are evaluated based on their Euclidean weight distributions and simulation results.

  2. Coded aperture compressive temporal imaging.

    PubMed

    Llull, Patrick; Liao, Xuejun; Yuan, Xin; Yang, Jianbo; Kittle, David; Carin, Lawrence; Sapiro, Guillermo; Brady, David J

    2013-05-01

    We use mechanical translation of a coded aperture for code division multiple access compression of video. We discuss the compressed video's temporal resolution and present experimental results for reconstructions of > 10 frames of temporal data per coded snapshot.
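    The forward model described above can be sketched in a toy form: each temporal frame is multiplied elementwise by a shifted copy of one binary mask (standing in for the translating coded aperture), and the coded frames are summed into a single snapshot. This is a simplified illustration of the measurement model only, with our own function names, not the authors' reconstruction pipeline:

```python
def cacti_snapshot(frames, mask, shifts):
    """Toy coded snapshot: multiply each frame by a horizontally shifted
    copy of a single binary mask, then sum the coded frames into one
    image (the compressed measurement)."""
    H, W = len(frames[0]), len(frames[0][0])
    y = [[0.0] * W for _ in range(H)]
    for x, s in zip(frames, shifts):
        for r in range(H):
            for c in range(W):
                # mask is shifted by s columns (wrapping, for simplicity)
                y[r][c] += x[r][c] * mask[r][(c + s) % W]
    return y
```

    Because each frame sees a different mask shift, temporal information is multiplexed into the single spatial measurement, which a reconstruction algorithm can then demultiplex.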

  3. Population coding of affect across stimuli, modalities and individuals

    PubMed Central

    Chikazoe, Junichi; Lee, Daniel H.; Kriegeskorte, Nikolaus; Anderson, Adam K.

    2014-01-01

    It remains unclear how the brain represents external objective sensory events alongside our internal subjective impressions of them—affect. Representational mapping of population level activity evoked by complex scenes and basic tastes uncovered a neural code supporting a continuous axis of pleasant-to-unpleasant valence. This valence code was distinct from low-level physical and high-level object properties. While ventral temporal and anterior insular cortices supported valence codes specific to vision and taste, both the medial and lateral orbitofrontal cortices (OFC) maintained a valence code independent of sensory origin. Further, only the OFC code could classify experienced affect across participants. The entire valence spectrum is represented as a collective pattern in regional neural activity as sensory-specific and abstract codes, whereby the subjective quality of affect can be objectively quantified across stimuli, modalities, and people. PMID:24952643

  4. Methodology, status and plans for development and assessment of the code ATHLET

    SciTech Connect

    Teschendorff, V.; Austregesilo, H.; Lerchl, G.

    1997-07-01

    The thermal-hydraulic computer code ATHLET (Analysis of THermal-hydraulics of LEaks and Transients) is being developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) for the analysis of anticipated and abnormal plant transients, small and intermediate leaks as well as large breaks in light water reactors. The aim of the code development is to cover the whole spectrum of design basis and beyond design basis accidents (without core degradation) for PWRs and BWRs with only one code. The main code features are: advanced thermal-hydraulics; modular code architecture; separation between physical models and numerical methods; pre- and post-processing tools; portability. The code has features that are of special interest for applications to small leaks and transients with accident management, e.g. initialization by a steady-state calculation, full-range drift-flux model, dynamic mixture level tracking. The General Control Simulation Module of ATHLET is a flexible tool for the simulation of the balance-of-plant and control systems including the various operator actions in the course of accident sequences with AM measures. The code development is accompanied by a systematic and comprehensive validation program. A large number of integral experiments and separate effect tests, including the major International Standard Problems, have been calculated by GRS and by independent organizations. The ATHLET validation matrix is a well balanced set of integral and separate effects tests derived from the CSNI proposal emphasizing, however, the German combined ECC injection system which was investigated in the UPTF, PKL and LOBI test facilities.

  5. Reversibility and efficiency in coding protein information.

    PubMed

    Tamir, Boaz; Priel, Avner

    2010-12-21

    Why does the genetic code have a fixed length? Protein information is transferred by coding each amino acid using codons whose length equals 3 for all amino acids. Hence the most probable and the least probable amino acids get codewords of equal length. Moreover, the distributions of amino acids found in nature are not uniform and therefore the efficiency of such codes is sub-optimal. The origins of these apparently non-efficient codes are yet unclear. In this paper we propose an a priori argument for the energy efficiency of such codes resulting from their reversibility, in contrast to their time inefficiency. Such codes are reversible in the sense that a primitive processor, reading three letters in each step, can always reverse its operation, undoing its process. We examine the codes for the distributions of amino acids that exist in nature and show that they could not be both time efficient and reversible. We investigate a family of Zipf-type distributions and present their efficient (non-fixed length) prefix code, their graphs, and the condition for their reversibility. We prove that for a large family of such distributions, if the code is time efficient, it could not be reversible. In other words, if pre-biotic processes demand reversibility, the protein code could not be time efficient. The benefits of reversibility are clear: reversible processes are adiabatic, namely, they dissipate a very small amount of energy. Such processes must be done slowly enough; therefore time efficiency is unimportant. It is reasonable to assume that early biochemical complexes were more prone towards energy efficiency, where forward and backward processes were almost symmetrical. PMID:20868696
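    The time-efficiency half of the contrast above can be made concrete: for a non-uniform, Zipf-type distribution, an optimal variable-length prefix code has a shorter expected length than any fixed-length code. A hedged sketch (binary symbols rather than a 4-letter nucleotide alphabet, and a generic Zipf distribution over 20 amino acids; purely illustrative, not the paper's construction):

```python
import heapq

def huffman_lengths(probs):
    """Codeword lengths of an optimal binary Huffman code for `probs`."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, ids1 = heapq.heappop(heap)
        p2, ids2 = heapq.heappop(heap)
        for i in ids1 + ids2:             # every merge adds one bit
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, ids1 + ids2))
    return lengths

# Zipf-type distribution over the 20 amino acids
z = [1 / i for i in range((1), 21)]
probs = [x / sum(z) for x in z]
lengths = huffman_lengths(probs)
avg = sum(p * l for p, l in zip(probs, lengths))
# A fixed-length codon is 3 nucleotides = 6 bits of capacity; the prefix
# code needs fewer bits per amino acid on average, at the cost of losing
# the stepwise reversibility the paper argues for.
```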

  6. Studying the Independent School Library

    ERIC Educational Resources Information Center

    Cahoy, Ellysa Stern; Williamson, Susan G.

    2008-01-01

    In 2005, the American Association of School Librarians' Independent Schools Section conducted a national survey of independent school libraries. This article analyzes the results of the survey, reporting specialized data and information regarding independent school library budgets, collections, services, facilities, and staffing. Additionally, the…

  7. Prioritized LT Codes

    NASA Technical Reports Server (NTRS)

    Woo, Simon S.; Cheng, Michael K.

    2011-01-01

    The original Luby Transform (LT) coding scheme is extended to account for data transmissions where some information symbols in a message block are more important than others. Prioritized LT codes provide unequal error protection (UEP) of data on an erasure channel by modifying the original LT encoder. The prioritized algorithm improves high-priority data protection without penalizing low-priority data recovery. Moreover, low-latency decoding is also obtained for high-priority data due to fast encoding. Prioritized LT codes only require a slight change in the original encoding algorithm, and no changes at all at the decoder. Hence, with a small complexity increase in the LT encoder, improved UEP and lower decoding latency for high-priority data can be achieved. LT encoding partitions a data stream into fixed-sized message blocks, each with a constant number of information symbols. To generate a code symbol from the information symbols in a message, the robust soliton probability distribution is first applied in order to determine the number of information symbols to be used to compute the code symbol. Then, the specific information symbols are chosen uniformly at random from the message block. Finally, the selected information symbols are XORed to form the code symbol. The Prioritized LT code construction includes an additional restriction that code symbols formed by a relatively small number of XORed information symbols select some of these information symbols from the pool of high-priority data. Once high-priority data are fully covered, encoding continues with the conventional LT approach where code symbols are generated by selecting information symbols from the entire message block including all different priorities. Therefore, if code symbols derived from high-priority data experience an unusually high number of erasures, Prioritized LT codes can still reliably recover both high- and low-priority data.
This hybrid approach decides not only "how to encode
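    The encoding procedure described in this entry can be sketched as follows. This is a toy version under stated assumptions: it uses the ideal (not robust) soliton distribution, integer message symbols, and our own function names and degree cutoff parameter:

```python
import random

def ideal_soliton(k, rng):
    """Sample a degree from the ideal soliton distribution on 1..k."""
    u = rng.random()
    if u < 1.0 / k:
        return 1
    cdf = 1.0 / k
    for d in range(2, k + 1):
        cdf += 1.0 / (d * (d - 1))        # P(d) = 1/(d(d-1)) for d >= 2
        if u < cdf:
            return d
    return k

def prioritized_lt_encode(message, n_out, hi_idx, cutoff, rng):
    """Emit n_out (source-indices, XOR-symbol) pairs; code symbols of
    degree <= cutoff must draw one source symbol from the high-priority
    pool hi_idx, mirroring the restriction described above."""
    k = len(message)
    out = []
    for _ in range(n_out):
        d = ideal_soliton(k, rng)
        if d <= cutoff:                   # protect high-priority data
            first = rng.choice(hi_idx)
            rest = rng.sample([i for i in range(k) if i != first], d - 1)
            idxs = [first] + rest
        else:                             # conventional LT selection
            idxs = rng.sample(range(k), d)
        sym = 0
        for i in idxs:
            sym ^= message[i]             # code symbol = XOR of sources
        out.append((idxs, sym))
    return out
```

    The decoder is unchanged from standard LT peeling, consistent with the abstract's claim that only the encoder is modified.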

  8. Code Seal v 1.0

    2009-12-11

    CodeSeal is a technology developed by Sandia National Laboratories that provides a means of securely obfuscating finite state machines in a mathematically provable way. The technology was developed in order to provide a solution for anti-reverse engineering, assured execution, and integrity of execution. CodeSeal accomplishes these goals with the addition of the concept of a trust anchor, a small piece of trust integrated into the system, to the model of code obfuscation. Code obfuscation is an active area of academic research, but most findings have merely demonstrated that general obfuscation is impossible. By modifying the security model such that we may rely on the presence of a small, tamper-protected device, however, Sandia has developed an effective method for obfuscating code. An open publication describing the technology in more detail can be found at http://eprint.iacr.org/2008/184.pdf. Keywords: Independent Software/Hardware monitors, Use control, Supervisory Control And Data Acquisition (SCADA), Algorithm obfuscation

  9. Induction technology optimization code

    SciTech Connect

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-08-21

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top-level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. The Induction Technology Optimization Study (ITOS) was undertaken to examine viable combinations of a linear induction accelerator and a relativistic klystron (RK) for high power microwave production. It is proposed that microwaves from the RK will power a high-gradient accelerator structure for linear collider development. Previous work indicates that the RK will require a nominal 3-MeV, 3-kA electron beam with a 100-ns flat top. The proposed accelerator-RK combination will be a high average power system capable of sustained microwave output at a 300-Hz pulse repetition frequency. The ITOS code models many combinations of injector, accelerator, and pulse power designs that will supply an RK with the beam parameters described above.

  10. Coded source neutron imaging

    SciTech Connect

    Bingham, Philip R; Santos-Villalobos, Hector J

    2011-01-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.

  11. Code query by example

    NASA Astrophysics Data System (ADS)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but more of domain experts. Hence, they require a simple language to express custom rules.

  12. Adaptation and visual coding

    PubMed Central

    Webster, Michael A.

    2011-01-01

    Visual coding is a highly dynamic process, continuously adapting to the current viewing context. The perceptual changes that result from adaptation to recently viewed stimuli remain a powerful and popular tool for analyzing sensory mechanisms and plasticity. Over the last decade, the footprints of this adaptation have been tracked to both higher and lower levels of the visual pathway and over a wider range of timescales, revealing that visual processing is much more adaptable than previously thought. This work has also revealed that the pattern of aftereffects is similar across many stimulus dimensions, pointing to common coding principles in which adaptation plays a central role. However, why visual coding adapts has yet to be fully answered. PMID:21602298

  13. FAA Smoke Transport Code

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, an analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  14. Autocatalysis, information and coding.

    PubMed

    Wills, P R

    2001-01-01

    Autocatalytic self-construction in macromolecular systems requires the existence of a reflexive relationship between structural components and the functional operations they perform to synthesise themselves. The possibility of reflexivity depends on formal, semiotic features of the catalytic structure-function relationship, that is, the embedding of catalytic functions in the space of polymeric structures. Reflexivity is a semiotic property of some genetic sequences. Such sequences may serve as the basis for the evolution of coding as a result of autocatalytic self-organisation in a population of assignment catalysts. Autocatalytic selection is a mechanism whereby matter becomes differentiated in primitive biochemical systems. In the case of coding self-organisation, it corresponds to the creation of symbolic information. Prions are present-day entities whose replication through autocatalysis reflects aspects of biological semiotics less obvious than genetic coding.

  15. Temporal Coding of Volumetric Imagery

    NASA Astrophysics Data System (ADS)

    Llull, Patrick Ryan

    'Image volumes' refer to realizations of images in other dimensions such as time, spectrum, and focus. Recent advances in scientific, medical, and consumer applications demand improvements in image volume capture. Though image volume acquisition continues to advance, it maintains the same sampling mechanisms that have been used for decades; every voxel must be scanned and is presumed independent of its neighbors. Under these conditions, improving performance comes at the cost of increased system complexity, data rates, and power consumption. This dissertation explores systems and methods capable of efficiently improving sensitivity and performance for image volume cameras, and specifically proposes several sampling strategies that utilize temporal coding to improve imaging system performance and enhance our awareness for a variety of dynamic applications. Video cameras and camcorders sample the video volume (x,y,t) at fixed intervals to gain understanding of the volume's temporal evolution. Conventionally, one must reduce the spatial resolution to increase the framerate of such cameras. Using temporal coding via physical translation of an optical element known as a coded aperture, the coded aperture compressive temporal imaging (CACTI) camera demonstrates a method with which to embed the temporal dimension of the video volume into spatial (x,y) measurements, thereby greatly improving temporal resolution with minimal loss of spatial resolution. This technique, which is among a family of compressive sampling strategies developed at Duke University, temporally codes the exposure readout functions at the pixel level. Since video cameras nominally integrate the remaining image volume dimensions (e.g. spectrum and focus) at capture time, spectral (x,y,t,lambda) and focal (x,y,t,z) image volumes are traditionally captured via sequential changes to the spectral and focal state of the system, respectively. 
The CACTI camera's ability to embed video volumes into images leads to exploration

  16. Identifying personal microbiomes using metagenomic codes

    PubMed Central

    Franzosa, Eric A.; Huang, Katherine; Meadow, James F.; Gevers, Dirk; Lemon, Katherine P.; Bohannan, Brendan J. M.; Huttenhower, Curtis

    2015-01-01

    Community composition within the human microbiome varies across individuals, but it remains unknown if this variation is sufficient to uniquely identify individuals within large populations or stable enough to identify them over time. We investigated this by developing a hitting set-based coding algorithm and applying it to the Human Microbiome Project population. Our approach defined body site-specific metagenomic codes: sets of microbial taxa or genes prioritized to uniquely and stably identify individuals. Codes capturing strain variation in clade-specific marker genes were able to distinguish among 100s of individuals at an initial sampling time point. In comparisons with follow-up samples collected 30–300 d later, ∼30% of individuals could still be uniquely pinpointed using metagenomic codes from a typical body site; coincidental (false positive) matches were rare. Codes based on the gut microbiome were exceptionally stable and pinpointed >80% of individuals. The failure of a code to match its owner at a later time point was largely explained by the loss of specific microbial strains (at current limits of detection) and was only weakly associated with the length of the sampling interval. In addition to highlighting patterns of temporal variation in the ecology of the human microbiome, this work demonstrates the feasibility of microbiome-based identifiability—a result with important ethical implications for microbiome study design. The datasets and code used in this work are available for download from huttenhower.sph.harvard.edu/idability. PMID:25964341
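    The hitting-set idea behind these metagenomic codes can be sketched greedily: pick features the target individual carries until every other individual lacks at least one of them. This is a toy version with invented data and function names; the published algorithm additionally prioritizes features for temporal stability:

```python
def greedy_code(target, features):
    """Greedy hitting-set 'code' for `target`: repeatedly pick the
    feature of the target that the most not-yet-distinguished other
    individuals lack, until all others are distinguished."""
    remaining = {o for o in features if o != target}
    code = []
    while remaining:
        best = max(features[target],
                   key=lambda f: sum(f not in features[o] for o in remaining))
        hit = {o for o in remaining if best not in features[o]}
        if not hit:
            raise ValueError("target is not uniquely identifiable")
        code.append(best)
        remaining -= hit
    return code
```

    Re-checking the same code against samples collected later then tests identifiability over time, as in the comparisons described above.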

  17. Scalable L-infinite coding of meshes.

    PubMed

    Munteanu, Adrian; Cernea, Dan C; Alecu, Alin; Cornelis, Jan; Schelkens, Peter

    2010-01-01

    The paper investigates the novel concept of local-error control in mesh geometry encoding. In contrast to traditional mesh-coding systems that use the mean-square error as target distortion metric, this paper proposes a new L-infinite mesh-coding approach, for which the target distortion metric is the L-infinite distortion. In this context, a novel wavelet-based L-infinite-constrained coding approach for meshes is proposed, which ensures that the maximum error between the vertex positions in the original and decoded meshes is lower than a given upper bound. Furthermore, the proposed system achieves scalability in L-infinite sense, that is, any decoding of the input stream will correspond to a perfectly predictable L-infinite distortion upper bound. An instantiation of the proposed L-infinite-coding approach is demonstrated for MESHGRID, which is a scalable 3D object encoding system, part of MPEG-4 AFX. In this context, the advantages of scalable L-infinite coding over L-2-oriented coding are experimentally demonstrated. One concludes that the proposed L-infinite mesh-coding approach guarantees an upper bound on the local error in the decoded mesh, enables a fast real-time implementation of the rate allocation, and preserves all the scalability features and animation capabilities of the employed scalable mesh codec. PMID:20224144
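    The distinction the paper draws between L-2-oriented and L-infinite distortion can be made concrete with a small numeric sketch (toy vertex data, not the MESHGRID codec):

```python
import math

def linf_error(orig, decoded):
    # maximum per-vertex (local) geometric error
    return max(math.dist(p, q) for p, q in zip(orig, decoded))

def mse_error(orig, decoded):
    # mean-square error: averaging can hide a large local spike
    return sum(math.dist(p, q) ** 2 for p, q in zip(orig, decoded)) / len(orig)

orig    = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
decoded = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.5)]

# The MSE looks small because three vertices are perfect, while the
# L-infinite metric exposes the single badly reconstructed vertex.
print(mse_error(orig, decoded))   # 0.0625
print(linf_error(orig, decoded))  # 0.5
```

    An L-infinite-constrained encoder guarantees the second number stays below a chosen bound at every decoded rate, which an MSE target cannot promise.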

  18. On Coding Non-Contiguous Letter Combinations

    PubMed Central

    Dandurand, Frédéric; Grainger, Jonathan; Duñabeitia, Jon Andoni; Granier, Jean-Pierre

    2011-01-01

    Starting from the hypothesis that printed word identification initially involves the parallel mapping of visual features onto location-specific letter identities, we analyze the type of information that would be involved in optimally mapping this location-specific orthographic code onto a location-invariant lexical code. We assume that some intermediate level of coding exists between individual letters and whole words, and that this involves the representation of letter combinations. We then investigate the nature of this intermediate level of coding given the constraints of optimality. This intermediate level of coding is expected to compress data while retaining as much information as possible about word identity. Information conveyed by letters is a function of how much they constrain word identity and how visible they are. Optimization of this coding is a combination of minimizing resources (using the most compact representations) and maximizing information. We show that in a large proportion of cases, non-contiguous letter sequences contain more information than contiguous sequences, while at the same time requiring less precise coding. Moreover, we found that the best predictor of human performance in orthographic priming experiments was within-word ranking of conditional probabilities, rather than average conditional probabilities. We conclude that from an optimality perspective, readers learn to select certain contiguous and non-contiguous letter combinations as information that provides the best cue to word identity. PMID:21734901

  19. Clinical Reasoning of Physical Therapists regarding In-hospital Walking Independence of Patients with Hemiplegia

    PubMed Central

    Takahashi, Junpei; Takami, Akiyoshi; Wakayama, Saichi

    2014-01-01

    [Purpose] Physical therapists must often determine whether hemiparetic patients can walk independently. However, there are no criteria, so decisions are often left to individual physical therapists. The purpose of this study was to explore how physical therapists determine whether a patient with hemiplegia can walk independently in a ward. [Methods] The subjects were 15 physical therapists with experience in stroke patients’ rehabilitation. We conducted semi-structured interviews about their criteria for judging the ward walking status of hemiparetic patients. The interviews were transcribed in full, and the texts were analyzed by coding and grouping. [Results] The interviews showed that PTs determined patients’ in-hospital walking independence by observing behavior during walking or treatment. The majority of PTs focused on the patients’ state during walking, higher brain function, and their ability to balance. In addition, they often asked ward staff about patients’ daily life and self-determination. [Conclusions] We identified the items examined by physical therapists when determining the in-hospital walking independence of stroke patients. Further investigation is required to examine which of these items are truly necessary. PMID:24926149

  20. Code inspection instructional validation

    NASA Technical Reports Server (NTRS)

    Orr, Kay; Stancil, Shirley

    1992-01-01

    The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine if this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive), for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. Following are conclusions of this study. (1) The course is instructionally effective. (2) The simulation has a positive effect on student's confidence in his ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option which involves creation of new source code and data files, but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes this option

  1. The Comparative Performance of Conditional Independence Indices

    ERIC Educational Resources Information Center

    Kim, Doyoung; De Ayala, R. J.; Ferdous, Abdullah A.; Nering, Michael L.

    2011-01-01

    To realize the benefits of item response theory (IRT), one must have model-data fit. One facet of a model-data fit investigation involves assessing the tenability of the conditional item independence (CII) assumption. In this Monte Carlo study, the comparative performance of 10 indices for identifying conditional item dependence is assessed. The…

  2. Securing mobile code.

    SciTech Connect

    Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas; Campbell, Philip LaRoche; Beaver, Cheryl Lynn; Pierson, Lyndon George; Anderson, William Erik

    2004-10-01

    If software is designed so that the software can issue functions that will move that software from one computing platform to another, then the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinions regarding how to secure mobile code. There are those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates by decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes that neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation to render an entire program or a data segment on which a program depends incomprehensible. The hope is to prevent or at least slow down reverse engineering efforts and to prevent goal-oriented attacks on the software and execution. The field of obfuscation is still in a state of development with the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in depth analysis of a technique called 'white-boxing'. 
We put forth some new attacks and improvements

  3. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor

  4. Light curves for bump Cepheids computed with a dynamically zoned pulsation code

    NASA Astrophysics Data System (ADS)

    Adams, T. F.; Castor, J. I.; Davis, C. G.

    1980-05-01

    The dynamically zoned pulsation code developed by Castor, Davis, and Davison was used to recalculate the Goddard model and to calculate three other Cepheid models with the same period (9.8 days). This family of models shows how the bumps and other features of the light and velocity curves change as the mass is varied at constant period. The use of a code that is capable of producing reliable light curves demonstrates that the light and velocity curves for 9.8 day Cepheid models with standard homogeneous compositions do not show bumps like those that are observed unless the mass is significantly lower than the 'evolutionary mass.' The light and velocity curves for the Goddard model presented here are similar to those computed independently by Fischel, Sparks, and Karp. They should be useful as standards for future investigators.

  5. Light curves for bump Cepheids computed with a dynamically zoned pulsation code

    NASA Technical Reports Server (NTRS)

    Adams, T. F.; Castor, J. I.; Davis, C. G.

    1980-01-01

    The dynamically zoned pulsation code developed by Castor, Davis, and Davison was used to recalculate the Goddard model and to calculate three other Cepheid models with the same period (9.8 days). This family of models shows how the bumps and other features of the light and velocity curves change as the mass is varied at constant period. The use of a code that is capable of producing reliable light curves demonstrates that the light and velocity curves for 9.8 day Cepheid models with standard homogeneous compositions do not show bumps like those that are observed unless the mass is significantly lower than the 'evolutionary mass.' The light and velocity curves for the Goddard model presented here are similar to those computed independently by Fischel, Sparks, and Karp. They should be useful as standards for future investigators.

  6. High-Speed Soft-Decision Decoding of Two Reed-Muller Codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Uehara, Gregory T.

    1996-01-01

    In this research, we have proposed the (64, 40, 8) subcode of the third-order Reed-Muller (RM) code to NASA for high-speed satellite communications. This RM subcode can be used either alone or as an inner code of a concatenated coding system with the NASA standard (255, 233, 33) Reed-Solomon (RS) code as the outer code to achieve high performance (or low bit-error rate) with reduced decoding complexity. It can also be used as a component code in a multilevel bandwidth efficient coded modulation system to achieve reliable bandwidth efficient data transmission. This report will summarize the key progress we have made toward achieving our eventual goal of implementing, a decoder system based upon this code. In the first phase of study, we investigated the complexities of various sectionalized trellis diagrams for the proposed (64, 40, 8) RM subcode. We found a specific 8-trellis diagram for this code which requires the least decoding complexity with a high possibility of achieving a decoding speed of 600 M bits per second (Mbps). The combination of a large number of states and a high data rate will be made possible due to the utilization of a high degree of parallelism throughout the architecture. This trellis diagram will be presented and briefly described. In the second phase of study, which was carried out through the past year, we investigated circuit architectures to determine the feasibility of VLSI implementation of a high-speed Viterbi decoder based on this 8-section trellis diagram. We began to examine specific design and implementation approaches to implement a fully custom integrated circuit (IC) which will be a key building block for a decoder system implementation. The key results will be presented in this report. This report will be divided into three primary sections. First, we will briefly describe the system block diagram in which the proposed decoder is assumed to be operating, and present some of the key architectural approaches being used to

  7. High-Speed Soft-Decision Decoding of Two Reed-Muller Codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Uehara, Gregory T.

    1996-01-01

    In this research, we have proposed the (64, 40, 8) subcode of the third-order Reed-Muller (RM) code to NASA for high-speed satellite communications. This RM subcode can be used either alone or as an inner code of a concatenated coding system with the NASA standard (255, 233, 33) Reed-Solomon (RS) code as the outer code to achieve high performance (or low bit-error rate) with reduced decoding complexity. It can also be used as a component code in a multilevel bandwidth efficient coded modulation system to achieve reliable bandwidth efficient data transmission. This report will summarize the key progress we have made toward achieving our eventual goal of implementing a decoder system based upon this code. In the first phase of study, we investigated the complexities of various sectionalized trellis diagrams for the proposed (64, 40, 8) RM subcode. We found a specific 8-trellis diagram for this code which requires the least decoding complexity with a high possibility of achieving a decoding speed of 600 M bits per second (Mbps). The combination of a large number of states and a high data rate will be made possible due to the utilization of a high degree of parallelism throughout the architecture. This trellis diagram will be presented and briefly described. In the second phase of study, which was carried out through the past year, we investigated circuit architectures to determine the feasibility of VLSI implementation of a high-speed Viterbi decoder based on this 8-section trellis diagram. We began to examine specific design and implementation approaches to implement a fully custom integrated circuit (IC) which will be a key building block for a decoder system implementation. The key results will be presented in this report. This report will be divided into three primary sections. First, we will briefly describe the system block diagram in which the proposed decoder is assumed to be operating, and present some of the key architectural approaches being used to

  8. Numerical MHD codes for modeling astrophysical flows

    NASA Astrophysics Data System (ADS)

    Koldoba, A. V.; Ustyugova, G. V.; Lii, P. S.; Comins, M. L.; Dyda, S.; Romanova, M. M.; Lovelace, R. V. E.

    2016-05-01

    We describe a Godunov-type magnetohydrodynamic (MHD) code based on the Miyoshi and Kusano (2005) solver which can be used to solve various astrophysical hydrodynamic and MHD problems. The energy equation is in the form of entropy conservation. The code has been implemented on several different coordinate systems: 2.5D axisymmetric cylindrical coordinates, 2D Cartesian coordinates, 2D plane polar coordinates, and fully 3D cylindrical coordinates. Viscosity and diffusivity are implemented in the code to control the accretion rate in the disk and the rate of penetration of the disk matter through the magnetic field lines. The code has been utilized for the numerical investigations of a number of different astrophysical problems, several examples of which are shown.

  9. Modular optimization code package: MOZAIK

    NASA Astrophysics Data System (ADS)

    Bekar, Kursat B.

    This dissertation addresses the development of a modular optimization code package, MOZAIK, for geometric shape optimization problems in nuclear engineering applications. MOZAIK's first mission, determining the optimal shape of the D2O moderator tank for the current and new beam tube configurations for the Penn State Breazeale Reactor's (PSBR) beam port facility, is used to demonstrate its capabilities and test its performance. MOZAIK was designed as a modular optimization sequence including three primary independent modules: the initializer, the physics and the optimizer, each having a specific task. By using fixed interface blocks among the modules, the code attains its two most important characteristics: generic form and modularity. The benefit of this modular structure is that the contents of the modules can be switched depending on the requirements of accuracy, computational efficiency, or compatibility with the other modules. Oak Ridge National Laboratory's discrete ordinates transport code TORT was selected as the transport solver in the physics module of MOZAIK, and two different optimizers, Min-max and Genetic Algorithms (GA), were implemented in the optimizer module of the code package. A distributed memory parallelism was also applied to MOZAIK via MPI (Message Passing Interface) to execute the physics module concurrently on a number of processors for various states in the same search. Moreover, dynamic scheduling was enabled to enhance load balance among the processors while running MOZAIK's physics module thus improving the parallel speedup and efficiency. In this way, the total computation time consumed by the physics module is reduced by a factor close to M, where M is the number of processors. This capability also encourages the use of MOZAIK for shape optimization problems in nuclear applications because many traditional codes related to radiation transport do not have parallel execution capability. A set of computational models based on the

  10. Codes with Monotonic Codeword Lengths.

    ERIC Educational Resources Information Center

    Abrahams, Julia

    1994-01-01

    Discusses the minimum average codeword length coding under the constraint that the codewords are monotonically nondecreasing in length. Bounds on the average length of an optimal monotonic code are derived, and sufficient conditions are given such that algorithms for optimal alphabetic codes can be used to find the optimal monotonic code. (six…
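    The monotonic-length constraint discussed in this record admits a simple feasibility check: a binary prefix code with a given sequence of codeword lengths, nondecreasing in symbol order, exists exactly when the lengths satisfy the Kraft inequality (a sketch of the check only; the paper's bounds and optimal-code algorithms are not reproduced here):

```python
def is_valid_monotonic_code(lengths):
    """Feasibility test: a binary prefix code with these codeword lengths,
    assigned to symbols in order, exists iff the lengths are nondecreasing
    and satisfy the Kraft inequality sum(2^-l) <= 1."""
    nondecreasing = all(a <= b for a, b in zip(lengths, lengths[1:]))
    kraft = sum(2 ** -l for l in lengths) <= 1
    return nondecreasing and kraft

print(is_valid_monotonic_code([1, 2, 3, 3]))  # True: e.g. 0, 10, 110, 111
print(is_valid_monotonic_code([3, 1, 2]))     # False: lengths decrease
print(is_valid_monotonic_code([1, 1, 2]))     # False: Kraft sum exceeds 1
```

    Canonical (lexicographic) assignment of codewords in order of nondecreasing length then yields the monotonic code whenever the check passes.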

  11. Accumulate Repeat Accumulate Coded Modulation

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative coded modulation scheme called 'Accumulate Repeat Accumulate Coded Modulation' (ARA coded modulation). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes that are combined with high level modulation. Thus at the decoder belief propagation can be used for iterative decoding of ARA coded modulation on a graph, provided a demapper transforms the received in-phase and quadrature samples to reliability of the bits.

  12. Benchmark study between FIDAP and a cellular automata code

    NASA Astrophysics Data System (ADS)

    Akau, R. L.; Stockman, H. W.

    A fluid flow benchmark exercise was conducted to compare results between a cellular automata code and FIDAP. Cellular automata codes are free from gridding constraints, and are generally used to model slow (Reynolds number approximately 1) flows around complex solid obstacles. However, the accuracy of cellular automata codes at higher Reynolds numbers, where inertial terms are significant, is not well-documented. In order to validate the cellular automata code, two fluids problems were investigated. For both problems, flow was assumed to be laminar, two-dimensional, isothermal, incompressible and periodic. Results showed that the cellular automata code simulated the overall behavior of the flow field.

  13. Electrical Circuit Simulation Code

    SciTech Connect

    Wix, Steven D.; Waters, Arlon J.; Shirley, David

    2001-08-09

    Massively-Parallel Electrical Circuit Simulation Code. CHILESPICE is a massively-parallel distributed-memory electrical circuit simulation tool that contains many enhanced radiation, time-based, and thermal features and models. Large scale electronic circuit simulation. Shared memory, parallel processing, enhanced convergence. Sandia specific device models.

  14. The revised genetic code

    NASA Astrophysics Data System (ADS)

    Ninio, Jacques

    1990-03-01

    Recent findings on the genetic code are reviewed, including selenocysteine usage, deviations in the assignments of sense and nonsense codons, RNA editing, natural ribosomal frameshifts and non-orthodox codon-anticodon pairings. A multi-stage codon reading process is presented.

  15. Dual Coding in Children.

    ERIC Educational Resources Information Center

    Burton, John K.; Wildman, Terry M.

    The purpose of this study was to test the applicability of the dual coding hypothesis to children's recall performance. The hypothesis predicts that visual interference will have a small effect on the recall of visually presented words or pictures, but that acoustic interference will cause a decline in recall of visually presented words and…

  16. Dress Codes and Uniforms.

    ERIC Educational Resources Information Center

    Lumsden, Linda; Miller, Gabriel

    2002-01-01

    Students do not always make choices that adults agree with in their choice of school dress. Dress-code issues are explored in this Research Roundup, and guidance is offered to principals seeking to maintain a positive school climate. In "Do School Uniforms Fit?" Kerry White discusses arguments for and against school uniforms and summarizes the…

  17. Code of Ethics.

    ERIC Educational Resources Information Center

    Association of College Unions-International, Bloomington, IN.

    The code of ethics for the college union and student activities professional is presented by the Association of College Unions-International. The preamble identifies the objectives of the college union as providing campus community centers and social programs that enhance the quality of life for members of the academic community. Ethics for…

  18. Odor Coding Sensor

    NASA Astrophysics Data System (ADS)

    Hayashi, Kenshi

    Odor is one of the important sensing parameters for human life. However, odor has not been quantified by measuring instruments because of its vagueness. In this paper, the measurement of odor with odor codes, which are vector quantities of plural odor molecular information, and its applications are described.

  19. Sharing the Code.

    ERIC Educational Resources Information Center

    Olsen, Florence

    2003-01-01

    Colleges and universities are beginning to consider collaborating on open-source-code projects as a way to meet critical software and computing needs. Points out the attractive features of noncommercial open-source software and describes some examples in use now, especially for the creation of Web infrastructure. (SLD)

  20. Building Codes and Regulations.

    ERIC Educational Resources Information Center

    Fisher, John L.

    The hazard of fire is of great concern to libraries due to combustible books and new plastics used in construction and interiors. Building codes and standards can offer architects and planners guidelines to follow but these standards should be closely monitored, updated, and researched for fire prevention. (DS)

  1. Code Optimization Techniques

    SciTech Connect

    MAGEE,GLEN I.

    2000-08-03

    Computers transfer data in a number of different ways. Whether through a serial port, a parallel port, over a modem, over an ethernet cable, or internally from a hard disk to memory, some data will be lost. To compensate for that loss, numerous error detection and correction algorithms have been developed. One of the most common error correction codes is the Reed-Solomon code, which is a special subset of BCH (Bose-Chaudhuri-Hocquenghem) linear cyclic block codes. In the AURA project, an unmanned aircraft sends the data it collects back to earth so it can be analyzed during flight and possible flight modifications made. To counter possible data corruption during transmission, the data is encoded using a multi-block Reed-Solomon implementation with a possibly shortened final block. In order to maximize the amount of data transmitted, it was necessary to reduce the computation time of a Reed-Solomon encoding to three percent of the processor's time. To achieve such a reduction, many code optimization techniques were employed. This paper outlines the steps taken to reduce the processing time of a Reed-Solomon encoding and the insight into modern optimization techniques gained from the experience.
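    The kind of optimization described here can be illustrated on the Galois-field multiplications at the heart of Reed-Solomon encoding: replacing the per-bit multiply loop with log/antilog table lookups (a generic GF(2^8) sketch using the common primitive polynomial 0x11d; the abstract does not specify AURA's actual field parameters or the techniques used):

```python
# Build log/antilog tables for GF(2^8) with primitive polynomial 0x11d.
# EXP is doubled in length so gf_mul_fast needs no modular reduction.
EXP = [0] * 512
LOG = [0] * 256
x = 1
for i in range(255):
    EXP[i] = x
    LOG[x] = i
    x <<= 1
    if x & 0x100:
        x ^= 0x11D
for i in range(255, 512):
    EXP[i] = EXP[i - 255]

def gf_mul_slow(a, b):
    # bitwise "Russian peasant" multiply: up to 8 shift/XOR steps per call
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= 0x11D
        b >>= 1
    return r

def gf_mul_fast(a, b):
    # table-driven multiply: one integer add and two lookups
    if a == 0 or b == 0:
        return 0
    return EXP[LOG[a] + LOG[b]]

print(hex(gf_mul_fast(0x80, 2)))  # 0x1d, i.e. x^8 reduced mod 0x11d
```

    Precomputing tables like these (and unrolling the encoder's inner loop over them) is a standard way to cut Reed-Solomon encoding time by an order of magnitude on general-purpose CPUs.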

  2. The Redox Code

    PubMed Central

    Jones, Dean P.

    2015-01-01

    Abstract Significance: The redox code is a set of principles that defines the positioning of the nicotinamide adenine dinucleotide (NAD, NADP) and thiol/disulfide and other redox systems as well as the thiol redox proteome in space and time in biological systems. The code is richly elaborated in an oxygen-dependent life, where activation/deactivation cycles involving O2 and H2O2 contribute to spatiotemporal organization for differentiation, development, and adaptation to the environment. Disruption of this organizational structure during oxidative stress represents a fundamental mechanism in system failure and disease. Recent Advances: Methodology in assessing components of the redox code under physiological conditions has progressed, permitting insight into spatiotemporal organization and allowing for identification of redox partners in redox proteomics and redox metabolomics. Critical Issues: Complexity of redox networks and redox regulation is being revealed step by step, yet much still needs to be learned. Future Directions: Detailed knowledge of the molecular patterns generated from the principles of the redox code under defined physiological or pathological conditions in cells and organs will contribute to understanding the redox component in health and disease. Ultimately, there will be a scientific basis to a modern redox medicine. Antioxid. Redox Signal. 23, 734–746. PMID:25891126

  3. Code of Ethics.

    ERIC Educational Resources Information Center

    American Sociological Association, Washington, DC.

    The American Sociological Association's code of ethics for sociologists is presented. For sociological research and practice, 10 requirements for ethical behavior are identified, including: maintaining objectivity and integrity; fully reporting findings and research methods, without omission of significant data; reporting fully all sources of…

  4. Automated searching for quantum subsystem codes

    SciTech Connect

    Crosswhite, Gregory M.; Bacon, Dave

    2011-02-15

    Quantum error correction allows for faulty quantum systems to behave in an effectively error-free manner. One important class of techniques for quantum error correction is the class of quantum subsystem codes, which are relevant both to active quantum error-correcting schemes as well as to the design of self-correcting quantum memories. Previous approaches for investigating these codes have focused on applying theoretical analysis to look for interesting codes and to investigate their properties. In this paper we present an alternative approach that uses computational analysis to accomplish the same goals. Specifically, we present an algorithm that computes the optimal quantum subsystem code that can be implemented given an arbitrary set of measurement operators that are tensor products of Pauli operators. We then demonstrate the utility of this algorithm by performing a systematic investigation of the quantum subsystem codes that exist in the setting where the interactions are limited to two-body interactions between neighbors on lattices derived from the convex uniform tilings of the plane.

  5. Ideal Binocular Disparity Detectors Learned Using Independent Subspace Analysis on Binocular Natural Image Pairs

    PubMed Central

    Hunter, David W.; Hibbard, Paul B.

    2016-01-01

    An influential theory of mammalian vision, known as the efficient coding hypothesis, holds that early stages in the visual cortex attempt to form an efficient coding of ecologically valid stimuli. Although numerous authors have successfully modelled some aspects of early vision mathematically, closer inspection has found substantial discrepancies between the predictions of some of these models and observations of neurons in the visual cortex. In particular, analysis of linear-non-linear models of simple cells using Independent Component Analysis has found a strong bias towards features on the horopter. In order to investigate the link between the information content of binocular images, mathematical models of complex cells and physiological recordings, we applied Independent Subspace Analysis to binocular image patches in order to learn a set of complex-cell-like models. We found that these complex-cell-like models exhibited a wide range of binocular disparity-discriminability, although only a minority exhibited high binocular discrimination scores. However, in common with the linear-non-linear model case, we found that feature detection was limited to the horopter, suggesting that current mathematical models are limited in their ability to explain the functionality of the visual cortex. PMID:26982184

  6. Fire investigation

    NASA Astrophysics Data System (ADS)

    Gomberg, A.

Considerable progress has been made on several fronts of fire investigation in the United States in recent years. The quantity of fire investigation and reporting has increased through efforts to develop the National Fire Incident Reporting System. Improving the overall quality of fire investigation is the objective of efforts such as the Fire Investigation Handbook, developed and published by the National Bureau of Standards, and the upgrading and expansion of the "dictionary" of fire investigation and reporting, the NFPA 901, Uniform Coding for Fire Protection, system. The science of fire investigation was furthered also by new approaches to post-fire interviews being developed at the University of Washington, and by in-depth research into factors involved in several large-loss fires, including the MGM Grand Hotel in Las Vegas. Finally, the use of special-study fire investigations, in-depth investigations concentrating on specific fire problems, is producing new glimpses into the nature of the national fire problem. The status of efforts in each of these areas is briefly described.

  7. FEFF5: An ab initio multiple scattering XAFS code

    SciTech Connect

    Rehr, J.J.; Zabinsky, S.I.

    1992-12-31

FEFF5 is an efficient automated code which calculates multiple scattering (MS) curved-wave XAFS spectra for molecules and solids. The theoretical ingredients and approximations contained in the code are reviewed, with the aim of describing how XAFS spectra are efficiently simulated. The FEFF5 code consists of four independent modules: a scattering potential and phase shift module, a path finder module, a scattering amplitude module, and an XAFS module. Multiple scattering Debye-Waller factors are built in using a correlated Debye model.

  8. Enforcing the International Code of Marketing of Breast-milk Substitutes for Better Promotion of Exclusive Breastfeeding: Can Lessons Be Learned?

    PubMed

    Barennes, Hubert; Slesak, Guenther; Goyet, Sophie; Aaron, Percy; Srour, Leila M

    2016-02-01

    Exclusive breastfeeding, one of the best natural resources, needs protection and promotion. The International Code of Marketing of Breast-milk Substitutes (the Code), which aims to prevent the undermining of breastfeeding by formula advertising, faces implementation challenges. We reviewed frequently overlooked challenges and obstacles that the Code is facing worldwide, but particularly in Southeast Asia. Drawing lessons from various countries where we work, and following the example of successful public health interventions, we discussed legislation, enforcement, and experiences that are needed to successfully implement the Code. Successful holistic approaches that have strengthened the Code need to be scaled up. Community-based actions and peer-to-peer promotions have proved successful. Legislation without stringent enforcement and sufficient penalties is ineffective. The public needs education about the benefits and ways and means to support breastfeeding. It is crucial to combine strong political commitment and leadership with strict national regulations, definitions, and enforcement. National breastfeeding committees, with the authority to improve regulations, investigate violations, and enforce the laws, must be established. Systematic monitoring and reporting are needed to identify companies, individuals, intermediaries, and practices that infringe on the Code. Penalizing violators is crucial. Managers of multinational companies must be held accountable for international violations, and international legislative enforcement needs to be established. Further measures should include improved regulations to protect the breastfeeding mother: large-scale education campaigns; strong penalties for Code violators; exclusion of the formula industry from nutrition, education, and policy roles; supportive legal networks; and independent research of interventions supporting breastfeeding. PMID:26416439

  10. A neural network model of general olfactory coding in the insect antennal lobe.

    PubMed

    Getz, W M; Lutz, A

    1999-08-01

A central problem in olfaction is understanding how the quality of olfactory stimuli is encoded in the insect antennal lobe (or in the analogously structured vertebrate olfactory bulb) for perceptual processing in the mushroom bodies of the insect protocerebrum (or in the vertebrate olfactory cortex). In the study reported here, a relatively simple neural network model, inspired by our current knowledge of the insect antennal lobes, is used to investigate how each of several features and elements of the network, such as synapse strengths, feedback circuits and the steepness of neural activation functions, influences the formation of an olfactory code in neurons that project from the antennal lobes to the mushroom bodies (or from mitral cells to olfactory cortex). An optimal code in these projection neurons (PNs) should minimize potential errors by the mushroom bodies in misidentifying the quality of an odor across a range of concentrations while maximizing the ability of the mushroom bodies to resolve odors of different quality. Simulation studies demonstrate that the network is able to produce codes independent or virtually independent of concentration over a given range. The extent of this range is moderately dependent on a parameter that characterizes how long it takes for the voltage in an activated neuron to decay back to its resting potential, strongly dependent on the strength of excitatory feedback by the PNs onto antennal lobe intrinsic neurons (INs), and overwhelmingly dependent on the slope of the activation function that transforms the voltage of depolarized neurons into the rate at which spikes are produced. Although the code in the PNs is degraded by large variations in the concentration of odor stimuli, good performance levels are maintained when the complexity of stimuli, as measured by the number of component odorants, is doubled. When excitatory feedback from the PNs to the INs is strong, the activity in the PNs undergoes transitions from initial

  11. Binary coding for hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Chang, Chein-I.; Chang, Chein-Chi; Lin, Chinsu

    2004-10-01

Binary coding is one of the simplest ways to characterize spectral features. One commonly used method is a binary coding-based image software system, called Spectral Analysis Manager (SPAM), developed for remotely sensed imagery by Mazer et al. For a given spectral signature, SPAM calculates its spectral mean and inter-band spectral differences and uses them as thresholds to generate a binary code word for that particular spectral signature. Such a coding scheme is generally effective and also very simple to implement. This paper revisits SPAM and further develops three new SPAM-based binary coding methods, called equal probability partition (EPP) binary coding, halfway partition (HP) binary coding, and median partition (MP) binary coding. These three binary coding methods, along with SPAM, will be evaluated for spectral discrimination and identification. In doing so, a new criterion, called a posteriori discrimination probability (APDP), is also introduced as a performance measure.
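The thresholding idea in the abstract can be sketched in a few lines. This is an illustrative reconstruction of SPAM-style binary coding (one bit per band against the spectral mean, one bit per inter-band difference), not the authors' exact implementation:

```python
import numpy as np

def spam_binary_code(signature):
    """Illustrative SPAM-style binary coding: one bit per band
    (1 if the band exceeds the spectral mean) concatenated with one
    bit per inter-band difference (1 if the signature rises)."""
    signature = np.asarray(signature, dtype=float)
    amplitude_bits = (signature > signature.mean()).astype(int)
    slope_bits = (np.diff(signature) > 0).astype(int)
    return np.concatenate([amplitude_bits, slope_bits])

code = spam_binary_code([1.0, 4.0, 3.0, 8.0])
# mean = 4.0 -> amplitude bits [0, 0, 0, 1]; slope bits [1, 0, 1]
```

Code words built this way can be compared cheaply with a Hamming distance, which is what makes binary coding attractive for fast spectral screening.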

  12. TACO: a finite element heat transfer code

    SciTech Connect

    Mason, W.E. Jr.

    1980-02-01

    TACO is a two-dimensional implicit finite element code for heat transfer analysis. It can perform both linear and nonlinear analyses and can be used to solve either transient or steady state problems. Either plane or axisymmetric geometries can be analyzed. TACO has the capability to handle time or temperature dependent material properties and materials may be either isotropic or orthotropic. A variety of time and temperature dependent loadings and boundary conditions are available including temperature, flux, convection, and radiation boundary conditions and internal heat generation. Additionally, TACO has some specialized features such as internal surface conditions (e.g., contact resistance), bulk nodes, enclosure radiation with view factor calculations, and chemical reactive kinetics. A user subprogram feature allows for any type of functional representation of any independent variable. A bandwidth and profile minimization option is also available in the code. Graphical representation of data generated by TACO is provided by a companion post-processor named POSTACO. The theory on which TACO is based is outlined, the capabilities of the code are explained, the input data required to perform an analysis with TACO are described. Some simple examples are provided to illustrate the use of the code.

  13. High-Fidelity Coding with Correlated Neurons

    PubMed Central

    da Silveira, Rava Azeredo; Berry, Michael J.

    2014-01-01

    Positive correlations in the activity of neurons are widely observed in the brain. Previous studies have shown these correlations to be detrimental to the fidelity of population codes, or at best marginally favorable compared to independent codes. Here, we show that positive correlations can enhance coding performance by astronomical factors. Specifically, the probability of discrimination error can be suppressed by many orders of magnitude. Likewise, the number of stimuli encoded—the capacity—can be enhanced more than tenfold. These effects do not necessitate unrealistic correlation values, and can occur for populations with a few tens of neurons. We further show that both effects benefit from heterogeneity commonly seen in population activity. Error suppression and capacity enhancement rest upon a pattern of correlation. Tuning of one or several effective parameters can yield a limit of perfect coding: the corresponding pattern of positive correlation leads to a ‘lock-in’ of response probabilities that eliminates variability in the subspace relevant for stimulus discrimination. We discuss the nature of this pattern and we suggest experimental tests to identify it. PMID:25412463

  14. Multiphysics Code Demonstrated for Propulsion Applications

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Melis, Matthew E.

    1998-01-01

    The utility of multidisciplinary analysis tools for aeropropulsion applications is being investigated at the NASA Lewis Research Center. The goal of this project is to apply Spectrum, a multiphysics code developed by Centric Engineering Systems, Inc., to simulate multidisciplinary effects in turbomachinery components. Many engineering problems today involve detailed computer analyses to predict the thermal, aerodynamic, and structural response of a mechanical system as it undergoes service loading. Analysis of aerospace structures generally requires attention in all three disciplinary areas to adequately predict component service behavior, and in many cases, the results from one discipline substantially affect the outcome of the other two. There are numerous computer codes currently available in the engineering community to perform such analyses in each of these disciplines. Many of these codes are developed and used in-house by a given organization, and many are commercially available. However, few, if any, of these codes are designed specifically for multidisciplinary analyses. The Spectrum code has been developed for performing fully coupled fluid, thermal, and structural analyses on a mechanical system with a single simulation that accounts for all simultaneous interactions, thus eliminating the requirement for running a large number of sequential, separate, disciplinary analyses. The Spectrum code has a true multiphysics analysis capability, which improves analysis efficiency as well as accuracy. Centric Engineering, Inc., working with a team of Lewis and AlliedSignal Engines engineers, has been evaluating Spectrum for a variety of propulsion applications including disk quenching, drum cavity flow, aeromechanical simulations, and a centrifugal compressor flow simulation.

  15. Allocentric coding: spatial range and combination rules.

    PubMed

    Camors, D; Jouffrais, C; Cottereau, B R; Durand, J B

    2015-04-01

When a visual target is presented with neighboring landmarks, its location can be determined both relative to the self (egocentric coding) and relative to these landmarks (allocentric coding). In the present study, we investigated (1) how allocentric coding depends on the distance between the targets and their surrounding landmarks (i.e. the spatial range) and (2) how allocentric and egocentric coding interact with each other across target-landmark distances (i.e. the combination rules). Subjects performed a memory-based pointing task toward previously gazed targets briefly superimposed (200 ms) on background images of cluttered city landscapes. A variable portion of the images was occluded in order to control the distance between the targets and the closest potential landmarks within those images. The pointing responses were performed after large saccades and the reappearance of the images at their initial location. However, in some trials, the images' elements were slightly shifted (±3°) in order to introduce a subliminal conflict between the allocentric and egocentric reference frames. The influence of allocentric coding on the pointing responses was found to decrease with increasing target-landmark distances, although it remained significant even at the largest distances (⩾10°). Interestingly, both the decreasing influence of allocentric coding and the concomitant increase in pointing-response variability were well captured by a Bayesian model in which the weighted combination of allocentric and egocentric cues is governed by a coupling prior. PMID:25749676
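The weighted-combination idea can be illustrated with the standard reliability-weighted (inverse-variance) cue combination rule. This is a simplified sketch of that general rule, not the paper's full model, which additionally uses a coupling prior to modulate the weights:

```python
import numpy as np

def combine_estimates(ego, sigma_ego, allo, sigma_allo):
    """Reliability-weighted combination of an egocentric and an
    allocentric position estimate: each cue is weighted by its
    inverse variance, and the combined variance shrinks accordingly."""
    w_ego = 1.0 / sigma_ego**2
    w_allo = 1.0 / sigma_allo**2
    estimate = (w_ego * ego + w_allo * allo) / (w_ego + w_allo)
    sigma = np.sqrt(1.0 / (w_ego + w_allo))
    return estimate, sigma

# Equal reliabilities -> the combined estimate is the midpoint.
# As sigma_allo grows (landmarks farther from the target), the
# estimate shifts toward the egocentric cue.
est, sig = combine_estimates(ego=0.0, sigma_ego=1.0, allo=3.0, sigma_allo=1.0)
```

Under this rule, increasing the allocentric noise both reduces the allocentric weight and raises the combined variability, qualitatively matching the two effects reported in the abstract.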

  16. Finite Element Analysis Code

    2006-03-08

MAPVAR-KD is designed to transfer solution results from one finite element mesh to another. MAPVAR-KD draws heavily from the structure and coding of MERLIN II, but it employs a new finite element data base, EXODUS II, and offers enhanced speed and new capabilities not available in MERLIN II. In keeping with the MERLIN II documentation, the computational algorithms used in MAPVAR-KD are described. User instructions are presented. Example problems are included to demonstrate the operation of the code and the effects of various input options. MAPVAR-KD is a modification of MAPVAR in which the search algorithm was replaced by a kd-tree-based search for better performance on large problems.

  17. The NIMROD Code

    NASA Astrophysics Data System (ADS)

    Schnack, D. D.; Glasser, A. H.

    1996-11-01

    NIMROD is a new code system that is being developed for the analysis of modern fusion experiments. It is being designed from the beginning to make the maximum use of massively parallel computer architectures and computer graphics. The NIMROD physics kernel solves the three-dimensional, time-dependent two-fluid equations with neo-classical effects in toroidal geometry of arbitrary poloidal cross section. The NIMROD system also includes a pre-processor, a grid generator, and a post processor. User interaction with NIMROD is facilitated by a modern graphical user interface (GUI). The NIMROD project is using Quality Function Deployment (QFD) team management techniques to minimize re-engineering and reduce code development time. This paper gives an overview of the NIMROD project. Operation of the GUI is demonstrated, and the first results from the physics kernel are given.

  18. Finite Element Analysis Code

    SciTech Connect

    Sjaardema, G.; Wellman, G.; Gartling, D.

    2006-03-08

    MAPVAR-KD is designed to transfer solution results from one finite element mesh to another. MAPVAR-KD draws heavily from the structure and coding of MERLIN II, but it employs a new finite element data base, EXODUS II, and offers enhanced speed and new capabilities not available in MERLIN II. In keeping with the MERLIN II documentation, the computational algorithms used in MAPVAR-KD are described. User instructions are presented. Example problems are included to demonstrate the operation of the code and the effects of various input options. MAPVAR-KD is a modification of MAPVAR in which the search algorithm was replaced by a kd-tree-based search for better performance on large problems.

  19. Confocal coded aperture imaging

    DOEpatents

    Tobin, Jr., Kenneth William; Thomas, Jr., Clarence E.

    2001-01-01

    A method for imaging a target volume comprises the steps of: radiating a small bandwidth of energy toward the target volume; focusing the small bandwidth of energy into a beam; moving the target volume through a plurality of positions within the focused beam; collecting a beam of energy scattered from the target volume with a non-diffractive confocal coded aperture; generating a shadow image of said aperture from every point source of radiation in the target volume; and, reconstructing the shadow image into a 3-dimensional image of the every point source by mathematically correlating the shadow image with a digital or analog version of the coded aperture. The method can comprise the step of collecting the beam of energy scattered from the target volume with a Fresnel zone plate.
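The final correlation step described above can be sketched with an FFT-based circular cross-correlation: each point source casts a shifted copy of the aperture pattern, and correlating the recorded shadow with the aperture recovers a peak at the source position. This is a minimal sketch of the correlation-decoding step only, not the full confocal method, and the aperture here is a hypothetical random binary mask rather than a Fresnel zone plate:

```python
import numpy as np

def correlate_decode(shadow, aperture):
    """Decode a coded-aperture shadow image by circular cross-correlation
    with the aperture pattern, computed via the FFT."""
    S = np.fft.fft2(shadow)
    A = np.fft.fft2(aperture)
    return np.real(np.fft.ifft2(S * np.conj(A)))

# A single point source produces a shifted copy of the aperture;
# correlation concentrates that copy back into a peak at the shift.
rng = np.random.default_rng(0)
aperture = (rng.random((16, 16)) > 0.5).astype(float)
shadow = np.roll(aperture, (3, 5), axis=(0, 1))  # source offset by (3, 5)
image = correlate_decode(shadow, aperture)
peak = np.unravel_index(np.argmax(image), image.shape)
# peak == (3, 5)
```

Random binary masks have a sharp autocorrelation peak over a flat background, which is what makes this decoding step well conditioned.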

  20. Sinusoidal transform coding

    NASA Technical Reports Server (NTRS)

    Mcaulay, Robert J.; Quatieri, Thomas F.

    1988-01-01

    It has been shown that an analysis/synthesis system based on a sinusoidal representation of speech leads to synthetic speech that is essentially perceptually indistinguishable from the original. Strategies for coding the amplitudes, frequencies and phases of the sine waves have been developed that have led to a multirate coder operating at rates from 2400 to 9600 bps. The encoded speech is highly intelligible at all rates with a uniformly improving quality as the data rate is increased. A real-time fixed-point implementation has been developed using two ADSP2100 DSP chips. The methods used for coding and quantizing the sine-wave parameters for operation at the various frame rates are described.
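The synthesis side of the sinusoidal representation is simply a sum of sine waves driven by the coded parameters. The sketch below illustrates that synthesis model for one frame; the function name, sampling rate, and parameter layout are illustrative assumptions, not the coder's actual interface:

```python
import numpy as np

def synthesize(amps, freqs, phases, n_samples, fs=8000):
    """Sinusoidal-model synthesis: the output frame is a sum of
    sinusoids with the given amplitudes, frequencies (Hz), and phases."""
    t = np.arange(n_samples) / fs
    return sum(a * np.cos(2 * np.pi * f * t + p)
               for a, f, p in zip(amps, freqs, phases))

# Two partials at 440 Hz and 880 Hz over a 20 ms frame at 8 kHz.
frame = synthesize([1.0, 0.5], [440.0, 880.0], [0.0, np.pi / 2],
                   n_samples=160)
```

In an actual coder the amplitudes, frequencies, and phases are quantized per frame and interpolated between frames, which is where the 2400-9600 bps rates come from.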

  1. Cotranslational signal-independent SRP preloading during membrane targeting.

    PubMed

    Chartron, Justin W; Hunt, Katherine C L; Frydman, Judith

    2016-08-11

    Ribosome-associated factors must properly decode the limited information available in nascent polypeptides to direct them to their correct cellular fate. It is unclear how the low complexity information exposed by the nascent chain suffices for accurate recognition by the many factors competing for the limited surface near the ribosomal exit site. Questions remain even for the well-studied cotranslational targeting cycle to the endoplasmic reticulum, involving recognition of linear hydrophobic signal sequences or transmembrane domains by the signal recognition particle (SRP). Notably, the SRP has low abundance relative to the large number of ribosome-nascent-chain complexes (RNCs), yet it accurately selects those destined for the endoplasmic reticulum. Despite their overlapping specificities, the SRP and the cotranslationally acting Hsp70 display precise mutually exclusive selectivity in vivo for their cognate RNCs. To understand cotranslational nascent chain recognition in vivo, here we investigate the cotranslational membrane-targeting cycle using ribosome profiling in yeast cells coupled with biochemical fractionation of ribosome populations. We show that the SRP preferentially binds secretory RNCs before their targeting signals are translated. Non-coding mRNA elements can promote this signal-independent pre-recruitment of SRP. Our study defines the complex kinetic interaction between elongation in the cytosol and determinants in the polypeptide and mRNA that modulate SRP–substrate selection and membrane targeting. PMID:27487213

  2. Finite Element Analysis Code

    SciTech Connect

    Forsythe, C.; Smith, M.; Sjaardema, G.

    2005-06-26

Exotxt is an analysis code that reads finite element results data stored in an ExodusII file and generates a file in a structured text format. The text file can be edited or modified via a number of text formatting tools. Exotxt is used by analysts to translate data from the binary ExodusII format into a structured text format which can then be edited or modified and then translated back to ExodusII format or to another format.

  3. Status of MARS Code

    SciTech Connect

    N.V. Mokhov

    2003-04-09

The status and recent developments of the MARS 14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator and detector components in the energy range from a fraction of an electronvolt up to 100 TeV are described. These include physics models in both the strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, histogramming, the MAD-MARS Beam Line Builder, and the graphical user interface.

  4. REEDS computer code

    NASA Technical Reports Server (NTRS)

    Bjork, C.

    1981-01-01

    The REEDS (rocket exhaust effluent diffusion single layer) computer code is used for the estimation of certain rocket exhaust effluent concentrations and dosages and their distributions near the Earth's surface following a rocket launch event. Output from REEDS is used in producing near real time air quality and environmental assessments of the effects of certain potentially harmful effluents, namely HCl, Al2O3, CO, and NO.

  5. An Eye-Tracking Study of How Color Coding Affects Multimedia Learning

    ERIC Educational Resources Information Center

    Ozcelik, Erol; Karakus, Turkan; Kursun, Engin; Cagiltay, Kursat

    2009-01-01

    Color coding has been proposed to promote more effective learning. However, insufficient evidence currently exists to show how color coding leads to better learning. The goal of this study was to investigate the underlying cause of the color coding effect by utilizing eye movement data. Fifty-two participants studied either a color-coded or…

  6. Bar coded retroreflective target

    SciTech Connect

    Vann, C.S.

    2000-01-25

    This small, inexpensive, non-contact laser sensor can detect the location of a retroreflective target in a relatively large volume and up to six degrees of position. The tracker's laser beam is formed into a plane of light which is swept across the space of interest. When the beam illuminates the retroreflector, some of the light returns to the tracker. The intensity, angle, and time of the return beam is measured to calculate the three dimensional location of the target. With three retroreflectors on the target, the locations of three points on the target are measured, enabling the calculation of all six degrees of target position. Until now, devices for three-dimensional tracking of objects in a large volume have been heavy, large, and very expensive. Because of the simplicity and unique characteristics of this tracker, it is capable of three-dimensional tracking of one to several objects in a large volume, yet it is compact, light-weight, and relatively inexpensive. Alternatively, a tracker produces a diverging laser beam which is directed towards a fixed position, and senses when a retroreflective target enters the fixed field of view. An optically bar coded target can be read by the tracker to provide information about the target. The target can be formed of a ball lens with a bar code on one end. As the target moves through the field, the ball lens causes the laser beam to scan across the bar code.

  7. MELCOR computer code manuals

    SciTech Connect

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  8. Bar coded retroreflective target

    DOEpatents

    Vann, Charles S.

    2000-01-01

    This small, inexpensive, non-contact laser sensor can detect the location of a retroreflective target in a relatively large volume and up to six degrees of position. The tracker's laser beam is formed into a plane of light which is swept across the space of interest. When the beam illuminates the retroreflector, some of the light returns to the tracker. The intensity, angle, and time of the return beam is measured to calculate the three dimensional location of the target. With three retroreflectors on the target, the locations of three points on the target are measured, enabling the calculation of all six degrees of target position. Until now, devices for three-dimensional tracking of objects in a large volume have been heavy, large, and very expensive. Because of the simplicity and unique characteristics of this tracker, it is capable of three-dimensional tracking of one to several objects in a large volume, yet it is compact, light-weight, and relatively inexpensive. Alternatively, a tracker produces a diverging laser beam which is directed towards a fixed position, and senses when a retroreflective target enters the fixed field of view. An optically bar coded target can be read by the tracker to provide information about the target. The target can be formed of a ball lens with a bar code on one end. As the target moves through the field, the ball lens causes the laser beam to scan across the bar code.

  9. Orthopedics coding and funding.

    PubMed

    Baron, S; Duclos, C; Thoreux, P

    2014-02-01

    The French tarification à l'activité (T2A) prospective payment system is a financial system in which a health-care institution's resources are based on performed activity. Activity is described via the PMSI medical information system (programme de médicalisation du système d'information). The PMSI classifies hospital cases by clinical and economic categories known as diagnosis-related groups (DRG), each with an associated price tag. Coding a hospital case involves giving as realistic a description as possible so as to categorize it in the right DRG and thus ensure appropriate payment. For this, it is essential to understand what determines the pricing of inpatient stay: namely, the code for the surgical procedure, the patient's principal diagnosis (reason for admission), codes for comorbidities (everything that adds to management burden), and the management of the length of inpatient stay. The PMSI is used to analyze the institution's activity and dynamism: change on previous year, relation to target, and comparison with competing institutions based on indicators such as the mean length of stay performance indicator (MLS PI). The T2A system improves overall care efficiency. Quality of care, however, is not presently taken account of in the payment made to the institution, as there are no indicators for this; work needs to be done on this topic.

  10. INVESTIGATION OF FISCALLY INDEPENDENT AND DEPENDENT CITY SCHOOL DISTRICTS.

    ERIC Educational Resources Information Center

    GITTELL, MARILYN; AND OTHERS

    A TWO-PART COMPARATIVE ANALYSIS IS MADE OF LARGE AND SMALL CITY SCHOOL SYSTEMS. PART I ANALYZES A WIDE RANGE OF FISCAL AND NON-FISCAL VARIABLES ASSOCIATED WITH FISCAL STATUS OF CITY SCHOOL SYSTEMS. IT COVERS THE 2,788 CITY SCHOOL DISTRICTS IN THE UNITED STATES WITH ENROLLMENTS OVER 3,000. COMPLEX INTERRELATIONSHIPS SURROUNDING FISCAL STATUS IN…

  11. Structural coding versus free-energy predictive coding.

    PubMed

    van der Helm, Peter A

    2016-06-01

Focusing on visual perceptual organization, this article contrasts the free-energy (FE) version of predictive coding (a recent Bayesian approach) to structural coding (a long-standing representational approach). Both use free-energy minimization as a metaphor for processing in the brain, but their formal elaborations of this metaphor are fundamentally different. FE predictive coding formalizes it by minimization of prediction errors, whereas structural coding formalizes it by minimization of the descriptive complexity of predictions. Here, both sides are evaluated. A conclusion regarding competence is that FE predictive coding uses a powerful modeling technique, but that structural coding has more explanatory power. A conclusion regarding performance is that FE predictive coding, though more detailed in its account of neurophysiological data, provides a less compelling cognitive architecture than that of structural coding, which, for instance, supplies formal support for the computationally powerful role it attributes to neuronal synchronization.

  12. Computer-Based Coding of Occupation Codes for Epidemiological Analyses.

    PubMed

    Russ, Daniel E; Ho, Kwan-Yuet; Johnson, Calvin A; Friesen, Melissa C

    2014-05-01

    Mapping job titles to standardized occupation classification (SOC) codes is an important step in evaluating changes in health risks over time as measured in inspection databases. However, manual SOC coding is cost-prohibitive for very large studies. Computer-based SOC coding systems can improve the efficiency of incorporating occupational risk factors into large-scale epidemiological studies. We present a novel method of mapping verbatim job titles to SOC codes using a large table of prior knowledge available in the public domain that includes detailed descriptions of the tasks and activities, and their synonyms, relevant to each SOC code. Job titles are compared to our knowledge base to find the closest-matching SOC code. A soft Jaccard index is used to measure the similarity between a previously unseen job title and the knowledge base. Additional information, such as standardized industrial codes, can be incorporated to improve the SOC code determination by providing additional context to break ties in matches. PMID:25221787
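    The title-to-code matching step described above can be sketched with a plain token-level Jaccard index. The SOC codes and the tiny knowledge base below are hypothetical stand-ins for the paper's much larger public-domain table, and the "soft" (fuzzy token) variant is replaced by exact token matching for brevity:

    ```python
    def jaccard(a, b):
        """Jaccard index between two token sets: |A ∩ B| / |A ∪ B|."""
        a, b = set(a), set(b)
        return len(a & b) / len(a | b) if a | b else 0.0

    # Hypothetical knowledge base: SOC code -> descriptive task/title tokens.
    KNOWLEDGE_BASE = {
        "47-2061": "construction laborer site helper".split(),
        "29-1141": "registered nurse patient care clinical".split(),
        "15-1252": "software developer programmer application code".split(),
    }

    def best_soc_match(job_title):
        """Return the SOC code whose description is most similar to the title."""
        tokens = job_title.lower().split()
        return max(KNOWLEDGE_BASE,
                   key=lambda soc: jaccard(tokens, KNOWLEDGE_BASE[soc]))

    print(best_soc_match("software application developer"))  # → 15-1252
    ```

    In the paper's soft variant, near-miss tokens (e.g., misspellings in verbatim free-text titles) also contribute partial credit to the intersection, which is what makes the matcher robust to uncurated input.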

  13. Preliminary Assessment of Turbomachinery Codes

    NASA Technical Reports Server (NTRS)

    Mazumder, Quamrul H.

    2007-01-01

    This report assesses different CFD codes developed and currently being used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior. The codes considered are APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code is described separately in the following section, with its current modeling capabilities, level of validation, pre/post-processing, and future development and validation requirements. This report addresses only previously published work and validations of the codes; the codes have since been further developed to extend their capabilities.

  14. Independent Learning Models: A Comparison.

    ERIC Educational Resources Information Center

    Wickett, R. E. Y.

    Five models of independent learning are suitable for use in adult education programs. The common factor is a facilitator who works in some way with the student in the learning process. They display different characteristics, including the extent of independence in relation to content and/or process. Nondirective tutorial instruction and learning…

  15. Parental Beliefs about Emotions Are Associated with Early Adolescents' Independent and Interdependent Self-Construals

    ERIC Educational Resources Information Center

    Her, Pa; Dunsmore, Julie C.

    2011-01-01

    We assessed linkages between parents' beliefs and their children's self-construals with 60 7th and 8th graders. Early adolescents completed an open-ended, Self-Guide Questionnaire and an independent and interdependent reaction-time measure. The self-guide responses were coded for independent and interdependent traits. Parents reported beliefs…

  16. Neural Coding for Effective Rehabilitation

    PubMed Central

    2014-01-01

    Successful neurological rehabilitation depends on accurate diagnosis, effective treatment, and quantitative evaluation. Neural coding, a technology for interpretation of functional and structural information of the nervous system, has contributed to the advancements in neuroimaging, brain-machine interface (BMI), and design of training devices for rehabilitation purposes. In this review, we summarize the latest breakthroughs in neuroimaging from microscale to macroscale levels with potential diagnostic applications for rehabilitation. We also review the achievements in electrocorticography (ECoG) coding with both animal models and human beings for BMI design, electromyography (EMG) interpretation for interaction with external robotic systems, and robot-assisted quantitative evaluation of the progress of rehabilitation programs. Future rehabilitation will be more home-based, automatic, and self-administered by patients. Further investigations and breakthroughs are mainly needed in improving the computational efficiency in neuroimaging and multichannel ECoG by selection of localized neuroinformatics, validating the effectiveness of BMI-guided rehabilitation programs, and simplifying the system operation in training devices. PMID:25258708

  17. New quantum MDS-convolutional codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Li, Fengwei; Yue, Qin

    2015-12-01

    In this paper, we utilize a family of Hermitian dual-containing constacyclic codes to construct classical and quantum MDS convolutional codes. Our classical and quantum convolutional codes are optimal in the sense that they attain the classical (quantum) generalized Singleton bound.

  18. COLD-SAT Dynamic Model Computer Code

    NASA Technical Reports Server (NTRS)

    Bollenbacher, G.; Adams, N. S.

    1995-01-01

    COLD-SAT Dynamic Model (CSDM) computer code implements six-degree-of-freedom, rigid-body mathematical model for simulation of spacecraft in orbit around Earth. Investigates flow dynamics and thermodynamics of subcritical cryogenic fluids in microgravity. Consists of three parts: translation model, rotation model, and slosh model. Written in FORTRAN 77.

  19. Bilingual Dual Coding in Japanese Returnee Students.

    ERIC Educational Resources Information Center

    Taura, Hideyuki

    1998-01-01

    Investigates effects of second-language acquisition age, length of exposure to the second language, and Japanese language specificity on the bilingual dual coding hypothesis proposed by Paivio and Desrochers (1980). Balanced Japanese-English bilingual returnee (having resided in an English-speaking country) subjects were presented with pictures to…

  20. A Comparative Study on Seismic Analysis of Bangladesh National Building Code (BNBC) with Other Building Codes

    NASA Astrophysics Data System (ADS)

    Bari, Md. S.; Das, T.

    2013-09-01

    The tectonic framework of Bangladesh and adjoining areas indicates that Bangladesh lies well within an active seismic zone. The after-effects of an earthquake are more severe in an underdeveloped, densely populated country like ours than in developed countries. The Bangladesh National Building Code (BNBC) was first established in 1993 to provide guidelines for the design and construction of new structures subject to earthquake ground motions, in order to minimize the risk to life for all structures. A revision of BNBC 1993 is under way to bring it up to date with other international building codes. This paper aims at comparing the seismic analysis provisions given in the building codes of different countries. This comparison gives an idea of where our country stands when it comes to safety against earthquakes. Primarily, various seismic parameters in BNBC 2010 (draft) have been studied and compared with those of BNBC 1993. Later, both the 1993 and 2010 editions of BNBC have been compared graphically with building codes of other countries, such as the National Building Code of India 2005 (NBC-India 2005) and the American Society of Civil Engineers 7-05 (ASCE 7-05). The base shear/weight ratios have been plotted against the height of the building. The investigation in this paper reveals that BNBC 1993 has the least base shear among all the codes. Factored base shear values of BNBC 2010 are found to have increased significantly over those of BNBC 1993 for low-rise buildings (≤20 m) around the country. Despite the revision, BNBC 2010 (draft) still suggests lower base shear values than the Indian and American codes. The increase in the factor of safety against earthquakes imposed by the proposed BNBC 2010 code, through its higher base shear values, is therefore appreciable.

  1. Rare coding mutations identified by sequencing of Alzheimer disease genome‐wide association studies loci

    PubMed Central

    Vardarajan, Badri N.; Ghani, Mahdi; Kahn, Amanda; Sheikh, Stephanie; Sato, Christine; Barral, Sandra; Lee, Joseph H.; Cheng, Rong; Reitz, Christiane; Lantigua, Rafael; Reyes‐Dumeyer, Dolly; Medrano, Martin; Jimenez‐Velazquez, Ivonne Z.; Rogaeva, Ekaterina; St George‐Hyslop, Peter

    2015-01-01

    Objective To detect rare coding variants underlying loci detected by genome‐wide association studies (GWAS) of late onset Alzheimer disease (LOAD). Methods We conducted targeted sequencing of ABCA7, BIN1, CD2AP, CLU, CR1, EPHA1, MS4A4A/MS4A6A, and PICALM in 3 independent LOAD cohorts: 176 patients from 124 Caribbean Hispanic families, 120 patients and 33 unaffected individuals from the 129 National Institute on Aging LOAD Family Study; and 263 unrelated Canadian individuals of European ancestry (210 sporadic patients and 53 controls). Rare coding variants found in at least 2 data sets were genotyped in independent groups of ancestry‐matched controls. Additionally, the Exome Aggregation Consortium was used as a reference data set for population‐based allele frequencies. Results Overall we detected a statistically significant 3.1‐fold enrichment of the nonsynonymous mutations in the Caucasian LOAD cases compared with controls (p = 0.002) and no difference in synonymous variants. A stop‐gain mutation in ABCA7 (E1679X) and missense mutation in CD2AP (K633R) were highly significant in Caucasian LOAD cases, and mutations in EPHA1 (P460L) and BIN1 (K358R) were significant in Caribbean Hispanic families with LOAD. The EPHA1 variant segregated completely in an extended Caribbean Hispanic family and was also nominally significant in the Caucasians. Additionally, BIN1 (K358R) segregated in 2 of the 6 Caribbean Hispanic families where the mutations were discovered. Interpretation Targeted sequencing of confirmed GWAS loci revealed an excess burden of deleterious coding mutations in LOAD, with the greatest burden observed in ABCA7 and BIN1. Identifying coding variants in LOAD will facilitate the creation of tractable models for investigation of disease‐related mechanisms and potential therapies. Ann Neurol 2015;78:487–498 PMID:26101835

  2. Space-independent xenon oscillations revisited

    SciTech Connect

    Rizwan-uddin

    1989-01-01

    Recently, various branches of engineering and science have seen a rapid increase in the number of dynamical analyses undertaken. This modern phenomenon often obscures the fact that such analyses were sometimes carried out even before the current trend began. Moreover, these earlier analyses, which even now seem very ingenious, were carried out at a time when the available information about dynamical systems was not as well disseminated as it is today. One such analysis, carried out in the early 1960s, showed the existence of stable limit cycles in a simple model for space-independent xenon dynamics in nuclear reactors. The authors, apparently unaware of the now well-known bifurcation theorem by Hopf, could not numerically discover unstable limit cycles, though they did find regions in parameter space where the fixed points are stable for small perturbations but unstable for very large perturbations. The analysis was carried out both analytically and numerically. As a tribute to these early nonlinear dynamicists in the field of nuclear engineering, in this paper, the Hopf theorem and its conclusions are briefly described, and then the solution of the space-independent xenon oscillation problem, obtained using the bifurcation analysis code BIFDD, is presented. These solutions are presented along with a discussion of the earlier results.

  3. On decoding of multi-level MPSK modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Gupta, Alok Kumar

    1990-01-01

    The decoding problem of multi-level block modulation codes is investigated. The hardware design of a soft-decision Viterbi decoder for some short-length 8-PSK block modulation codes is presented. An effective way to reduce the hardware complexity of the decoder by reducing the branch metric and path metric, using a non-uniform floating-point to integer mapping scheme, is proposed and discussed. The simulation results of the design are presented. The multi-stage decoding (MSD) of multi-level modulation codes is also investigated. The cases of soft-decision and hard-decision MSD are considered and their performance is evaluated for several codes of different lengths and different minimum squared Euclidean distances. It is shown that the soft-decision MSD reduces the decoding complexity drastically and is suboptimum. The hard-decision MSD further simplifies the decoding while still maintaining a reasonable coding gain over the uncoded system, if the component codes are chosen properly. Finally, some basic 3-level 8-PSK modulation codes using BCH codes as component codes are constructed and their coding gains are found for hard-decision multistage decoding.

  4. Coarse coding and discourse comprehension in adults with right hemisphere brain damage

    PubMed Central

    Tompkins, Connie A.; Scharp, Victoria L.; Meigh, Kimberly M.; Fassbinder, Wiltrud

    2009-01-01

    Background Various investigators suggest that some discourse-level comprehension difficulties in adults with right hemisphere brain damage (RHD) have a lexical-semantic basis. As words are processed, the intact right hemisphere arouses and sustains activation of a wide-ranging network of secondary or peripheral meanings and features—a phenomenon dubbed “coarse coding”. Coarse coding impairment has been postulated to underpin some prototypical RHD comprehension deficits, such as difficulties with nonliteral language interpretation, discourse integration, some kinds of inference generation, and recovery when a reinterpretation is needed. To date, however, no studies have addressed the hypothesised link between coarse coding deficit and discourse comprehension in RHD. Aims The current investigation examined whether coarse coding was related to performance on two measures of narrative comprehension in adults with RHD. Methods & Procedures Participants were 32 adults with unilateral RHD from cerebrovascular accident, and 38 adults without brain damage. Coarse coding was operationalised as poor activation of peripheral/weakly related semantic features of words. For the coarse coding assessment, participants listened to spoken sentences that ended in a concrete noun. Each sentence was followed by a spoken target phoneme string. Targets were subordinate semantic features of the sentence-final nouns that were incompatible with their dominant mental representations (e.g., “rotten” for apple). Targets were presented at two post-noun intervals. A lexical decision task was used to gauge both early activation and maintenance of activation of these weakly related semantic features. One of the narrative tasks assessed comprehension of implied main ideas and details, while the other indexed high-level inferencing and integration. Both comprehension tasks were presented auditorily. For all tasks, accuracy of performance was the dependent measure. Correlations were computed

  5. An experimental investigation of clocking effects on turbine aerodynamics using a modern 3-D one and one-half stage high pressure turbine for code verification and flow model development

    NASA Astrophysics Data System (ADS)

    Haldeman, Charles Waldo, IV

    2003-10-01

    This research uses a modern 1 and 1/2 stage high-pressure (HP) turbine operating at the proper design corrected speed, pressure ratio, and gas to metal temperature ratio to generate a detailed data set containing aerodynamic, heat-transfer and aero-performance information. The data was generated using the Ohio State University Gas Turbine Laboratory Turbine Test Facility (TTF), which is a short-duration shock tunnel facility. The research program utilizes an uncooled turbine stage for which all three airfoils are heavily instrumented at multiple spans and on the HPV and LPV endwalls and HPB platform and tips. Heat-flux and pressure data are obtained using the traditional shock-tube and blowdown facility operational modes. Detailed examination shows that the aerodynamic (pressure) data obtained in the blowdown mode are the same as those obtained in the shock-tube mode when the corrected conditions are matched. Experiments were performed for various conditions and configurations, including LPV clocking positions, off-design corrected speed conditions, pressure ratio changes, and Reynolds number changes. The main research for this dissertation is concentrated on the LPV clocking experiments, where the LPV was clocked relative to the HPV at several different passage locations and at different Reynolds numbers. Various methods were used to evaluate the effect of clocking on both the aeroperformance (efficiency) and aerodynamics (pressure loading) on the LPV, including time-resolved measurements, time-averaged measurements and stage performance measurements. A general improvement in overall efficiency of approximately 2% is demonstrated and could be observed using a variety of independent methods. Maximum efficiency is obtained when the time-average pressures are highest on the LPV, and the time-resolved data both in the time domain and frequency domain show the least amount of variation. The gain in aeroperformance is obtained by integrating over the entire airfoil as the three

  6. Qudit color codes and gauge color codes in all spatial dimensions

    NASA Astrophysics Data System (ADS)

    Watson, Fern H. E.; Campbell, Earl T.; Anwar, Hussain; Browne, Dan E.

    2015-08-01

    Two-level quantum systems, qubits, are not the only basis for quantum computation. Advantages exist in using qudits, d -level quantum systems, as the basic carrier of quantum information. We show that color codes, a class of topological quantum codes with remarkable transversality properties, can be generalized to the qudit paradigm. In recent developments it was found that in three spatial dimensions a qubit color code can support a transversal non-Clifford gate and that in higher spatial dimensions additional non-Clifford gates can be found, saturating Bravyi and König's bound [S. Bravyi and R. König, Phys. Rev. Lett. 111, 170502 (2013), 10.1103/PhysRevLett.111.170502]. Furthermore, by using gauge fixing techniques, an effective set of Clifford gates can be achieved, removing the need for state distillation. We show that the qudit color code can support the qudit analogs of these gates and also show that in higher spatial dimensions a color code can support a phase gate from higher levels of the Clifford hierarchy that can be proven to saturate Bravyi and König's bound in all but a finite number of special cases. The methodology used is a generalization of Bravyi and Haah's method of triorthogonal matrices [S. Bravyi and J. Haah, Phys. Rev. A 86, 052329 (2012), 10.1103/PhysRevA.86.052329], which may be of independent interest. For completeness, we show explicitly that the qudit color codes generalize to gauge color codes and share many of the favorable properties of their qubit counterparts.

  7. Biographical factors of occupational independence.

    PubMed

    Müller, G F

    2001-10-01

    The present study examined biographical factors of occupational independence including any kind of nonemployed profession. Participants were 59 occupationally independent and 58 employed persons of different age (M = 36.3 yr.), sex, and profession. They were interviewed on variables like family influence, educational background, occupational role models, and critical events for choosing a particular type of occupational career. The obtained results show that occupationally independent people reported stronger family ties, experienced fewer restrictions of formal education, and remembered fewer negative role models than the employed people. Implications of these results are discussed. PMID:11783553

  8. Approximately Independent Features of Languages

    NASA Astrophysics Data System (ADS)

    Holman, Eric W.

    To facilitate the testing of models for the evolution of languages, the present paper offers a set of linguistic features that are approximately independent of each other. To find these features, the adjusted Rand index (R‧) is used to estimate the degree of pairwise relationship among 130 linguistic features in a large published database. Many of the R‧ values prove to be near zero, as predicted for independent features, and a subset of 47 features is found with an average R‧ of -0.0001. These 47 features are recommended for use in statistical tests that require independent units of analysis.
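    The adjusted Rand index used above to score pairwise feature relatedness can be computed directly from two categorical labelings of the same set of languages. A minimal stdlib sketch (the actual 130-feature database is not reproduced here):

    ```python
    from collections import Counter
    from math import comb

    def adjusted_rand_index(x, y):
        """Adjusted (chance-corrected) Rand index between two labelings."""
        n = len(x)
        sum_ij = sum(comb(c, 2) for c in Counter(zip(x, y)).values())
        sum_a = sum(comb(c, 2) for c in Counter(x).values())
        sum_b = sum(comb(c, 2) for c in Counter(y).values())
        expected = sum_a * sum_b / comb(n, 2)
        max_index = (sum_a + sum_b) / 2
        if max_index == expected:  # both labelings trivial; define as perfect agreement
            return 1.0
        return (sum_ij - expected) / (max_index - expected)

    # Identical labelings score 1; unrelated ones score near 0 (slightly
    # negative values are possible, since the index is chance-corrected).
    print(adjusted_rand_index([0, 0, 1, 1], [0, 0, 1, 1]))  # → 1.0
    ```

    Near-zero values for a feature pair are exactly the signature of independence that the paper's 47-feature subset is selected for.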

  9. Sequence independent amplification of DNA

    DOEpatents

    Bohlander, Stefan K.

    1998-01-01

    The present invention is a rapid sequence-independent amplification procedure (SIA). Even minute amounts of DNA from various sources can be amplified independent of any sequence requirements of the DNA or any a priori knowledge of any sequence characteristics of the DNA to be amplified. This method allows, for example the sequence independent amplification of microdissected chromosomal material and the reliable construction of high quality fluorescent in situ hybridization (FISH) probes from YACs or from other sources. These probes can be used to localize YACs on metaphase chromosomes but also--with high efficiency--in interphase nuclei.

  10. Sequence independent amplification of DNA

    DOEpatents

    Bohlander, S.K.

    1998-03-24

    The present invention is a rapid sequence-independent amplification procedure (SIA). Even minute amounts of DNA from various sources can be amplified independent of any sequence requirements of the DNA or any a priori knowledge of any sequence characteristics of the DNA to be amplified. This method allows, for example, the sequence independent amplification of microdissected chromosomal material and the reliable construction of high quality fluorescent in situ hybridization (FISH) probes from YACs or from other sources. These probes can be used to localize YACs on metaphase chromosomes but also--with high efficiency--in interphase nuclei. 25 figs.

  11. The APS SASE FEL : modeling and code comparison.

    SciTech Connect

    Biedron, S. G.

    1999-04-20

    A self-amplified spontaneous emission (SASE) free-electron laser (FEL) is under construction at the Advanced Photon Source (APS). Five FEL simulation codes were used in the design phase: GENESIS, GINGER, MEDUSA, RON, and TDA3D. Initial comparisons between each of these independent formulations show good agreement for the parameters of the APS SASE FEL.

  12. Authorship Attribution of Source Code

    ERIC Educational Resources Information Center

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…

  13. Energy Codes and Standards: Facilities

    SciTech Connect

    Bartlett, Rosemarie; Halverson, Mark A.; Shankle, Diana L.

    2007-01-01

    Energy codes and standards play a vital role in the marketplace by setting minimum requirements for energy-efficient design and construction. They outline uniform requirements for new buildings as well as additions and renovations. This article covers basic knowledge of codes and standards; development processes of each; adoption, implementation, and enforcement of energy codes and standards; and voluntary energy efficiency programs.

  14. Coding Issues in Grounded Theory

    ERIC Educational Resources Information Center

    Moghaddam, Alireza

    2006-01-01

    This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory generates from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…

  15. Reliability of ICD-10 external cause of death codes in the National Coroners Information System.

    PubMed

    Bugeja, Lyndal; Clapperton, Angela J; Killian, Jessica J; Stephan, Karen L; Ozanne-Smith, Joan

    2010-01-01

    Availability of ICD-10 cause of death codes in the National Coroners Information System (NCIS) strengthens its value as a public health surveillance tool. This study quantified the completeness of external cause ICD-10 codes in the NCIS for Victorian deaths (as assigned by the Australian Bureau of Statistics (ABS) in the yearly Cause of Death data). It also examined the concordance between external cause ICD-10 codes contained in the NCIS and a re-code of the same deaths conducted by an independent coder. Of 7,400 NCIS external cause deaths included in this study, 961 (13.0%) did not contain an ABS assigned ICD-10 code and 225 (3.0%) contained only a natural cause code. Where an ABS assigned external cause ICD-10 code was present (n=6,214), 4,397 (70.8%) matched exactly with the independently assigned ICD-10 code. Coding disparity primarily related to differences in assignment of intent and specificity. However, in a small number of deaths (n=49, 0.8%) there was coding disparity for both intent and external cause category. NCIS users should be aware of the limitations of relying only on ICD-10 codes contained within the NCIS for deaths prior to 2007 and consider using these in combination with the other NCIS data fields and code sets to ensure optimum case identification. PMID:21041843
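    The percentages reported above follow directly from the stated counts; a quick arithmetic check (all numbers taken from the abstract):

    ```python
    total_external = 7400      # NCIS external cause deaths in the study
    missing_abs_code = 961     # no ABS-assigned ICD-10 code
    natural_only = 225         # only a natural-cause code assigned
    with_external_code = 6214  # deaths with an ABS external-cause ICD-10 code
    exact_match = 4397         # matched the independent re-code exactly

    print(round(100 * missing_abs_code / total_external, 1))   # → 13.0
    print(round(100 * natural_only / total_external, 1))       # → 3.0
    print(round(100 * exact_match / with_external_code, 1))    # → 70.8
    ```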

  16. Finite Element Analysis Code

    2005-06-26

    Exotxt is an analysis code that reads finite element results data stored in an exodusII file and generates a file in a structured text format. The text file can be edited or modified via a number of text formatting tools. Exotxt is used by analysts to translate data from the binary exodusII format into a structured text format which can then be edited or modified and then translated back to exodusII format or to another format.

  17. Finite Element Analysis Code

    SciTech Connect

    Sjaardema, G.; Forsythe, C.

    2005-05-07

    CONEX is a code for joining sequentially in time multiple exodusll database files which all represent the same base mesh topology and geometry. It is used to create a single results or restart file from multiple results or restart files which typically arise as the result of multiple restarted analyses. CONEX is used to postprocess the results from a series of finite element analyses. It can join sequentially the data from multiple results databases into a single database which makes it easier to postprocess the results data.

  18. Finite Element Analysis Code

    2005-05-07

    CONEX is a code for joining sequentially in time multiple exodusll database files which all represent the same base mesh topology and geometry. It is used to create a single results or restart file from multiple results or restart files which typically arise as the result of multiple restarted analyses. CONEX is used to postprocess the results from a series of finite element analyses. It can join sequentially the data from multiple results databases into a single database which makes it easier to postprocess the results data.

  19. New quantum codes constructed from quaternary BCH codes

    NASA Astrophysics Data System (ADS)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-10-01

    In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distances of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes are determined to be much larger than the results given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each code length. Thus, families of new QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Second, we apply a combinatorial construction to the imprimitive BCH codes with their corresponding primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  20. Low Density Parity Check Codes: Bandwidth Efficient Channel Coding

    NASA Technical Reports Server (NTRS)

    Fong, Wai; Lin, Shu; Maki, Gary; Yeh, Pen-Shu

    2003-01-01

    Low Density Parity Check (LDPC) codes provide near-Shannon-capacity performance for NASA missions. These codes have high coding rates, R = 0.82 and 0.875, with moderate code lengths, n = 4096 and 8176. Their decoders have inherently parallel structures, which allows for high-speed implementation. Two codes based on Euclidean Geometry (EG) were selected for flight ASIC implementation. These codes are cyclic and quasi-cyclic in nature and therefore have a simple encoder structure, which yields power and size benefits. These codes also have a large minimum distance, as much as d_min = 65, giving them powerful error-correcting capabilities and low error floors. This paper presents the development of the LDPC flight encoder and decoder, their applications, and their status.
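    The parity-check principle behind LDPC decoding can be illustrated with a toy sparse parity-check matrix; this is an illustrative example only, not the Euclidean-geometry codes described above. A received word c is a valid codeword when H·c = 0 over GF(2):

    ```python
    # Toy sparse parity-check matrix H (rows = checks, columns = code bits).
    # Each row has few ones, the defining "low density" property of LDPC codes.
    H = [
        [1, 1, 0, 1, 0, 0],
        [0, 1, 1, 0, 1, 0],
        [1, 0, 1, 0, 0, 1],
    ]

    def syndrome(codeword):
        """Compute H·c over GF(2); an all-zero syndrome means every check passes."""
        return [sum(h * c for h, c in zip(row, codeword)) % 2 for row in H]

    valid = [1, 1, 0, 0, 1, 1]
    print(syndrome(valid))               # → [0, 0, 0]
    print(syndrome([1, 1, 1, 0, 1, 1]))  # one flipped bit trips two checks
    ```

    Iterative LDPC decoders exploit this structure by passing messages between bit and check nodes until all checks are satisfied, and the sparsity of H is what makes each iteration cheap and parallelizable.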

  1. New quantum codes constructed from quaternary BCH codes

    NASA Astrophysics Data System (ADS)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-07-01

    In this paper, we firstly study construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes are determined to be much larger than the result given according to Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each different code length. Thus, families of new QECCs are newly obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes with their corresponding primitive counterpart and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  2. Independent Schools: Landscape and Learnings.

    ERIC Educational Resources Information Center

    Oates, William A.

    1981-01-01

    Examines American independent schools (parochial, southern segregated, and private institutions) in terms of their funding, expenditures, changing enrollment patterns, teacher-student ratios, and societal functions. Journal available from Daedalus Subscription Department, 1172 Commonwealth Ave., Boston, MA 02132. (AM)

  3. Technology for Independent Living: Sourcebook.

    ERIC Educational Resources Information Center

    Enders, Alexandra, Ed.

    This sourcebook provides information for the practical implementation of independent living technology in the everyday rehabilitation process. "Information Services and Resources" lists databases, clearinghouses, networks, research and development programs, toll-free telephone numbers, consumer protection caveats, selected publications, and…

  4. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

    Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  5. The impact of time step definition on code convergence and robustness

    NASA Technical Reports Server (NTRS)

    Venkateswaran, S.; Weiss, J. M.; Merkle, C. L.

    1992-01-01

    We have implemented preconditioning for multi-species reacting flows in two independent codes, an implicit (ADI) code developed in-house and the RPLUS code (developed at LeRC). The RPLUS code was modified to work on a four-stage Runge-Kutta scheme. The performance of both codes was tested, and it was shown that preconditioning can improve convergence by a factor of two to a hundred depending on the problem. Our efforts are currently focused on evaluating the effect of chemical sources and on assessing how preconditioning may be applied to improve convergence and robustness in the calculation of reacting flows.

  6. Independence test for sparse data

    NASA Astrophysics Data System (ADS)

    García, J. E.; González-López, V. A.

    2016-06-01

    In this paper a new non-parametric independence test is presented. García and González-López (2014) [1] introduced the LIS test for the hypothesis of independence between two continuous random variables; the test proposed in this work is a generalization of the LIS test. The new test does not require the assumption of continuity for the random variables. The test is applied to two datasets and also compared with Pearson's Chi-squared test.

  7. Genetic code for sine

    NASA Astrophysics Data System (ADS)

    Abdullah, Alyasa Gan; Wah, Yap Bee

    2015-02-01

    The computation of the approximate values of the trigonometric sines was discovered by Bhaskara I (c. 600-c. 680), a seventh-century Indian mathematician, and is known as Bhaskara I's sine approximation formula. The formula is given in his treatise titled Mahabhaskariya. In the 14th century, Madhava of Sangamagrama, a Kerala mathematician-astronomer, constructed the table of trigonometric sines of various angles. Madhava's table gives the measure of angles in arcminutes, arcseconds and sixtieths of an arcsecond. The search for more accurate formulas led to the discovery of the power series expansion by Madhava of Sangamagrama (c.1350-c. 1425), the founder of the Kerala school of astronomy and mathematics. In 1715, the Taylor series was introduced by Brook Taylor, an English mathematician. If the Taylor series is centered at zero, it is called a Maclaurin series, named after the Scottish mathematician Colin Maclaurin. Some of the important Maclaurin series expansions include trigonometric functions. This paper introduces the genetic code of the sine of an angle without using power series expansion. The genetic code using square root approach reveals the pattern in the signs (plus, minus) and sequence of numbers in the sine of an angle. The square root approach complements the Pythagoras method, provides a better understanding of calculating an angle and will be useful for teaching the concepts of angles in trigonometry.
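    As a hedged illustration of the historical formula mentioned in the abstract (this is the classical Bhaskara I rational approximation, not the paper's square-root "genetic code" method), the approximation can be checked numerically:

```python
import math

def bhaskara_sine(x):
    """Bhaskara I's 7th-century rational approximation to sin(x) for x in [0, pi]."""
    return 16 * x * (math.pi - x) / (5 * math.pi ** 2 - 4 * x * (math.pi - x))

# The approximation is exact at 0, pi/2 and pi, and its maximum error
# over [0, pi] is roughly 0.0016.
worst = max(abs(bhaskara_sine(x) - math.sin(x))
            for x in [i * math.pi / 1000 for i in range(1001)])
print(f"max error over [0, pi]: {worst:.5f}")
```

    The formula's appeal, then as now, is that it needs only one division and a few multiplications rather than a series expansion.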

  8. FAST GYROSYNCHROTRON CODES

    SciTech Connect

    Fleishman, Gregory D.; Kuznetsov, Alexey A.

    2010-10-01

    Radiation produced by charged particles gyrating in a magnetic field is highly significant in the astrophysics context. Persistently increasing resolution of astrophysical observations calls for corresponding three-dimensional modeling of the radiation. However, available exact equations are prohibitively slow in computing a comprehensive table of high-resolution models required for many practical applications. To remedy this situation, we develop approximate gyrosynchrotron (GS) codes capable of quickly calculating the GS emission (in non-quantum regime) from both isotropic and anisotropic electron distributions in non-relativistic, mildly relativistic, and ultrarelativistic energy domains applicable throughout a broad range of source parameters including dense or tenuous plasmas and weak or strong magnetic fields. The computation time is reduced by several orders of magnitude compared with the exact GS algorithm. The new algorithm performance can gradually be adjusted to the user's needs depending on whether precision or computation speed is to be optimized for a given model. The codes are made available for users as a supplement to this paper.

  9. New optimal quantum convolutional codes

    NASA Astrophysics Data System (ADS)

    Zhu, Shixin; Wang, Liqi; Kai, Xiaoshan

    2015-04-01

    One of the greatest challenges in proving the feasibility of quantum computers is protecting the quantum nature of information. Quantum convolutional codes are aimed at protecting a stream of quantum information in long-distance communication, and are the correct generalization to the quantum domain of their classical analogs. In this paper, we construct some classes of quantum convolutional codes by employing classical constacyclic codes. These codes are optimal in the sense that they attain the Singleton bound for pure convolutional stabilizer codes.

  10. Convolutional code performance in planetary entry channels

    NASA Technical Reports Server (NTRS)

    Modestino, J. W.

    1974-01-01

    The planetary entry channel is modeled for communication purposes representing turbulent atmospheric scattering effects. The performance of short and long constraint length convolutional codes is investigated in conjunction with coherent BPSK modulation and Viterbi maximum likelihood decoding. Algorithms for sequential decoding are studied in terms of computation and/or storage requirements as a function of the fading channel parameters. The performance of the coded coherent BPSK system is compared with the coded incoherent MFSK system. Results indicate that: some degree of interleaving is required to combat time correlated fading of channel; only modest amounts of interleaving are required to approach performance of memoryless channel; additional propagational results are required on the phase perturbation process; and the incoherent MFSK system is superior when phase tracking errors are considered.
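    The encoding and maximum-likelihood decoding the abstract refers to can be sketched for a generic short-constraint-length code (a rate-1/2, constraint-length-3 code with octal generators 7 and 5 is assumed here for illustration; the paper's actual codes and channel model are not reproduced). Brute-force minimum-distance search stands in for the Viterbi algorithm, which computes the same decision efficiently:

```python
from itertools import product

def conv_encode(bits):
    """Rate-1/2, constraint-length-3 convolutional encoder (generators 7 and 5, octal)."""
    s1 = s2 = 0  # shift-register state
    out = []
    for b in bits:
        out += [b ^ s1 ^ s2, b ^ s2]  # g1 = 111, g2 = 101 (binary taps)
        s1, s2 = b, s1
    return out

def encode_terminated(bits):
    # Two zero tail bits drive the encoder back to the all-zero state, so
    # distinct codewords are separated by the free distance (5 for this code).
    return conv_encode(bits + [0, 0])

def ml_decode(received, n):
    """Brute-force maximum-likelihood decoding: pick the n-bit message whose
    codeword is closest in Hamming distance (the decision Viterbi computes)."""
    return list(min(product([0, 1], repeat=n),
                    key=lambda m: sum(r != c for r, c in
                                      zip(received, encode_terminated(list(m))))))

msg = [1, 0, 1, 1]
word = encode_terminated(msg)     # [1, 1, 1, 0, 0, 0, 0, 1, 0, 1, 1, 1]
word[2] ^= 1                      # a single channel bit error
assert ml_decode(word, 4) == msg  # the error is corrected
```

    With free distance 5, any single channel error (and any double error) is corrected; on a fading channel, interleaving spreads burst errors so that this single-error picture approximately holds per constraint length.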

  11. Obituary: Arthur Dodd Code (1923-2009)

    NASA Astrophysics Data System (ADS)

    Marché, Jordan D., II

    2009-12-01

    Former AAS president Arthur Dodd Code, age 85, passed away at Meriter Hospital in Madison, Wisconsin on 11 March 2009, from complications involving a long-standing pulmonary condition. Code was born in Brooklyn, New York on 13 August 1923, as the only child of former Canadian businessman Lorne Arthur Code and Jesse (Dodd) Code. An experienced ham radio operator, he entered the University of Chicago in 1940, but then enlisted in the U.S. Navy (1943-45) and was later stationed as an instructor at the Naval Research Laboratory, Washington, D.C. During the war, he gained extensive practical experience with the design and construction of technical equipment that served him well in years ahead. Concurrently, he took physics courses at George Washington University (some under the tutelage of George Gamow). In 1945, he was admitted to the graduate school of the University of Chicago, without having received his formal bachelor's degree. In 1950, he was awarded his Ph.D. for a theoretical study of radiative transfer in O- and B-type stars, directed by Subrahmanyan Chandrasekhar. He was then hired onto the faculty of the Department of Astronomy at the University of Wisconsin-Madison (1951-56). He then accepted a tenured appointment at the California Institute of Technology and the Mount Wilson and Palomar Observatories (1956-58). But following the launch of Sputnik, Code returned to Wisconsin in 1958 as full professor of astronomy, director of the Washburn Observatory, and department chairman so that he could more readily pursue his interest in space astronomy. That same year, he was chosen a member of the Space Science Board of the National Academy of Sciences (created during the International Geophysical Year) and shortly became one of five principal investigators of the original NASA Space Science Working Group. In a cogent 1960 essay, Code argued that astrophysical investigations, when conducted from beyond the Earth's atmosphere, "cannot fail to have a tremendous impact on the

  12. Effective Practice in the Design of Directed Independent Learning Opportunities

    ERIC Educational Resources Information Center

    Thomas, Liz; Jones, Robert; Ottaway, James

    2015-01-01

    This study, commissioned by the HEA and the QAA, focuses on directed independent learning practices in UK higher education. It investigates what stakeholders (including academic staff and students) have found to be the most effective practices in the inception, design, quality assurance and enhancement of directed independent learning and explores…

  13. The metaethics of nursing codes of ethics and conduct.

    PubMed

    Snelling, Paul C

    2016-10-01

    Nursing codes of ethics and conduct are features of professional practice across the world, and in the UK, the regulator has recently consulted on and published a new code. Initially part of a professionalising agenda, nursing codes have recently come to represent a managerialist and disciplinary agenda and nursing can no longer be regarded as a self-regulating profession. This paper argues that codes of ethics and codes of conduct are significantly different in form and function, similar to the difference between ethics and law in everyday life. Some codes successfully integrate these two functions within the same document, while others, principally the UK Code, conflate them, resulting in an ambiguous document unable to fulfil its functions effectively. The paper analyses the differences between ethical-codes and conduct-codes by discussing titles, authorship, level, scope for disagreement, consequences of transgression, language and finally and possibly most importantly agent-centeredness. It is argued that conduct-codes cannot require nurses to be compassionate because compassion involves an emotional response. The concept of kindness provides a plausible alternative for conduct-codes as it is possible to understand it solely in terms of acts. But if kindness is required in conduct-codes, investigation and possible censure follows from its absence. Using examples it is argued that there are at least five possible accounts of the absence of kindness. As well as being potentially problematic for disciplinary panels, difficulty in understanding the features of blameworthy absence of kindness may challenge UK nurses who, following a recently introduced revalidation procedure, are required to reflect on their practice in relation to The Code. It is concluded that closer attention to metaethical concerns by code writers will better support the functions of their issuing organisations.

  14. Generating Code Review Documentation for Auto-Generated Mission-Critical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2009-01-01

    Model-based design and automated code generation are increasingly used at NASA to produce actual flight code, particularly in the Guidance, Navigation, and Control domain. However, since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently auto-generated code still needs to be fully tested and certified. We have thus developed AUTOCERT, a generator-independent plug-in that supports the certification of auto-generated code. AUTOCERT takes a set of mission safety requirements, and formally verifies that the autogenerated code satisfies these requirements. It generates a natural language report that explains why and how the code complies with the specified requirements. The report is hyper-linked to both the program and the verification conditions and thus provides a high-level structured argument containing tracing information for use in code reviews.

  15. Circular codes, symmetries and transformations.

    PubMed

    Fimmel, Elena; Giannerini, Simone; Gonzalez, Diego Luis; Strüngmann, Lutz

    2015-06-01

    Circular codes, putative remnants of primeval comma-free codes, have gained considerable attention in recent years. In fact they represent a second kind of genetic code potentially involved in detecting and maintaining the normal reading frame in protein coding sequences. The discovery of a universal code across species has suggested many theoretical and experimental questions. However, there is a key aspect that relates circular codes to symmetries and transformations that remains to a large extent unexplored. In this article we aim at addressing the issue by studying the symmetries and transformations that connect different circular codes. The main result is that the class of 216 C3 maximal self-complementary codes can be partitioned into 27 equivalence classes defined by a particular set of transformations. We show that such transformations can be put in a group theoretic framework with an intuitive geometric interpretation. More general mathematical results about symmetry transformations which are valid for any kind of circular codes are also presented. Our results pave the way to the study of the biological consequences of the mathematical structure behind circular codes and contribute to shed light on the evolutionary steps that led to the observed symmetries of present codes. PMID:25008961
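    Circularity of a trinucleotide code can be tested algorithmically. As a hedged sketch (this uses the graph criterion published by Fimmel, Michel and Strüngmann, not a construction from this particular paper): build a directed graph with an arc from the first base to the last two bases of each codon, and from the first two bases to the last base; the code is circular exactly when the graph is acyclic.

```python
def is_circular(codons):
    """Graph criterion: a trinucleotide code is circular iff the graph with
    arcs b1 -> b2b3 and b1b2 -> b3 (per codon b1b2b3) has no directed cycle."""
    edges = {}
    for c in codons:
        edges.setdefault(c[0], set()).add(c[1:])   # b1 -> b2b3
        edges.setdefault(c[:2], set()).add(c[2])   # b1b2 -> b3
    color = {}  # 0/absent = unseen, 1 = on DFS stack, 2 = finished

    def has_cycle(v):
        color[v] = 1
        for w in edges.get(v, ()):
            if color.get(w, 0) == 1 or (color.get(w, 0) == 0 and has_cycle(w)):
                return True
        color[v] = 2
        return False

    return not any(color.get(v, 0) == 0 and has_cycle(v) for v in list(edges))

assert is_circular({"ACG"})             # a single non-periodic codon is circular
assert not is_circular({"AAA"})         # periodic codon: A -> AA -> A is a cycle
assert not is_circular({"AAT", "TAA"})  # concatenations admit a frame-shifted reading
```

    The cycle in the last case (AA -> T -> AA) corresponds to the string ...AATAAT..., which can be read in two different frames, so the reading frame cannot be retrieved.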

  16. Characterizing Mathematics Classroom Practice: Impact of Observation and Coding Choices

    ERIC Educational Resources Information Center

    Ing, Marsha; Webb, Noreen M.

    2012-01-01

    Large-scale observational measures of classroom practice increasingly focus on opportunities for student participation as an indicator of instructional quality. Each observational measure necessitates making design and coding choices on how to best measure student participation. This study investigated variations of coding approaches that may be…

  17. 32 CFR 636.11 - Installation traffic codes

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 4 2011-07-01 2011-07-01 false Installation traffic codes 636.11 Section 636.11... CRIMINAL INVESTIGATIONS MOTOR VEHICLE TRAFFIC SUPERVISION (SPECIFIC INSTALLATIONS) Fort Stewart, Georgia § 636.11 Installation traffic codes In addition to the requirements in § 634.25(d) of this...

  18. 32 CFR 636.11 - Installation traffic codes

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 4 2010-07-01 2010-07-01 true Installation traffic codes 636.11 Section 636.11... CRIMINAL INVESTIGATIONS MOTOR VEHICLE TRAFFIC SUPERVISION (SPECIFIC INSTALLATIONS) Fort Stewart, Georgia § 636.11 Installation traffic codes In addition to the requirements in § 634.25(d) of this...

  19. Automatic Coding of Dialogue Acts in Collaboration Protocols

    ERIC Educational Resources Information Center

    Erkens, Gijsbert; Janssen, Jeroen

    2008-01-01

    Although protocol analysis can be an important tool for researchers to investigate the process of collaboration and communication, the use of this method of analysis can be time consuming. Hence, an automatic coding procedure for coding dialogue acts was developed. This procedure helps to determine the communicative function of messages in online…

  20. Making your code citable with the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; DuPrie, Kimberly; Schmidt, Judy; Berriman, G. Bruce; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.

    2016-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. With nearly 1,200 codes, it is the largest indexed resource for astronomy codes in existence. Established in 1999, it offers software authors a path to citation of their research codes even without publication of a paper describing the software, and offers scientists a way to find codes used in refereed publications, thus improving the transparency of the research. It also provides a method to quantify the impact of source codes in a fashion similar to the science metrics of journal articles. Citations using ASCL IDs are accepted by major astronomy journals and if formatted properly are tracked by ADS and other indexing services. The number of citations to ASCL entries increased sharply from 110 citations in January 2014 to 456 citations in September 2015. The percentage of code entries in ASCL that were cited at least once rose from 7.5% in January 2014 to 17.4% in September 2015. The ASCL's mid-2014 infrastructure upgrade added an easy entry submission form, more flexible browsing, search capabilities, and an RSS feeder for updates. A Changes/Additions form added this past fall lets authors submit links for papers that use their codes for addition to the ASCL entry even if those papers don't formally cite the codes, thus increasing the transparency of that research and capturing the value of their software to the community.

  1. Practices in Code Discoverability: Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; Teuben, P.; Nemiroff, R. J.; Shamir, L.

    2012-09-01

    Here we describe the Astrophysics Source Code Library (ASCL), which takes an active approach to sharing astrophysics source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL now has over 340 codes in it and continues to grow. In 2011, the ASCL added an average of 19 codes per month. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This paper provides the history and description of the ASCL. It lists the requirements for including codes, examines the advantages of the ASCL, and outlines some of its future plans.

  2. Bar code usage in nuclear materials accountability

    SciTech Connect

    Mee, W.T.

    1983-07-01

    The age-old method of physically taking an inventory of materials by listing each item's identification number has lived beyond its usefulness. In this age of computerization, which offers the local grocery store a quick, sure, and easy means to inventory, it is time for nuclear materials facilities to automate accountability activities. The Oak Ridge Y-12 Plant began investigating the use of automated data collection devices in 1979. At that time, bar code and optical-character-recognition (OCR) systems were reviewed with the purpose of directly entering data into DYMCAS (Dynamic Special Nuclear Materials Control and Accountability System). Both of these systems appeared applicable; however, other automated devices already employed for production control made implementing the bar code and OCR seem improbable. Nevertheless, once the DYMCAS was placed on line for nuclear material accountability, a decision was made to consider the bar code for physical inventory listings. For the past several months a development program has been underway to use a bar code device to collect and input data to the DYMCAS on the uranium recovery operations. Programs have been completed and tested, and are being employed to ensure that data will be compatible and useful. Bar code implementation and expansion of its use for all nuclear material inventory activity in Y-12 is presented.

  3. Bar code usage in nuclear materials accountability

    SciTech Connect

    Mee, W.T.

    1983-01-01

    The Oak Ridge Y-12 Plant began investigating the use of automated data collection devices in 1979. At that time, bar code and optical-character-recognition (OCR) systems were reviewed with the purpose of directly entering data into DYMCAS (Dynamic Special Nuclear Materials Control and Accountability System). Both of these systems appeared applicable; however, other automated devices already employed for production control made implementing the bar code and OCR seem improbable. Nevertheless, once the DYMCAS was placed on line for nuclear material accountability, a decision was made to consider the bar code for physical inventory listings. For the past several months a development program has been underway to use a bar code device to collect and input data to the DYMCAS on the uranium recovery operations. Programs have been completed and tested, and are being employed to ensure that data will be compatible and useful. Bar code implementation and expansion of its use for all nuclear material inventory activity in Y-12 is presented.

  4. Three-dimensional subband coding of video.

    PubMed

    Podilchuk, C I; Jayant, N S; Farvardin, N

    1995-01-01

    We describe and show the results of video coding based on a three-dimensional (3-D) spatio-temporal subband decomposition. The results include a 1-Mbps coder based on a new adaptive differential pulse code modulation scheme (ADPCM) and adaptive bit allocation. This rate is useful for video storage on CD-ROM. Coding results are also shown for a 384-kbps rate that are based on ADPCM for the lowest frequency band and a new form of vector quantization (geometric vector quantization (GVQ)) for the data in the higher frequency bands. GVQ takes advantage of the inherent structure and sparseness of the data in the higher bands. Results are also shown for a 128-kbps coder that is based on an unbalanced tree-structured vector quantizer (UTSVQ) for the lowest frequency band and GVQ for the higher frequency bands. The results are competitive with traditional video coding techniques and provide the motivation for investigating the 3-D subband framework for different coding schemes and various applications. PMID:18289965
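    The spatio-temporal decomposition underlying such coders can be illustrated with a minimal sketch. This is an assumption for illustration (a one-level temporal Haar split of frame pairs, not the filter bank used in the paper): the low band carries frame averages, the high band frame differences, and the split is perfectly invertible before any quantization.

```python
def temporal_haar_analysis(frames):
    """Split consecutive frame pairs into temporal low/high subbands (Haar)."""
    low, high = [], []
    for a, b in zip(frames[0::2], frames[1::2]):
        low.append([(x + y) / 2 for x, y in zip(a, b)])   # temporal average
        high.append([(x - y) / 2 for x, y in zip(a, b)])  # temporal difference
    return low, high

def temporal_haar_synthesis(low, high):
    """Invert the split: reconstruct each frame pair from low/high bands."""
    frames = []
    for l, h in zip(low, high):
        frames.append([x + y for x, y in zip(l, h)])
        frames.append([x - y for x, y in zip(l, h)])
    return frames

frames = [[1, 2, 3], [4, 5, 6], [7, 8, 9], [1, 0, 1]]  # four "frames" of 3 pixels
low, high = temporal_haar_analysis(frames)
assert temporal_haar_synthesis(low, high) == frames     # perfect reconstruction
```

    A full 3-D scheme applies a similar spatial split within each temporal band; the coding gain comes from spending most of the bit budget on the low band while the sparse high bands tolerate coarse quantization, as with the GVQ scheme described above.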

  5. A Coding System for the Study of Linguistic Variation in Black English.

    ERIC Educational Resources Information Center

    Pfaff, Carol W.

    This paper documents a coding system developed to facilitate the investigation of linguistic variation in Black English. The rationale for employment of such a system is given. The use of the coding system in a study of child Black English is described and the codes for 41 phonological and syntactic variables investigated in the study are…

  6. Mean-based neural coding of voices.

    PubMed

    Andics, Attila; McQueen, James M; Petersson, Karl Magnus

    2013-10-01

    The social significance of recognizing the person who talks to us is obvious, but the neural mechanisms that mediate talker identification are unclear. Regions along the bilateral superior temporal sulcus (STS) and the inferior frontal cortex (IFC) of the human brain are selective for voices, and they are sensitive to rapid voice changes. Although it has been proposed that voice recognition is supported by prototype-centered voice representations, the involvement of these category-selective cortical regions in the neural coding of such "mean voices" has not previously been demonstrated. Using fMRI in combination with a voice identity learning paradigm, we show that voice-selective regions are involved in the mean-based coding of voice identities. Voice typicality is encoded on a supra-individual level in the right STS along a stimulus-dependent, identity-independent (i.e., voice-acoustic) dimension, and on an intra-individual level in the right IFC along a stimulus-independent, identity-dependent (i.e., voice identity) dimension. Voice recognition therefore entails at least two anatomically separable stages, each characterized by neural mechanisms that reference the central tendencies of voice categories. PMID:23664949

  7. Determination of problematic ICD-9-CM subcategories for further study of coding performance: Delphi method.

    PubMed

    Zeng, Xiaoming; Bell, Paul D

    2011-01-01

    In this study, we report on a qualitative method known as the Delphi method, used in the first part of a research study for improving the accuracy and reliability of ICD-9-CM coding. A panel of independent coding experts interacted methodically to determine that the three criteria to identify a problematic ICD-9-CM subcategory for further study were cost, volume, and level of coding confusion caused. The Medicare Provider Analysis and Review (MEDPAR) 2007 fiscal year data set as well as suggestions from the experts were used to identify coding subcategories based on cost and volume data. Next, the panelists performed two rounds of independent ranking before identifying Excisional Debridement as the subcategory that causes the most confusion among coders. As a result, they recommended it for further study aimed at improving coding accuracy and variation. This framework can be adopted at different levels for similar studies in need of a schema for determining problematic subcategories of code sets.

  8. Peripheral coding of taste

    PubMed Central

    Liman, Emily R.; Zhang, Yali V.; Montell, Craig

    2014-01-01

    Five canonical tastes, bitter, sweet, umami (amino acid), salty and sour (acid) are detected by animals as diverse as fruit flies and humans, consistent with a near universal drive to consume fundamental nutrients and to avoid toxins or other harmful compounds. Surprisingly, despite this strong conservation of basic taste qualities between vertebrates and invertebrates, the receptors and signaling mechanisms that mediate taste in each are highly divergent. The identification over the last two decades of receptors and other molecules that mediate taste has led to stunning advances in our understanding of the basic mechanisms of transduction and coding of information by the gustatory systems of vertebrates and invertebrates. In this review, we discuss recent advances in taste research, mainly from the fly and mammalian systems, and we highlight principles that are common across species, despite stark differences in receptor types. PMID:24607224

  9. Electromagnetic particle simulation codes

    NASA Technical Reports Server (NTRS)

    Pritchett, P. L.

    1985-01-01

    Electromagnetic particle simulations solve the full set of Maxwell's equations. They thus include the effects of self-consistent electric and magnetic fields, magnetic induction, and electromagnetic radiation. The algorithms for an electromagnetic code which works directly with the electric and magnetic fields are described. The fields and current are separated into transverse and longitudinal components. The transverse E and B fields are integrated in time using a leapfrog scheme applied to the Fourier components. The particle pushing is performed via the relativistic Lorentz force equation for the particle momentum. As an example, simulation results are presented for the electron cyclotron maser instability which illustrate the importance of relativistic effects on the wave-particle resonance condition and on wave dispersion.
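    The relativistic momentum update mentioned above can be sketched with the standard Boris scheme (an assumption for illustration; the abstract does not name the specific pusher). In normalized units, the update is a half electric kick, a magnetic rotation, and a second half kick, and the rotation exactly preserves the momentum magnitude when E = 0:

```python
import math

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]]

def boris_push(p, E, B, dt, q=1.0, m=1.0, c=1.0):
    """One relativistic Boris momentum update: half E-kick, B-rotation, half E-kick."""
    p_minus = [pi + 0.5 * q * dt * Ei for pi, Ei in zip(p, E)]
    gamma = math.sqrt(1.0 + sum(x * x for x in p_minus) / (m * c) ** 2)
    t = [0.5 * q * dt * Bi / (gamma * m) for Bi in B]          # rotation vector
    p_prime = [pm + cp for pm, cp in zip(p_minus, cross(p_minus, t))]
    s = [2.0 * ti / (1.0 + sum(x * x for x in t)) for ti in t]
    p_plus = [pm + cp for pm, cp in zip(p_minus, cross(p_prime, s))]
    return [pp + 0.5 * q * dt * Ei for pp, Ei in zip(p_plus, E)]

# With E = 0 the update is a pure rotation, so |p| (and the energy) is conserved
# over arbitrarily many steps; this is why Boris-type pushers are the PIC workhorse.
p = [1.0, 0.5, 0.0]
for _ in range(1000):
    p = boris_push(p, E=[0, 0, 0], B=[0, 0, 2.0], dt=0.05)
assert abs(sum(x * x for x in p) - 1.25) < 1e-9
```

    In a full electromagnetic code this momentum update is interleaved with the leapfrog field advance described in the abstract, with E and B interpolated to each particle position.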

  10. Modeling anomalous radial transport in kinetic transport codes

    NASA Astrophysics Data System (ADS)

    Bodi, K.; Krasheninnikov, S. I.; Cohen, R. H.; Rognlien, T. D.

    2009-11-01

    Anomalous transport is typically the dominant component of the radial transport in magnetically confined plasmas, where the physical origin of this transport is believed to be plasma turbulence. A model is presented for anomalous transport that can be used in continuum kinetic edge codes like TEMPEST, NEO and the next-generation code being developed by the Edge Simulation Laboratory. The model can also be adapted to particle-based codes. It is demonstrated that the model, with velocity-dependent diffusion and convection terms, can match a diagonal gradient-driven transport matrix as found in contemporary fluid codes, but can also include off-diagonal effects. The anomalous transport model is also combined with particle drifts and a particle/energy-conserving Krook collision operator to study possible synergistic effects with neoclassical transport. For the latter study, a velocity-independent anomalous diffusion coefficient is used to mimic the effect of long-wavelength ExB turbulence.
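    The diffusion-plus-convection structure of such a transport model can be sketched in one radial dimension (a minimal finite-volume example with hypothetical parameter values, not the model's actual coefficients): the radial flux is F = -D dn/dx + V n, and a conservative update guarantees that particle number is preserved exactly, as a transport model must.

```python
def step(n, D, V, dx, dt):
    """One conservative finite-volume update for the flux F = -D*dn/dx + V*n."""
    flux = [0.0]                      # zero-flux inner boundary
    for i in range(len(n) - 1):
        grad = (n[i + 1] - n[i]) / dx
        avg = 0.5 * (n[i] + n[i + 1])
        flux.append(-D * grad + V * avg)
    flux.append(0.0)                  # zero-flux outer boundary
    # Each cell loses what its right face carries out and gains what its left face brings in.
    return [ni - dt / dx * (flux[i + 1] - flux[i]) for i, ni in enumerate(n)]

n = [0.0, 0.0, 1.0, 0.0, 0.0]         # localized initial density profile
total0 = sum(n)
for _ in range(200):
    n = step(n, D=0.1, V=0.05, dx=1.0, dt=0.1)
assert abs(sum(n) - total0) < 1e-12   # particle number is conserved
```

    In a kinetic code the same D and V become velocity-dependent, which is what lets the model reproduce off-diagonal (e.g. heat-flux-from-density-gradient) entries of the transport matrix.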

  11. Surface acoustic wave coding for orthogonal frequency coded devices

    NASA Technical Reports Server (NTRS)

    Malocha, Donald (Inventor); Kozlovski, Nikolai (Inventor)

    2011-01-01

    Methods and systems for coding SAW OFC devices to mitigate code collisions in a wireless multi-tag system. Each device producing plural stepped frequencies as an OFC signal with a chip offset delay to increase code diversity. A method for assigning a different OFC to each device includes using a matrix based on the number of OFCs needed and the number of chips per code, populating each matrix cell with an OFC chip, and assigning the codes from the matrix to the devices. The asynchronous passive multi-tag system includes plural surface acoustic wave devices each producing a different OFC signal having the same number of chips and including a chip offset time delay, an algorithm for assigning OFCs to each device, and a transceiver to transmit an interrogation signal and receive OFC signals in response with minimal code collisions during transmission.
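    The matrix-based assignment can be illustrated with a heavily simplified sketch (all names and the assignment rule are hypothetical, for illustration only; the patent's actual matrix-population procedure is not reproduced): each device receives a distinct permutation of the chip frequency slots as its code, so no two tags share an ordering.

```python
import random

def assign_ofc_codes(num_devices, chips_per_code, seed=0):
    """Sketch: give each device a distinct permutation of the chip (frequency-slot)
    indices as its orthogonal frequency code. Hypothetical assignment rule."""
    rng = random.Random(seed)
    codes = set()
    while len(codes) < num_devices:       # draw until enough distinct codes exist
        chips = list(range(chips_per_code))
        rng.shuffle(chips)
        codes.add(tuple(chips))
    return [list(c) for c in codes]

codes = assign_ofc_codes(num_devices=8, chips_per_code=7)
assert len({tuple(c) for c in codes}) == 8                # all codes are distinct
assert all(sorted(c) == list(range(7)) for c in codes)    # each uses every chip once
```

    In the actual system each chip index maps to a stepped frequency and the per-device chip offset delay adds a further dimension of diversity beyond the ordering shown here.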

  12. CAFE: A New Relativistic MHD Code

    NASA Astrophysics Data System (ADS)

    Lora-Clavijo, F. D.; Cruz-Osorio, A.; Guzmán, F. S.

    2015-06-01

    We introduce CAFE, a new independent code designed to solve the equations of relativistic ideal magnetohydrodynamics (RMHD) in three dimensions. We present the standard tests for an RMHD code and for the relativistic hydrodynamics regime because we have not reported them before. The tests include the one-dimensional Riemann problems related to blast waves, head-on collisions of streams, and states with transverse velocities, with and without magnetic field, which is aligned or transverse, constant or discontinuous across the initial discontinuity. Among the two-dimensional (2D) and 3D tests without magnetic field, we include the 2D Riemann problem, a one-dimensional shock tube along a diagonal, the high-speed Emery wind tunnel, the Kelvin-Helmholtz (KH) instability, a set of jets, and a 3D spherical blast wave, whereas in the presence of a magnetic field we show the magnetic rotor, the cylindrical explosion, a case of Kelvin-Helmholtz instability, and a 3D magnetic field advection loop. The code uses high-resolution shock-capturing methods, and we present the error analysis for a combination that uses the Harten, Lax, van Leer, and Einfeldt (HLLE) flux formula combined with a linear, piecewise parabolic method and fifth-order weighted essentially nonoscillatory reconstructors. We use the flux-constrained transport and the divergence cleaning methods to control the divergence-free magnetic field constraint.

  13. CAFE: A NEW RELATIVISTIC MHD CODE

    SciTech Connect

    Lora-Clavijo, F. D.; Cruz-Osorio, A.; Guzmán, F. S. E-mail: aosorio@astro.unam.mx

    2015-06-22

    We introduce CAFE, a new independent code designed to solve the equations of relativistic ideal magnetohydrodynamics (RMHD) in three dimensions. We present the standard tests for an RMHD code and for the relativistic hydrodynamics regime because we have not reported them before. The tests include the one-dimensional Riemann problems related to blast waves, head-on collisions of streams, and states with transverse velocities, with and without magnetic field, which is aligned or transverse, constant or discontinuous across the initial discontinuity. Among the two-dimensional (2D) and 3D tests without magnetic field, we include the 2D Riemann problem, a one-dimensional shock tube along a diagonal, the high-speed Emery wind tunnel, the Kelvin–Helmholtz (KH) instability, a set of jets, and a 3D spherical blast wave, whereas in the presence of a magnetic field we show the magnetic rotor, the cylindrical explosion, a case of Kelvin–Helmholtz instability, and a 3D magnetic field advection loop. The code uses high-resolution shock-capturing methods, and we present the error analysis for a combination that uses the Harten, Lax, van Leer, and Einfeldt (HLLE) flux formula combined with a linear, piecewise parabolic method and fifth-order weighted essentially nonoscillatory reconstructors. We use the flux-constrained transport and the divergence cleaning methods to control the divergence-free magnetic field constraint.

  14. Temporal coding by populations of auditory receptor neurons.

    PubMed

    Sabourin, Patrick; Pollack, Gerald S

    2010-03-01

    Auditory receptor neurons of crickets are most sensitive to either low or high sound frequencies. Earlier work showed that the temporal coding properties of first-order auditory interneurons are matched to the temporal characteristics of natural low- and high-frequency stimuli (cricket songs and bat echolocation calls, respectively). We studied the temporal coding properties of receptor neurons and used modeling to investigate how activity within populations of low- and high-frequency receptors might contribute to the coding properties of interneurons. We confirm earlier findings that individual low-frequency-tuned receptors code stimulus temporal pattern poorly, but show that coding performance of a receptor population increases markedly with population size, due in part to low redundancy among the spike trains of different receptors. By contrast, individual high-frequency-tuned receptors code a stimulus temporal pattern fairly well and, because their spike trains are redundant, there is only a slight increase in coding performance with population size. The coding properties of low- and high-frequency receptor populations resemble those of interneurons in response to low- and high-frequency stimuli, suggesting that coding at the interneuron level is partly determined by the nature and organization of afferent input. Consistent with this, the sound-frequency-specific coding properties of an interneuron, previously demonstrated by analyzing its spike train, are also apparent in the subthreshold fluctuations in membrane potential that are generated by synaptic input from receptor neurons.

  15. Adaptive differential pulse-code modulation with adaptive bit allocation

    NASA Astrophysics Data System (ADS)

    Frangoulis, E. D.; Yoshida, K.; Turner, L. F.

    1984-08-01

    Studies have been conducted on the possibility of obtaining good-quality speech at data rates in the range of 16 kbit/s to 32 kbit/s. The techniques considered are related to adaptive predictive coding (APC) and adaptive differential pulse-code modulation (ADPCM). At 16 kbit/s, adaptive transform coding (ATC) has also been used. The present investigation is concerned with a new method of speech coding that employs adaptive bit allocation, similar to that used in adaptive transform coding, together with adaptive differential pulse-code modulation employing first-order prediction. The objective of the new method is to improve speech quality over that obtainable with conventional ADPCM employing a fourth-order predictor. Attention is given to the ADPCM-AB system, the design of a subjective test, and the application of switched preemphasis to ADPCM.
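
    The first-order predictive quantization described above can be sketched as follows. The predictor coefficient and the uniform quantizer step are illustrative choices, not values from the paper (which additionally adapts the bit allocation):

```python
def dpcm_encode(samples, a=0.9, step=2.0):
    """First-order predictive DPCM sketch: quantize the prediction
    residual against the decoder's own reconstruction. Coefficient 'a'
    and 'step' are illustrative, not values from the paper."""
    prev = 0.0                      # previous reconstructed sample
    codes = []
    for x in samples:
        e = x - a * prev            # prediction residual
        q = round(e / step)         # uniform quantizer index
        codes.append(q)
        prev = a * prev + q * step  # track the decoder's reconstruction
    return codes


def dpcm_decode(codes, a=0.9, step=2.0):
    """Mirror of the encoder loop: rebuild each sample from the
    quantized residual and the first-order prediction."""
    prev = 0.0
    out = []
    for q in codes:
        prev = a * prev + q * step
        out.append(prev)
    return out
```

    Because the encoder predicts from its own reconstruction rather than from the original samples, quantization error does not accumulate: each decoded sample is within half a quantizer step of the original.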

  16. Multi-stage decoding of multi-level modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao; Costello, Daniel J., Jr.

    1991-01-01

    Various types of multi-stage decoding for multi-level modulation codes are investigated. It is shown that if the component codes of a multi-level modulation code and the types of decoding at the various stages are chosen properly, high spectral efficiency and large coding gain can be achieved with reduced decoding complexity. In particular, it is shown that the difference in performance between suboptimum multi-stage soft-decision maximum likelihood decoding of a modulation code and single-stage optimum soft-decision decoding of the code is very small: only a fraction of a dB loss in signal-to-noise ratio at a bit error rate (BER) of 10(exp -6).

  17. 28 CFR 601.1 - Jurisdiction of the Independent Counsel: Iran/Contra.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...: Iran/Contra. 601.1 Section 601.1 Judicial Administration OFFICES OF INDEPENDENT COUNSEL, DEPARTMENT OF JUSTICE JURISDICTION OF THE INDEPENDENT COUNSEL: IRAN/CONTRA § 601.1 Jurisdiction of the Independent Counsel: Iran/Contra. (a) The Independent Counsel. Iran/Contra has jurisdiction to investigate to...

  18. 28 CFR 601.1 - Jurisdiction of the Independent Counsel: Iran/Contra.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...: Iran/Contra. 601.1 Section 601.1 Judicial Administration OFFICES OF INDEPENDENT COUNSEL, DEPARTMENT OF JUSTICE JURISDICTION OF THE INDEPENDENT COUNSEL: IRAN/CONTRA § 601.1 Jurisdiction of the Independent Counsel: Iran/Contra. (a) The Independent Counsel. Iran/Contra has jurisdiction to investigate to...

  19. New opportunities seen for independents

    SciTech Connect

    Adams, G.A.

    1990-10-22

    The collapse of gas and oil prices in the mid-1980s significantly reduced the number of independent exploration companies. At the same time, a fundamental shift occurred among major oil companies as they allocated their exploration budgets toward international operations and made major production purchases. Several large independents also embraced a philosophy of budget supplementation through joint venture partnership arrangements. This has created a unique and unusual window of opportunity for the smaller independents (defined for this article as exploration and production companies with a market value of less than $1 billion) to access the extensive and high quality domestic prospect inventories of the major and large independent oil and gas companies and to participate in the search for large reserve targets on attractive joint venture terms. Participation in these types of joint ventures, in conjunction with internally generated plays selected through the use of today's advanced technology (computer-enhanced, high-resolution seismic; horizontal drilling; etc.) and increasing prices for oil and natural gas, presents the domestic exploration-oriented independent with an attractive money-making opportunity for the 1990s.

  20. Some easily analyzable convolutional codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R.; Dolinar, S.; Pollara, F.; Vantilborg, H.

    1989-01-01

    Convolutional codes have played and will play a key role in the downlink telemetry systems on many NASA deep-space probes, including Voyager, Magellan, and Galileo. One of the chief difficulties associated with the use of convolutional codes, however, is the notorious difficulty of analyzing them. Given a convolutional code as specified, say, by its generator polynomials, it is no easy matter to say how well that code will perform on a given noisy channel. The usual first step in such an analysis is to compute the code's free distance; this can be done with an algorithm whose complexity is exponential in the code's constraint length. The second step is often to calculate the transfer function in one, two, or three variables, or at least a few terms in its power series expansion. This step is quite hard, and for many codes of relatively short constraint lengths, it can be intractable. However, a large class of convolutional codes was discovered for which the free distance can be computed by inspection, and for which there is a closed-form expression for the three-variable transfer function. Although for large constraint lengths, these codes have relatively low rates, they are nevertheless interesting and potentially useful. Furthermore, the ideas developed here to analyze these specialized codes may well extend to a much larger class.

  1. Interframe vector wavelet coding technique

    NASA Astrophysics Data System (ADS)

    Wus, John P.; Li, Weiping

    1997-01-01

    Wavelet coding is often used to divide an image into multi-resolution wavelet coefficients, which are quantized and coded. By 'vectorizing' scalar wavelet coding and combining this with vector quantization (VQ), vector wavelet coding (VWC) can be implemented. Using a finite number of states, finite-state vector quantization (FSVQ) takes advantage of the similarity between frames by incorporating memory into the video coding system. Lattice VQ eliminates the potential mismatch that could occur using pre-trained VQ codebooks. It also eliminates the need for codebook storage in the VQ process, thereby creating a more robust coding system. Therefore, by using the VWC method in conjunction with the FSVQ system and lattice VQ, the formulation of a high-quality, very low bit rate coding system is proposed. A coding system is developed using a simple FSVQ scheme in which the current state is determined only by the previous channel symbol. To achieve a higher degree of compression, a tree-like FSVQ system is implemented. The groupings are done in this tree-like structure from the lower subbands to the higher subbands in order to exploit the nature of subband analysis in terms of the parent-child relationship. Class A and Class B video sequences from the MPEG-IV testing evaluations are used in the evaluation of this coding method.

  2. Preliminary Results from the Application of Automated Adjoint Code Generation to CFL3D

    NASA Technical Reports Server (NTRS)

    Carle, Alan; Fagan, Mike; Green, Lawrence L.

    1998-01-01

    This report describes preliminary results obtained using an automated adjoint code generator for Fortran to augment a widely-used computational fluid dynamics flow solver to compute derivatives. These preliminary results with this augmented code suggest that, even in its infancy, the automated adjoint code generator can accurately and efficiently deliver derivatives for use in transonic Euler-based aerodynamic shape optimization problems with hundreds to thousands of independent design variables.

  3. International Code Assessment and Applications Program: Summary of code assessment studies concerning RELAP5/MOD2, RELAP5/MOD3, and TRAC-B. International Agreement Report

    SciTech Connect

    Schultz, R.R.

    1993-12-01

    Members of the International Code Assessment Program (ICAP) have assessed the US Nuclear Regulatory Commission (USNRC) advanced thermal-hydraulic codes over the past few years in a concerted effort to identify deficiencies, to define user guidelines, and to determine the state of each code. The results of sixty-two code assessment reviews, conducted at INEL, are summarized. Code deficiencies are discussed and user recommended nodalizations investigated during the course of conducting the assessment studies and reviews are listed. All the work that is summarized was done using the RELAP5/MOD2, RELAP5/MOD3, and TRAC-B codes.

  4. Independent bilateral primary bronchial carcinomas

    PubMed Central

    Chaudhuri, M. Ray

    1971-01-01

    Independent bilateral primary bronchial carcinomas are not common. Since Beyreuther's description in 1924, 16 well-documented cases of independent primary bronchial carcinomas of different histology have been described. From 1965 to 1970, eight cases were seen at the London Chest Hospital. In order to make the diagnosis of a second primary bronchial carcinoma, each tumour should be malignant and neither should be a metastasis from the other. To meet this last criterion, the histopathological features of the two tumours must be different. Many cases have been described in the literature as double primary bronchial carcinomas where the second primary had the same histological features as the first. PMID:4327711

  5. The Independent Technical Analysis Process

    SciTech Connect

    Duberstein, Corey A.; Ham, Kenneth D.; Dauble, Dennis D.; Johnson, Gary E.

    2007-04-13

    The Bonneville Power Administration (BPA) contracted with the Pacific Northwest National Laboratory (PNNL) to provide technical analytical support for system-wide fish passage information (BPA Project No. 2006-010-00). The goal of this project was to produce rigorous technical analysis products using independent analysts and anonymous peer reviewers. In the past, regional parties have interacted with a single entity, the Fish Passage Center, to access the data, analyses, and coordination related to fish passage. This project provided an independent technical source for non-routine fish passage analyses while allowing routine support functions to be performed by other well-qualified entities.

  6. Two-dimensional aperture coding for magnetic sector mass spectrometry.

    PubMed

    Russell, Zachary E; Chen, Evan X; Amsden, Jason J; Wolter, Scott D; Danell, Ryan M; Parker, Charles B; Stoner, Brian R; Gehm, Michael E; Brady, David J; Glass, Jeffrey T

    2015-02-01

    In mass spectrometer design, there has been a historic belief that there exists a fundamental trade-off between instrument size, throughput, and resolution. When miniaturizing a traditional system, performance loss in either resolution or throughput would be expected. However, in optical spectroscopy, both one-dimensional (1D) and two-dimensional (2D) aperture coding have been used for many years to break a similar trade-off. To provide a viable path to miniaturization for harsh environment field applications, we are investigating similar concepts in sector mass spectrometry. Recently, we demonstrated the viability of 1D aperture coding and here we provide a first investigation of 2D coding. In coded optical spectroscopy, 2D coding is preferred because of increased measurement diversity for improved conditioning and robustness of the result. To investigate its viability in mass spectrometry, analytes of argon, acetone, and ethanol were detected using a custom 90-degree magnetic sector mass spectrometer incorporating 2D coded apertures. We developed a mathematical forward model and reconstruction algorithm to successfully reconstruct the mass spectra from the 2D spatially coded ion positions. This 2D coding enabled a 3.5× throughput increase with minimal decrease in resolution. Several challenges were overcome in the mass spectrometer design to enable this coding, including the need for large uniform ion flux, a wide gap magnetic sector that maintains field uniformity, and a high resolution 2D detection system for ion imaging. Furthermore, micro-fabricated 2D coded apertures incorporating support structures were developed to provide a viable design that allowed ion transmission through the open elements of the code. PMID:25510933

  7. Nonlinear, nonbinary cyclic group codes

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1992-01-01

    New cyclic group codes of length 2(exp m) - 1 over (m - j)-bit symbols are introduced. These codes can be systematically encoded and decoded algebraically. The code rates are very close to Reed-Solomon (RS) codes and are much better than Bose-Chaudhuri-Hocquenghem (BCH) codes (a former alternative). The binary (m - j)-tuples are identified with a subgroup of the binary m-tuples which represents the field GF(2(exp m)). Encoding is systematic and involves a two-stage procedure consisting of the usual linear feedback register (using the division or check polynomial) and a small table lookup. For low rates, a second shift-register encoding operation may be invoked. Decoding uses the RS error-correcting procedures for the m-tuple codes for m = 4, 5, and 6.

  8. Probable relationship between partitions of the set of codons and the origin of the genetic code.

    PubMed

    Salinas, Dino G; Gallardo, Mauricio O; Osorio, Manuel I

    2014-03-01

    Here we study the distribution of randomly generated partitions of the set of amino acid-coding codons. Some results are an application of a previous work on the Stirling numbers of the second kind and triplet codes, both to the case of triplet codes having four stop codons, as in the mammalian mitochondrial genetic code, and to hypothetical doublet codes. Extending previous results, we find that the most probable number of blocks of synonymous codons in a genetic code is similar to the number of amino acids when there are four stop codons, as it may also have been for a primigenious doublet code. We also study the integer partitions associated with patterns of synonymous codons and show, for the canonical code, that the standard deviation inside an integer partition is one of the most probable. We think that, in some early epoch, the genetic code might have had a maximum of disorder or entropy, independent of the assignment between codons and amino acids, reaching a state similar to the "code freeze" proposed by Francis Crick. In later stages, deterministic rules may have reassigned codons to amino acids, forming the natural codes, such as the canonical code, but keeping the numerical features describing the set partitions and the integer partitions, like "fossil numbers"; both kinds of partitions concern the set of amino acid-coding codons.
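
    The counting machinery the abstract builds on, Stirling numbers of the second kind and the mode of the block-count distribution under a uniform random set partition, can be sketched as follows. For the mammalian mitochondrial case one would call most_probable_blocks(60), i.e. the 60 sense codons left by four stop codons; this is a minimal sketch, not the paper's full analysis:

```python
from functools import lru_cache


@lru_cache(maxsize=None)
def stirling2(n, k):
    """Stirling number of the second kind S(n, k): the number of ways
    to partition an n-element set into k non-empty blocks."""
    if n == k:
        return 1
    if k == 0 or k > n:
        return 0
    # A partition of n elements either puts element n into one of the
    # k blocks of a partition of n-1, or makes it a singleton block.
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)


def most_probable_blocks(n):
    """Under a uniform distribution over set partitions of n elements,
    the probability of k blocks is proportional to S(n, k); return
    the mode of that distribution."""
    return max(range(1, n + 1), key=lambda k: stirling2(n, k))
```

    For example, a 4-element set has S(4, 1..4) = 1, 7, 6, 1 partitions, so two blocks is the most probable count.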

  9. A benchmark study for glacial isostatic adjustment codes

    NASA Astrophysics Data System (ADS)

    Spada, G.; Barletta, V. R.; Klemann, V.; Riva, R. E. M.; Martinec, Z.; Gasperini, P.; Lund, B.; Wolf, D.; Vermeersen, L. L. A.; King, M. A.

    2011-04-01

    The study of glacial isostatic adjustment (GIA) is gaining an increasingly important role within the geophysical community. Understanding the response of the Earth to loading is crucial in various contexts, ranging from the interpretation of modern satellite geodetic measurements (e.g. GRACE and GOCE) to the projections of future sea level trends in response to climate change. Modern modelling approaches to GIA are based on various techniques that range from purely analytical formulations to fully numerical methods. Despite various teams independently investigating GIA, we do not have a suitably large set of agreed numerical results through which the methods may be validated; a community benchmark data set would clearly be valuable. Following the example of the mantle convection community, here we present, for the first time, the results of a benchmark study of codes designed to model GIA. This has taken place within a collaboration facilitated through European Cooperation in Science and Technology (COST) Action ES0701. The approaches benchmarked are based on significantly different codes and different techniques. The test computations are based on models with spherical symmetry and Maxwell rheology and include inputs from different methods and solution techniques: viscoelastic normal modes, spectral-finite elements and finite elements. The tests involve the loading and tidal Love numbers and their relaxation spectra, the deformation and gravity variations driven by surface loads characterized by simple geometry and time history and the rotational fluctuations in response to glacial unloading. In spite of the significant differences in the numerical methods employed, the test computations show a satisfactory agreement between the results provided by the participants.

  10. Constrained coding for the deep-space optical channel

    NASA Technical Reports Server (NTRS)

    Moision, B. E.; Hamkins, J.

    2002-01-01

    We investigate methods of coding for a channel subject to a large dead-time constraint, i.e. a constraint on the minimum spacing between transmitted pulses, with the deep-space optical channel as the motivating example.
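
    The dead-time constraint can be phrased as a runlength constraint: every transmitted pulse must be followed by at least d empty slots before the next one. A sketch of counting the admissible sequences, whose exponential growth rate gives the capacity of the constraint (the slot model and parameter d here are generic, not the paper's exact channel):

```python
import math


def count_constrained(n, d):
    """Number of binary sequences of length n in which every 1 (pulse)
    is followed by at least d 0s before the next 1. An admissible
    sequence either ends in 0, or ends in a 1 preceded by >= d zeros
    (or is all zeros plus a final 1 when shorter than d + 1 slots)."""
    a = [1]  # a[0] = 1: the empty sequence
    for m in range(1, n + 1):
        ends_in_one = a[m - d - 1] if m - d - 1 >= 0 else 1
        a.append(a[m - 1] + ends_in_one)
    return a[n]


def capacity(d, n=2000):
    """Estimate the Shannon capacity (bits/slot) of the constraint
    from the growth rate of the sequence count."""
    return math.log2(count_constrained(n, d)) / n
```

    With d = 1 the recurrence is Fibonacci-like and the capacity approaches log2 of the golden ratio, about 0.694 bits per slot; larger dead times drive the capacity down, which is the trade-off the coding methods work against.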

  11. Explosive Formulation Code Naming SOP

    SciTech Connect

    Martz, H. E.

    2014-09-19

    The purpose of this SOP is to provide a procedure for giving individual HME formulations code names. A code name for an individual HME formulation consists of an explosive family code, given by the classified guide, followed by a dash, -, and a number. If the formulation requires preparation, such as packing or aging, additional groups of symbols are appended to the X-ray specimen name.

  12. Variable Coded Modulation software simulation

    NASA Astrophysics Data System (ADS)

    Sielicki, Thomas A.; Hamkins, Jon; Thorsen, Denise

    This paper reports on the design and performance of a new Variable Coded Modulation (VCM) system. This VCM system comprises eight of NASA's recommended codes from the Consultative Committee for Space Data Systems (CCSDS) standards, including four turbo and four AR4JA/C2 low-density parity-check codes, together with six modulation types (BPSK, QPSK, 8-PSK, 16-APSK, 32-APSK, 64-APSK). The signaling protocol for the transmission mode is based on a CCSDS recommendation. The coded modulation may be dynamically chosen, block to block, to optimize throughput.
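
    Block-to-block mode selection of this kind reduces to a small table lookup: among the modes whose SNR requirement is met, pick the one with the highest throughput. A sketch with hypothetical SNR thresholds; the real operating points come from the performance curves of the CCSDS codes and modulations, not from this table:

```python
# (code rate, bits/symbol, required Es/N0 in dB).
# The threshold column holds illustrative placeholders, not CCSDS figures.
MODES = [
    (0.5, 1, 1.0),    # e.g. rate-1/2 code + BPSK
    (0.5, 2, 4.0),    # e.g. rate-1/2 code + QPSK
    (0.8, 2, 6.5),
    (0.8, 3, 10.0),
    (0.9, 4, 13.0),
]


def pick_mode(snr_db):
    """Choose, block to block, the feasible mode that maximizes
    throughput (code rate x bits per symbol); None if the link is
    below every threshold."""
    feasible = [m for m in MODES if m[2] <= snr_db]
    if not feasible:
        return None
    return max(feasible, key=lambda m: m[0] * m[1])
```

    At 5 dB, for example, the table above selects rate 1/2 with QPSK (throughput 1.0 bit/symbol) over rate 1/2 with BPSK (0.5 bit/symbol).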

  13. Astrophysics Source Code Library Enhancements

    NASA Astrophysics Data System (ADS)

    Hanisch, R. J.; Allen, A.; Berriman, G. B.; DuPrie, K.; Mink, J.; Nemiroff, R. J.; Schmidt, J.; Shamir, L.; Shortridge, K.; Taylor, M.; Teuben, P. J.; Wallin, J.

    2015-09-01

    The Astrophysics Source Code Library (ASCL)1 is a free online registry of codes used in astronomy research; it currently contains over 900 codes and is indexed by ADS. The ASCL has recently moved a new infrastructure into production. The new site provides a true database for the code entries and integrates the WordPress news and information pages and the discussion forum into one site. Previous capabilities are retained and permalinks to ascl.net continue to work. This improvement offers more functionality and flexibility than the previous site, is easier to maintain, and offers new possibilities for collaboration. This paper covers these recent changes to the ASCL.

  14. The FLUKA Code: An Overview

    NASA Technical Reports Server (NTRS)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.; Lantz, M.; Liotta, M.; Mairani, A.; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P. R.; Scannicchio, D.; Trovati, S.; Villari, R.; Wilson, T.

    2006-01-01

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including, but not limited to, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with particular emphasis on the hadronic and nuclear sector.

  15. High Order Modulation Protograph Codes

    NASA Technical Reports Server (NTRS)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods for designing protograph-based bit-interleaved code modulation that is general and applies to any modulation. The general coding framework can support not only multiple rates but also adaptive modulation. The method is a two stage lifting approach. In the first stage, an original protograph is lifted to a slightly larger intermediate protograph. The intermediate protograph is then lifted via a circulant matrix to the expected codeword length to form a protograph-based low-density parity-check code.

  16. The FLUKA Code: an Overview

    SciTech Connect

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M.V.; Lantz, M.; Liotta, M.; Mairani, A.; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P.R.; /Milan U. /INFN, Milan /Pavia U. /INFN, Pavia /CERN /Siegen U. /Houston U. /SLAC /Frascati /NASA, Houston /ENEA, Frascati

    2005-11-09

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including, but not limited to, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with particular emphasis on the hadronic and nuclear sector.

  17. Haptic Tracking Permits Bimanual Independence

    ERIC Educational Resources Information Center

    Rosenbaum, David A.; Dawson, Amanda A.; Challis, John H.

    2006-01-01

    This study shows that in a novel task--bimanual haptic tracking--neurologically normal human adults can move their 2 hands independently for extended periods of time with little or no training. Participants lightly touched buttons whose positions were moved either quasi-randomly in the horizontal plane by 1 or 2 human drivers (Experiment 1), in…

  18. 10 Questions about Independent Reading

    ERIC Educational Resources Information Center

    Truby, Dana

    2012-01-01

    Teachers know that establishing a robust independent reading program takes more than giving kids a little quiet time after lunch. But how do they set up a program that will maximize their students' gains? Teachers have to know their students' reading levels inside and out, help them find just-right books, and continue to guide them during…

  19. Dimension independence in exterior algebra.

    PubMed Central

    Hawrylycz, M

    1995-01-01

    The identities between homogeneous expressions in rank 1 vectors and rank n - 1 covectors in a Grassmann-Cayley algebra of rank n, in which one set occurs multilinearly, are shown to represent a set of dimension-independent identities. The theorem yields an infinite set of nontrivial geometric identities from a given identity. PMID:11607520

  20. Field Independence: Reviewing the Evidence

    ERIC Educational Resources Information Center

    Evans, Carol; Richardson, John T. E.; Waring, Michael

    2013-01-01

    Background: The construct of field independence (FI) remains one of the most widely cited notions in research on cognitive style and on learning and instruction more generally. However, a great deal of confusion continues to exist around the definition of FI, its measurement, and the interpretation of research results, all of which have served to…

  1. Independent Study Project, Topic: Topology.

    ERIC Educational Resources Information Center

    Notre Dame High School, Easton, PA.

    Using this guide and the four popular books noted in it, a student, working independently, will learn about some of the classical ideas and problems of topology: the Meobius strip and Klein bottle, the four color problem, genus of a surface, networks, Euler's formula, and the Jordan Curve Theorem. The unit culminates in a project of the students'…

  2. Boston: Cradle of American Independence

    ERIC Educational Resources Information Center

    Community College Journal, 2004

    2004-01-01

    The 2005 American Association of Community Colleges Annual Convention will be held April 6-9 in Boston. While thoroughly modern, the iconic city's identity is firmly rooted in the past. As the cradle of American independence, Boston's long history is an integral part of the American fabric. Adams, Revere, Hancock are more than historical figures;…

  3. Employee and independent contractor relationships.

    PubMed

    Wren, K R; Wren, T L; Monti, E J; Turco, S J

    1999-05-01

    Most practitioners find themselves at a disadvantage in dealing with business issues and relationships. As health care continues to change, knowledge of contracts and business relationships will help CRNA practitioners navigate new as well as traditional practice settings. This article discusses the advantages and disadvantages of two business relationships: employee and independent contractor. PMID:10504911

  4. Selective Influence through Conditional Independence.

    ERIC Educational Resources Information Center

    Dzhafarov, Ehtibar N.

    2003-01-01

    Presents a generalization and improvement for the definition proposed by E. Dzhafarov (2001) for selectiveness in the dependence of several random variables on several (sets of) external factors. This generalization links the notion of selective influence with that of conditional independence. (SLD)

  5. The KIDTALK Behavior and Language Code: Manual and Coding Protocol.

    ERIC Educational Resources Information Center

    Delaney, Elizabeth M.; Ezell, Sara S.; Solomon, Ned A.; Hancock, Terry B.; Kaiser, Ann P.

    Developed as part of the Milieu Language Teaching Project at the John F. Kennedy Center at Vanderbilt University in Nashville, Tennessee, this KIDTALK Behavior-Language Coding Protocol and manual measures behavior occurring during adult-child interactions. The manual is divided into 5 distinct sections: (1) the adult behavior codes describe…

  6. Telescope Adaptive Optics Code

    2005-07-28

    The Telescope AO Code has general adaptive optics capabilities plus specialized models for three telescopes with either adaptive optics or active optics systems. It has the capability to generate either single-layer or distributed Kolmogorov turbulence phase screens using the FFT. Missing low-order spatial frequencies are added using the Karhunen-Loeve expansion. The phase structure curve is extremely close to the theoretical. Secondly, it has the capability to simulate an adaptive optics control system. The default parameters are those of the Keck II adaptive optics system. Thirdly, it has a general wave optics capability to model the science camera halo due to scintillation from atmospheric turbulence and the telescope optics. Although this capability was implemented for the Gemini telescopes, the only default parameter specific to the Gemini telescopes is the primary mirror diameter. Finally, it has a model for the LSST active optics alignment strategy. This last model is highly specific to the LSST.

  7. Patched Conic Trajectory Code

    NASA Technical Reports Server (NTRS)

    Park, Brooke Anderson; Wright, Henry

    2012-01-01

    PatCon code was developed to help mission designers run trade studies on launch and arrival times for any given planet. Initially developed in Fortran, the required inputs included launch date, arrival date, and other orbital parameters of the launch planet and arrival planets at the given dates. These parameters include the position of the planets, the eccentricity, semi-major axes, argument of periapsis, ascending node, and inclination of the planets. With these inputs, a patched conic approximation is used to determine the trajectory. The patched conic approximation divides the planetary mission into three parts: (1) the departure phase, in which the two relevant bodies are Earth and the spacecraft, and where the trajectory is a departure hyperbola with Earth at the focus; (2) the cruise phase, in which the two bodies are the Sun and the spacecraft, and where the trajectory is a transfer ellipse with the Sun at the focus; and (3) the arrival phase, in which the two bodies are the target planet and the spacecraft, where the trajectory is an arrival hyperbola with the planet as the focus.
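
    For the cruise phase, the simplest instance of the patched conic approximation is a Hohmann transfer between two circular, coplanar heliocentric orbits. A sketch under those simplifying assumptions (the actual code solves the general problem for dated planetary positions, and the Mars figures in the note below are only rough):

```python
import math

MU_SUN = 1.32712440018e11  # Sun's gravitational parameter, km^3/s^2


def hohmann_vinf(r1_km, r2_km):
    """Hyperbolic excess speeds (km/s) at departure and arrival for a
    Hohmann transfer ellipse between circular, coplanar orbits of radii
    r1_km and r2_km: the cruise-phase piece of a patched conic estimate."""
    a = 0.5 * (r1_km + r2_km)                        # transfer semi-major axis
    v_circ1 = math.sqrt(MU_SUN / r1_km)              # departure planet's orbital speed
    v_circ2 = math.sqrt(MU_SUN / r2_km)              # arrival planet's orbital speed
    v_dep = math.sqrt(MU_SUN * (2 / r1_km - 1 / a))  # vis-viva speed at r1 on the ellipse
    v_arr = math.sqrt(MU_SUN * (2 / r2_km - 1 / a))  # vis-viva speed at r2 on the ellipse
    return abs(v_dep - v_circ1), abs(v_arr - v_circ2)
```

    For Earth (1.496e8 km) to Mars (2.279e8 km) this gives hyperbolic excess speeds of roughly 2.9 km/s at departure and 2.6 km/s at arrival; the departure and arrival phases then wrap these excess speeds in planet-centered hyperbolas.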

  8. Error coding simulations in C

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1994-01-01

    When data is transmitted through a noisy channel, errors are produced within the data, rendering it indecipherable. Through the use of error control coding techniques, the bit error rate can be reduced to any desired level without sacrificing the transmission data rate. The Astrionics Laboratory at Marshall Space Flight Center has decided to use a modular, end-to-end telemetry data simulator to simulate the transmission of data from flight to ground and various methods of error control. The simulator includes modules for random data generation, data compression, Consultative Committee for Space Data Systems (CCSDS) transfer frame formation, error correction/detection, error generation and error statistics. The simulator utilizes a concatenated coding scheme which includes CCSDS standard (255,223) Reed-Solomon (RS) code over GF(2(exp 8)) with interleave depth of 5 as the outermost code, (7, 1/2) convolutional code as an inner code and CCSDS recommended (n, n-16) cyclic redundancy check (CRC) code as the innermost code, where n is the number of information bits plus 16 parity bits. The signal-to-noise ratio required for a desired bit error rate is greatly reduced through the use of forward error correction techniques. Even greater coding gain is provided through the use of a concatenated coding scheme. Interleaving/deinterleaving is necessary to randomize burst errors which may appear at the input of the RS decoder. The burst correction capability length is increased in proportion to the interleave depth. The modular nature of the simulator allows for inclusion or exclusion of modules as needed. This paper describes the development and operation of the simulator, the verification of a C-language Reed-Solomon code, and the possibility of using Comdisco SPW(tm) as a tool for determining optimal error control schemes.
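
    The innermost (n, n-16) CRC in the concatenated scheme can be sketched as a bitwise CRC-16 using the CCITT polynomial. The 0xFFFF preset shown here is the common CCITT convention; an actual implementation should be checked against the CCSDS recommendation:

```python
def crc16_ccitt(data: bytes, poly=0x1021, init=0xFFFF):
    """Bitwise (n, n-16) cyclic redundancy check: divide the message,
    MSB first, by x^16 + x^12 + x^5 + 1 and keep the 16-bit remainder."""
    crc = init
    for byte in data:
        crc ^= byte << 8              # fold the next byte into the register
        for _ in range(8):
            if crc & 0x8000:          # top bit set: shift and subtract poly
                crc = ((crc << 1) ^ poly) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc
```

    Appending the resulting 16 parity bits makes the codeword divide evenly by the generator; as with any degree-16 generator with a nonzero constant term, every error burst of 16 or fewer bits is detected.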

  9. Error coding simulations in C

    NASA Astrophysics Data System (ADS)

    Noble, Viveca K.

    1994-10-01

    When data is transmitted through a noisy channel, errors are produced within the data, rendering it indecipherable. Through the use of error control coding techniques, the bit error rate can be reduced to any desired level without sacrificing the transmission data rate. The Astrionics Laboratory at Marshall Space Flight Center has decided to use a modular, end-to-end telemetry data simulator to simulate the transmission of data from flight to ground and various methods of error control. The simulator includes modules for random data generation, data compression, Consultative Committee for Space Data Systems (CCSDS) transfer frame formation, error correction/detection, error generation and error statistics. The simulator utilizes a concatenated coding scheme which includes the CCSDS standard (255,223) Reed-Solomon (RS) code over GF(2^8) with interleave depth of 5 as the outermost code, a (7, 1/2) convolutional code as an inner code and the CCSDS recommended (n, n-16) cyclic redundancy check (CRC) code as the innermost code, where n is the number of information bits plus 16 parity bits. The received signal-to-noise ratio required for a desired bit error rate is greatly reduced through the use of forward error correction techniques. Even greater coding gain is provided through the use of a concatenated coding scheme. Interleaving/deinterleaving is necessary to randomize burst errors which may appear at the input of the RS decoder. The burst-error correction capability is increased in proportion to the interleave depth. The modular nature of the simulator allows for inclusion or exclusion of modules as needed. This paper describes the development and operation of the simulator, the verification of a C-language Reed-Solomon code, and the possibility of using Comdisco SPW(tm) as a tool for determining optimal error control schemes.

  10. DANTSYS: A diffusion accelerated neutral particle transport code system

    SciTech Connect

    Alcouffe, R.E.; Baker, R.S.; Brinkley, F.W.; Marr, D.R.; O'Dell, R.D.; Walters, W.F.

    1995-06-01

    The DANTSYS code package includes the following transport codes: ONEDANT, TWODANT, TWODANT/GQ, TWOHEX, and THREEDANT. The DANTSYS code package is a modular computer program package designed to solve the time-independent, multigroup discrete ordinates form of the Boltzmann transport equation in several different geometries. The modular construction of the package separates the input processing, the transport equation solving, and the post processing (or edit) functions into distinct code modules: the Input Module, one or more Solver Modules, and the Edit Module, respectively. The Input and Edit Modules are very general in nature and are common to all the Solver Modules. The ONEDANT Solver Module contains a one-dimensional (slab, cylinder, and sphere), time-independent transport equation solver using the standard diamond-differencing method for space/angle discretization. Also included in the package are Solver Modules named TWODANT, TWODANT/GQ, THREEDANT, and TWOHEX. The TWODANT Solver Module solves the time-independent two-dimensional transport equation using the diamond-differencing method for space/angle discretization. The authors have also introduced an adaptive weighted diamond differencing (AWDD) method for the spatial and angular discretization into TWODANT as an option. The TWOHEX Solver Module solves the time-independent two-dimensional transport equation on an equilateral triangle spatial mesh. The THREEDANT Solver Module solves the time-independent, three-dimensional transport equation for XYZ and RZΘ symmetries using both diamond differencing with set-to-zero fixup and the AWDD method. The TWODANT/GQ Solver Module solves the 2-D transport equation in XY and RZ symmetries using a spatial mesh of arbitrary quadrilaterals. The spatial differencing method is based upon the diamond differencing method with set-to-zero fixup with changes to accommodate the generalized spatial meshing.
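To make the diamond-differencing idea concrete, here is a minimal one-dimensional sketch of a single-direction slab sweep with the set-to-zero fixup mentioned above. This is not DANTSYS source; the names and the single-direction, fixed-source setup are illustrative.

```c
#include <math.h>

/* One discrete-ordinates transport sweep across a 1-D slab for a single
   direction mu > 0, using diamond differencing in space.  sigma_t[i] is the
   total cross section of cell i, q[i] its source, psi_in the incoming
   angular flux at the left face.  Returns the flux exiting the right face;
   cell-average fluxes are written to psi_cell. */
double dd_sweep(int n, double dx, const double *sigma_t, const double *q,
                double mu, double psi_in, double *psi_cell) {
    for (int i = 0; i < n; i++) {
        double a = mu / dx, s = 0.5 * sigma_t[i];
        double psi_out = (q[i] + (a - s) * psi_in) / (a + s);
        if (psi_out < 0.0) {
            /* set-to-zero fixup: clamp the face flux, then recompute the
               cell average so that particle balance is preserved */
            psi_out = 0.0;
            psi_cell[i] = (q[i] + a * psi_in) / sigma_t[i];
        } else {
            psi_cell[i] = 0.5 * (psi_in + psi_out);  /* diamond relation */
        }
        psi_in = psi_out;      /* outflow of cell i feeds cell i+1 */
    }
    return psi_in;
}
```

For a pure absorber the sweep reproduces exponential attenuation to within the diamond truncation error.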

  11. Codon Distribution in Error-Detecting Circular Codes.

    PubMed

    Fimmel, Elena; Strüngmann, Lutz

    2016-03-15

    In 1957, Francis Crick et al. suggested an ingenious explanation for the process of frame maintenance. The idea was based on the notion of comma-free codes. Although Crick's hypothesis proved to be wrong, in 1996, Arquès and Michel discovered the existence of a weaker version of such codes in eukaryote and prokaryote genomes, namely the so-called circular codes. Since then, circular code theory has continually attracted great interest and made significant progress. In this article, the codon distributions in maximal comma-free, maximal self-complementary C³ and maximal self-complementary circular codes are discussed, i.e., we investigate in how many such codes a given codon participates. As the main (and surprising) result, it is shown that the codons can be separated into very few classes (three, or five, or six) with respect to their frequency. Moreover, the distribution classes can be hierarchically ordered as refinements from maximal comma-free codes via maximal self-complementary C³ codes to maximal self-complementary circular codes.
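Comma-freeness, the strongest of the properties discussed above, is easy to test directly: no codon of the set may appear in a shifted reading frame of any two-codon concatenation. A small C sketch (illustrative, not from the paper):

```c
#include <string.h>

/* Return 1 if the 3-letter word w belongs to the set X of n codons. */
static int in_set(const char *w, const char **X, int n) {
    for (int i = 0; i < n; i++)
        if (strncmp(w, X[i], 3) == 0) return 1;
    return 0;
}

/* Comma-free test: for every concatenation x1 x2 of codons from X, the two
   out-of-frame windows (offsets 1 and 2 of the 6-letter word) must not
   themselves be codons of X. */
int is_comma_free(const char **X, int n) {
    char pair[7];
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++) {
            memcpy(pair, X[i], 3);
            memcpy(pair + 3, X[j], 3);
            pair[6] = '\0';
            if (in_set(pair + 1, X, n) || in_set(pair + 2, X, n))
                return 0;
        }
    return 1;
}
```

The set {ACG} passes, while {AAA} fails because AAAAAA reads as AAA in every frame. Circularity proper is a weaker condition and requires a cycle check over an associated graph rather than this pairwise window test.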

  12. Codon Distribution in Error-Detecting Circular Codes.

    PubMed

    Fimmel, Elena; Strüngmann, Lutz

    2016-01-01

    In 1957, Francis Crick et al. suggested an ingenious explanation for the process of frame maintenance. The idea was based on the notion of comma-free codes. Although Crick's hypothesis proved to be wrong, in 1996, Arquès and Michel discovered the existence of a weaker version of such codes in eukaryote and prokaryote genomes, namely the so-called circular codes. Since then, circular code theory has continually attracted great interest and made significant progress. In this article, the codon distributions in maximal comma-free, maximal self-complementary C³ and maximal self-complementary circular codes are discussed, i.e., we investigate in how many such codes a given codon participates. As the main (and surprising) result, it is shown that the codons can be separated into very few classes (three, or five, or six) with respect to their frequency. Moreover, the distribution classes can be hierarchically ordered as refinements from maximal comma-free codes via maximal self-complementary C³ codes to maximal self-complementary circular codes. PMID:26999215

  13. Correlation approach to identify coding regions in DNA sequences

    NASA Technical Reports Server (NTRS)

    Ossadnik, S. M.; Buldyrev, S. V.; Goldberger, A. L.; Havlin, S.; Mantegna, R. N.; Peng, C. K.; Simons, M.; Stanley, H. E.

    1994-01-01

    Recently, it was observed that noncoding regions of DNA sequences possess long-range power-law correlations, whereas coding regions typically display only short-range correlations. We develop an algorithm based on this finding that enables investigators to perform a statistical analysis on long DNA sequences to locate possible coding regions. The algorithm is particularly successful in predicting the location of lengthy coding regions. For example, for the complete genome of yeast chromosome III (315,344 nucleotides), at least 82% of the predictions correspond to putative coding regions; the algorithm correctly identified all coding regions larger than 3000 nucleotides, 92% of coding regions between 2000 and 3000 nucleotides long, and 79% of coding regions between 1000 and 2000 nucleotides. The predictive ability of this new algorithm supports the claim that there is a fundamental difference in the correlation property between coding and noncoding sequences. This algorithm, which is not species-dependent, can be combined with other techniques to rapidly and accurately locate relatively long coding regions in genomic sequences.

  14. Codon Distribution in Error-Detecting Circular Codes

    PubMed Central

    Fimmel, Elena; Strüngmann, Lutz

    2016-01-01

    In 1957, Francis Crick et al. suggested an ingenious explanation for the process of frame maintenance. The idea was based on the notion of comma-free codes. Although Crick’s hypothesis proved to be wrong, in 1996, Arquès and Michel discovered the existence of a weaker version of such codes in eukaryote and prokaryote genomes, namely the so-called circular codes. Since then, circular code theory has continually attracted great interest and made significant progress. In this article, the codon distributions in maximal comma-free, maximal self-complementary C³ and maximal self-complementary circular codes are discussed, i.e., we investigate in how many such codes a given codon participates. As the main (and surprising) result, it is shown that the codons can be separated into very few classes (three, or five, or six) with respect to their frequency. Moreover, the distribution classes can be hierarchically ordered as refinements from maximal comma-free codes via maximal self-complementary C³ codes to maximal self-complementary circular codes. PMID:26999215

  15. Benchmarking NNWSI flow and transport codes: COVE 1 results

    SciTech Connect

    Hayden, N.K.

    1985-06-01

    The code verification (COVE) activity of the Nevada Nuclear Waste Storage Investigations (NNWSI) Project is the first step in certification of flow and transport codes used for NNWSI performance assessments of a geologic repository for disposing of high-level radioactive wastes. The goals of the COVE activity are (1) to demonstrate and compare the numerical accuracy and sensitivity of certain codes, (2) to identify and resolve problems in running typical NNWSI performance assessment calculations, and (3) to evaluate computer requirements for running the codes. This report describes the work done for COVE 1, the first step in benchmarking some of the codes. Isothermal calculations for the COVE 1 benchmarking have been completed using the hydrologic flow codes SAGUARO, TRUST, and GWVIP; the radionuclide transport codes FEMTRAN and TRUMP; and the coupled flow and transport code TRACR3D. This report presents the results of three cases of the benchmarking problem solved for COVE 1, a comparison of the results, questions raised regarding sensitivities to modeling techniques, and conclusions drawn regarding the status and numerical sensitivities of the codes. 30 refs.

  16. SCAMPI: A code package for cross-section processing

    SciTech Connect

    Parks, C.V.; Petrie, L.M.; Bowman, S.M.; Broadhead, B.L.; Greene, N.M.; White, J.E.

    1996-04-01

    The SCAMPI code package consists of a set of SCALE and AMPX modules that have been assembled to meet user needs for the preparation of problem-specific, multigroup cross-section libraries. The function of each module in the SCAMPI code package is discussed, along with illustrations of its use in practical analyses. Ideas are presented for future work that can enable one-step processing from a fine-group, problem-independent library to a broad-group, problem-specific library ready for a shielding analysis.
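The core operation in going from a fine-group to a broad-group library is a flux-weighted group collapse: each broad-group cross section is the reaction-rate-preserving average of the fine-group values. A hedged sketch (illustrative names, not SCAMPI's API):

```c
/* Collapse fine-group cross sections sigma[g] into nbroad broad groups.
   bounds has nbroad+1 entries; broad group G spans fine groups
   bounds[G] .. bounds[G+1]-1.  phi[g] is the weighting flux. */
void collapse_groups(int nbroad, const int *bounds,
                     const double *sigma, const double *phi,
                     double *sigma_broad) {
    for (int G = 0; G < nbroad; G++) {
        double num = 0.0, den = 0.0;
        for (int g = bounds[G]; g < bounds[G + 1]; g++) {
            num += sigma[g] * phi[g];   /* preserves the reaction rate */
            den += phi[g];
        }
        sigma_broad[G] = num / den;     /* flux-weighted average */
    }
}
```

With a flat weighting flux the broad-group value reduces to the plain mean of the fine groups it spans; a problem-specific flux shifts it toward the groups where most reactions actually occur.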

  17. Progress in cultivation-independent phyllosphere microbiology.

    PubMed

    Müller, Thomas; Ruppel, Silke

    2014-01-01

    Most microorganisms of the phyllosphere are nonculturable in commonly used media and culture conditions, as are those in other natural environments. This review queries the reasons for their 'noncultivability' and assesses developments in phyllosphere microbiology that have been achieved cultivation-independently over the last 4 years. Analyses of total microbial communities have revealed a comprehensive microbial diversity. 16S rRNA gene amplicon sequencing and metagenomic sequencing were applied to investigate plant species, location and season as variables affecting the composition of these communities. In continuation of culture-based enzymatic and metabolic studies with individual isolates, metaproteogenomic approaches reveal a great potential to study the physiology of microbial communities in situ. Culture-independent microbiological technologies as well as advances in plant genetics and biochemistry provide methodological preconditions for exploring the interactions between plants and their microbiome in the phyllosphere. Improving and combining cultivation and culture-independent techniques can contribute to a better understanding of phyllosphere ecology. This is essential, for example, to avoid human-pathogenic bacteria in plant food.

  18. Progress in cultivation-independent phyllosphere microbiology

    PubMed Central

    Müller, Thomas; Ruppel, Silke

    2014-01-01

    Most microorganisms of the phyllosphere are nonculturable in commonly used media and culture conditions, as are those in other natural environments. This review queries the reasons for their ‘noncultivability’ and assesses developments in phyllosphere microbiology that have been achieved cultivation-independently over the last 4 years. Analyses of total microbial communities have revealed a comprehensive microbial diversity. 16S rRNA gene amplicon sequencing and metagenomic sequencing were applied to investigate plant species, location and season as variables affecting the composition of these communities. In continuation of culture-based enzymatic and metabolic studies with individual isolates, metaproteogenomic approaches reveal a great potential to study the physiology of microbial communities in situ. Culture-independent microbiological technologies as well as advances in plant genetics and biochemistry provide methodological preconditions for exploring the interactions between plants and their microbiome in the phyllosphere. Improving and combining cultivation and culture-independent techniques can contribute to a better understanding of phyllosphere ecology. This is essential, for example, to avoid human-pathogenic bacteria in plant food. PMID:24003903

  19. Experimental measurement-device-independent entanglement detection.

    PubMed

    Nawareg, Mohamed; Muhammad, Sadiq; Amselem, Elias; Bourennane, Mohamed

    2015-01-01

    Entanglement is one of the most puzzling features of quantum theory and of great importance for the new field of quantum information. Determining whether a given state is entangled is one of the most challenging open problems of the field. Here we report on the experimental demonstration of measurement-device-independent (MDI) entanglement detection using the witness method for general two-qubit photon polarization systems. In the MDI setting, there is no need to assume perfect implementations or to trust the measurement devices. This experimental demonstration can be generalized for the investigation of properties of quantum systems and for the realization of cryptography and communication protocols. PMID:25649664
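The witness method itself is easy to state: an observable W with Tr(Wρ) ≥ 0 on every separable state but negative on some entangled state. For the Bell state |Φ+⟩ the standard projector witness W = I/2 − |Φ+⟩⟨Φ+| works; the sketch below evaluates ⟨ψ|W|ψ⟩ for real amplitude vectors. This illustrates the plain witness, not the measurement-device-independent protocol of the paper, and the function name is illustrative.

```c
#include <math.h>

/* <psi|W|psi> for W = I/2 - |Phi+><Phi+|, with real amplitudes ordered
   (|00>, |01>, |10>, |11>).  A negative value certifies entanglement. */
double witness_value(const double psi[4]) {
    double overlap = (psi[0] + psi[3]) / sqrt(2.0);  /* <Phi+|psi> */
    return 0.5 - overlap * overlap;
}
```

The Bell state itself gives -1/2, while the product state |00⟩ gives exactly 0, the boundary value attained over separable states.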

  20. Independent evolution of four heme peroxidase superfamilies.

    PubMed

    Zámocký, Marcel; Hofbauer, Stefan; Schaffner, Irene; Gasselhuber, Bernhard; Nicolussi, Andrea; Soudi, Monika; Pirker, Katharina F; Furtmüller, Paul G; Obinger, Christian

    2015-05-15

    Four heme peroxidase superfamilies (the peroxidase-catalase, peroxidase-cyclooxygenase, peroxidase-chlorite dismutase and peroxidase-peroxygenase superfamilies), which differ in overall fold, active site architecture and enzymatic activities, arose independently during evolution. The redox cofactor is heme b or posttranslationally modified heme that is ligated by either histidine or cysteine. Heme peroxidases are found in all kingdoms of life and typically catalyze the one- and two-electron oxidation of a myriad of organic and inorganic substrates. In addition to this peroxidatic activity, distinct (sub)families show pronounced catalase, cyclooxygenase, chlorite dismutase or peroxygenase activities. Here we describe the phylogeny of these four superfamilies and present the most important sequence signatures and active site architectures. The classification of families is described, as well as important turning points in evolution. We show that at least three heme peroxidase superfamilies have ancient prokaryotic roots with several alternative ways of divergent evolution. In later evolutionary steps, they almost always produced highly evolved and specialized clades of peroxidases in eukaryotic kingdoms, with a significant portion of such genes coding for various fusion proteins with novel physiological functions.