Science.gov

Sample records for investigators independently coded

  1. The design of relatively machine-independent code generators

    NASA Technical Reports Server (NTRS)

    Noonan, R. E.

    1979-01-01

    Two complementary approaches were investigated. In the first approach, software design techniques were used to design the structure of a code generator for Halmat. The major result was the development of an intermediate code form known as 7UP. The second approach viewed the problem as one of providing a tool to the code generator programmer. The major result was the development of a non-procedural, problem-oriented language known as CGGL (Code Generator Generator Language).

  2. Independent rate and temporal coding in hippocampal pyramidal cells

    PubMed Central

    Huxter, John; Burgess, Neil; O’Keefe, John

    2009-01-01

    Hippocampal pyramidal cells use temporal [1] as well as rate coding [2] to signal spatial aspects of the animal’s environment or behaviour. The temporal code takes the form of a phase relationship to the concurrent cycle of the hippocampal EEG theta rhythm (Figure 1; [1]). These two codes could each represent a different variable [3,4]. However, this requires that rate and phase can vary independently, in contrast to recent suggestions [5,6] that they are tightly coupled, both reflecting the amplitude of the cell’s input. Here we show that the time of firing and firing rate are dissociable and can represent two independent variables: the animal’s location within the place field and its speed of movement through the field, respectively. Independent encoding of location together with actions and stimuli occurring there may help to explain the dual roles of the hippocampus in spatial and episodic memory [7,8], or a more general role in relational/declarative memory [9,10]. PMID:14574410
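    The dissociation described above can be sketched in a few lines (hypothetical names, not the authors' analysis code): from the same spike train, the rate code is a spike count per unit time, while the temporal code is a circular mean of theta firing phases, and the two quantities can vary independently.

```python
import math

def rate_and_phase(spike_phases_rad, pass_duration_s):
    """Summarize one pass through a place field by two separable
    quantities: mean firing rate and mean theta firing phase.
    spike_phases_rad: theta phase (radians) at each spike time."""
    n = len(spike_phases_rad)
    rate = n / pass_duration_s  # rate code: spikes per second
    # temporal code: circular mean of the firing phases
    sx = sum(math.cos(p) for p in spike_phases_rad)
    sy = sum(math.sin(p) for p in spike_phases_rad)
    mean_phase = math.atan2(sy, sx) % (2 * math.pi)
    return rate, mean_phase

# Two passes with identical firing rates but different mean phases:
r1, p1 = rate_and_phase([0.5, 0.6, 0.4, 0.5], pass_duration_s=2.0)
r2, p2 = rate_and_phase([2.5, 2.6, 2.4, 2.5], pass_duration_s=2.0)
# r1 == r2 == 2.0 spikes/s, yet p1 ~ 0.5 rad and p2 ~ 2.5 rad
```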

  3. Benchmark testing and independent verification of the VS2DT computer code

    SciTech Connect

    McCord, J.T.; Goodrich, M.T.

    1994-11-01

    The finite difference flow and transport simulator VS2DT was benchmark tested against several other codes which solve the same equations (Richards equation for flow and the Advection-Dispersion equation for transport). The benchmark problems investigated transient two-dimensional flow in a heterogeneous soil profile with a localized water source at the ground surface. The VS2DT code performed as well as or better than all other codes when considering mass balance characteristics and computational speed. It was also rated highly relative to the other codes with regard to ease-of-use. Following the benchmark study, the code was verified against two analytical solutions, one for two-dimensional flow and one for two-dimensional transport. These independent verifications show reasonable agreement with the analytical solutions, and complement the one-dimensional verification problems published in the code's original documentation.
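    The mass-balance characteristics mentioned above are typically summarized as a percent error between what enters, leaves, and is stored in the model domain. A minimal sketch with hypothetical volumes (not taken from the report):

```python
def mass_balance_error(inflow, outflow, storage_initial, storage_final):
    """Percent mass-balance error for one simulation period:
    (inputs - outputs - change in storage) relative to inputs."""
    residual = inflow - outflow - (storage_final - storage_initial)
    return 100.0 * residual / inflow

# Hypothetical water volumes (m^3) from a transient infiltration run:
err = mass_balance_error(inflow=120.0, outflow=30.0,
                         storage_initial=500.0, storage_final=589.4)
# 120 in, 30 out, 89.4 stored -> 0.6 m^3 unaccounted, i.e. 0.5% error
```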

  4. Benchmark testing and independent verification of the VS2DT computer code

    NASA Astrophysics Data System (ADS)

    McCord, James T.; Goodrich, Michael T.

    1994-11-01

    The finite difference flow and transport simulator VS2DT was benchmark tested against several other codes which solve the same equations (Richards equation for flow and the Advection-Dispersion equation for transport). The benchmark problems investigated transient two-dimensional flow in a heterogeneous soil profile with a localized water source at the ground surface. The VS2DT code performed as well as or better than all other codes when considering mass balance characteristics and computational speed. It was also rated highly relative to the other codes with regard to ease-of-use. Following the benchmark study, the code was verified against two analytical solutions, one for two-dimensional flow and one for two-dimensional transport. These independent verifications show reasonable agreement with the analytical solutions, and complement the one-dimensional verification problems published in the code's original documentation.

  5. Independent coding of wind direction in cockroach giant interneurons.

    PubMed

    Mizrahi, A; Libersat, F

    1997-11-01

    Independent coding of wind direction in cockroach giant interneurons. J. Neurophysiol. 78: 2655-2661, 1997. In this study we examined the possible role of cell-to-cell interactions in the localization processing of a wind stimulus by the cockroach cercal system. Such sensory processing is performed primarily by pairs of giant interneurons (GIs), a group of highly directional cells. We have studied possible interactions among these GIs by comparing the wind sensitivity of a given GI before and after removing another GI with the use of photoablation. Testing various combinations of GI pairs did not reveal any suprathreshold interactions. This was true for all unilateral GI pairs on the left or right side as well as all the bilateral GI pairs (left and right homologues). Those experiments in which we were able to measure synaptic activity did not reveal subthreshold interactions between the GIs either. We conclude that the GIs code independently for a given wind direction without local GI-GI interactions. We discuss the possible implications of the absence of local interactions on information transfer in the first station of the escape circuit.

  6. Independent accident investigation: a modern safety tool.

    PubMed

    Stoop, John A

    2004-07-26

    Historically, safety has been subjected to a fragmented approach. In the past, every department has had its own responsibility towards safety, focusing either on working conditions, internal safety, external safety, rescue and emergency, public order or security. They each issued policy documents, which in their time were leading statements for elaboration and regulation. They also addressed safety issues with tools of varying nature, often specifically developed within their domain. Due to a series of major accidents and disasters, the focus of attention is shifting from complying with quantitative risk standards towards intervention in primary operational processes, coping with systemic deficiencies and a more integrated assessment of safety in its societal context. In The Netherlands, recognition of the importance of independent investigations has led to an expansion of this philosophy from the transport sector to other sectors. The philosophy now covers transport, industry, defense, natural disasters, environment and health, and other major occurrences such as explosions, fires, and collapse of buildings or structures. In 2003 a law covering multiple sectors will establish an independent safety board in The Netherlands. At a European level, mandatory investigation agencies are recognized as indispensable safety instruments for aviation, railways and the maritime sector, for which EU Directives are in place or being progressed [Transport accident and incident investigation in the European Union, European Transport Safety Council, ISBN 90-76024-10-3, Brussel, 2001]. Due to a series of major events, attention has been drawn to the consequences of disasters, highlighting the involvement of rescue and emergency services. They also have become subjected to investigative efforts, which, in turn, puts demands on investigation methodology.
This paper comments on an evolutionary development in safety thinking and of safety boards, highlighting some consequences for strategic

  7. Genetic code deviations in the ciliates: evidence for multiple and independent events.

    PubMed Central

    Tourancheau, A B; Tsao, N; Klobutcher, L A; Pearlman, R E; Adoutte, A

    1995-01-01

    In several species of ciliates, the universal stop codons UAA and UAG are translated into glutamine, while in the euplotids, the glutamine codon usage is normal, but UGA appears to be translated as cysteine. Because the emerging position of this monophyletic group in the eukaryotic lineage is relatively late, this deviant genetic code represents a derived state of the universal code. The question is therefore raised as to how these changes arose within the evolutionary pathways of the phylum. Here, we have investigated the presence of stop codons in alpha tubulin and/or phosphoglycerate kinase gene coding sequences from diverse species of ciliates scattered over the phylogenetic tree constructed from 28S rRNA sequences. In our data set, when deviations occur they correspond to in frame UAA and UAG coding for glutamine. By combining these new data with those previously reported, we show that (i) utilization of UAA and UAG codons occurs to different extents between, but also within, the different classes of ciliates and (ii) the resulting phylogenetic pattern of deviations from the universal code cannot be accounted for by a scenario involving a single transition to the unusual code. Thus, contrary to expectations, deviations from the universal genetic code have arisen independently several times within the phylum. PMID:7621837
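    The UAA/UAG-to-glutamine reassignment discussed above is easy to make concrete. The sketch below uses a deliberately tiny codon table (only the codons needed for the example; illustrative, not from the paper):

```python
# Translate the same reading frame under the universal code and under
# the ciliate variant in which UAA/UAG encode glutamine (Gln, Q).
UNIVERSAL = {"AUG": "M", "CAA": "Q", "UAA": "*", "UAG": "*",
             "UGA": "*", "GGC": "G"}
CILIATE = dict(UNIVERSAL, UAA="Q", UAG="Q")  # UAA/UAG -> Gln

def translate(mrna, table):
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        aa = table[mrna[i:i + 3]]
        if aa == "*":          # a stop codon terminates translation
            break
        protein.append(aa)
    return "".join(protein)

seq = "AUGGGCUAAGGC"
translate(seq, UNIVERSAL)  # 'MG'   (UAA read as stop)
translate(seq, CILIATE)    # 'MGQG' (UAA read as glutamine)
```

An in-frame UAA that truncates the protein under the universal code is read through as glutamine under the variant, which is exactly the signal the authors looked for in the tubulin and phosphoglycerate kinase sequences.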

  8. Investigating the Simulink Auto-Coding Process

    NASA Technical Reports Server (NTRS)

    Gualdoni, Matthew J.

    2016-01-01

    the program; additionally, this is lost time that could be spent testing and analyzing the code. This is one of the more prominent issues with the auto-coding process, and while much information is available with regard to optimizing Simulink designs to produce efficient and reliable C++ code, not much research has been made public on how to reduce the code generation time. It is of interest to develop some insight as to what causes code generation times to be so significant, and determine if there are architecture guidelines or a desirable auto-coding configuration set to assist in streamlining this step of the design process for particular applications. To address the issue at hand, the Simulink coder was studied at a foundational level. For each different component type made available by the software, the features, auto-code generation time, and the format of the generated code were analyzed and documented. Tools were developed and documented to expedite these studies, particularly in the area of automating sequential builds to ensure accurate data was obtained. Next, the Ramses model was examined in an attempt to determine the composition and the types of technologies used in the model. This enabled the development of a model that uses similar technologies, but takes a fraction of the time to auto-code to reduce the turnaround time for experimentation. Lastly, the model was used to run a wide array of experiments and collect data to obtain knowledge about where to search for bottlenecks in the Ramses model. 
The resulting contributions of the overall effort consist of an experimental model for further investigation into the subject, as well as several automation tools to assist in analyzing the model, and a reference document offering insight to the auto-coding process, including documentation of the tools used in the model analysis, data illustrating some potential problem areas in the auto-coding process, and recommendations on areas or practices in the current

  9. Implementation of context independent code on a new array processor: The Super-65

    NASA Technical Reports Server (NTRS)

    Colbert, R. O.; Bowhill, S. A.

    1981-01-01

    The feasibility of rewriting standard uniprocessor programs into code which contains no context-dependent branches is explored. Context independent code (CIC) would contain no branches that might require different processing elements to branch different ways. In order to investigate the possibilities and restrictions of CIC, several programs were recoded into CIC and a four-element array processor was built. This processor (the Super-65) consisted of three 6502 microprocessors and the Apple II microcomputer. The results obtained were somewhat dependent upon the specific architecture of the Super-65, but within bounds the throughput of the array processor was found to increase linearly with the number of processing elements (PEs). The slope of throughput versus PEs is highly dependent on the program and varied from 0.33 to 1.00 for the sample programs.
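    The idea of context-independent code can be illustrated with a toy example (not taken from the Super-65 programs): a data-dependent branch is replaced by arithmetic on the comparison result, so every processing element executes the identical instruction sequence regardless of its data.

```python
def clip_branching(x, limit):
    if x > limit:          # context-dependent branch: PEs may diverge
        return limit
    return x

def clip_branchless(x, limit):
    """Same result with no branch: the comparison outcome becomes an
    arithmetic mask, so all PEs run the same instruction stream."""
    over = int(x > limit)                  # 1 if clipping needed, else 0
    return over * limit + (1 - over) * x   # select limit or x

# Both versions agree everywhere:
assert all(clip_branching(x, 10) == clip_branchless(x, 10)
           for x in range(-5, 20))
```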

  10. Two independent transcription initiation codes overlap on vertebrate core promoters.

    PubMed

    Haberle, Vanja; Li, Nan; Hadzhiev, Yavor; Plessy, Charles; Previti, Christopher; Nepal, Chirag; Gehrig, Jochen; Dong, Xianjun; Akalin, Altuna; Suzuki, Ana Maria; van IJcken, Wilfred F J; Armant, Olivier; Ferg, Marco; Strähle, Uwe; Carninci, Piero; Müller, Ferenc; Lenhard, Boris

    2014-03-20

    A core promoter is a stretch of DNA surrounding the transcription start site (TSS) that integrates regulatory inputs and recruits general transcription factors to initiate transcription. The nature and causative relationship of the DNA sequence and chromatin signals that govern the selection of most TSSs by RNA polymerase II remain unresolved. Maternal to zygotic transition represents the most marked change of the transcriptome repertoire in the vertebrate life cycle. Early embryonic development in zebrafish is characterized by a series of transcriptionally silent cell cycles regulated by inherited maternal gene products: zygotic genome activation commences at the tenth cell cycle, marking the mid-blastula transition. This transition provides a unique opportunity to study the rules of TSS selection and the hierarchy of events linking transcription initiation with key chromatin modifications. We analysed TSS usage during zebrafish early embryonic development at high resolution using cap analysis of gene expression, and determined the positions of H3K4me3-marked promoter-associated nucleosomes. Here we show that the transition from the maternal to zygotic transcriptome is characterized by a switch between two fundamentally different modes of defining transcription initiation, which drive the dynamic change of TSS usage and promoter shape. A maternal-specific TSS selection, which requires an A/T-rich (W-box) motif, is replaced with a zygotic TSS selection grammar characterized by broader patterns of dinucleotide enrichments, precisely aligned with the first downstream (+1) nucleosome. The developmental dynamics of the H3K4me3-marked nucleosomes reveal their DNA-sequence-associated positioning at promoters before zygotic transcription and subsequent transcription-independent adjustment to the final position downstream of the zygotic TSS. The two TSS-defining grammars coexist, often physically overlapping, in core promoters of constitutively expressed genes to enable
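    The dinucleotide-enrichment analysis described above can be sketched as a simple positional count over TSS-aligned promoter sequences (toy data and a hypothetical alignment convention, not the paper's CAGE pipeline):

```python
from collections import Counter

def dinucleotide_profile(promoters, offset):
    """Count dinucleotides at a fixed position relative to the TSS.
    promoters: equal-length sequences aligned on the TSS, which sits
    at index len(seq) // 2 (an arbitrary convention for this sketch)."""
    counts = Counter()
    for seq in promoters:
        tss = len(seq) // 2
        counts[seq[tss + offset : tss + offset + 2]] += 1
    return counts

# Toy aligned promoters (TSS at index 4): a W-box-like A/T-rich
# signature just upstream shows up as enrichment at offset -2.
seqs = ["GGTATACC", "CCTAGGGG", "GGTTACGT"]
profile = dinucleotide_profile(seqs, offset=-2)
# Counter({'TA': 2, 'TT': 1})
```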

  11. Two independent transcription initiation codes overlap on vertebrate core promoters

    NASA Astrophysics Data System (ADS)

    Haberle, Vanja; Li, Nan; Hadzhiev, Yavor; Plessy, Charles; Previti, Christopher; Nepal, Chirag; Gehrig, Jochen; Dong, Xianjun; Akalin, Altuna; Suzuki, Ana Maria; van Ijcken, Wilfred F. J.; Armant, Olivier; Ferg, Marco; Strähle, Uwe; Carninci, Piero; Müller, Ferenc; Lenhard, Boris

    2014-03-01

    A core promoter is a stretch of DNA surrounding the transcription start site (TSS) that integrates regulatory inputs and recruits general transcription factors to initiate transcription. The nature and causative relationship of the DNA sequence and chromatin signals that govern the selection of most TSSs by RNA polymerase II remain unresolved. Maternal to zygotic transition represents the most marked change of the transcriptome repertoire in the vertebrate life cycle. Early embryonic development in zebrafish is characterized by a series of transcriptionally silent cell cycles regulated by inherited maternal gene products: zygotic genome activation commences at the tenth cell cycle, marking the mid-blastula transition. This transition provides a unique opportunity to study the rules of TSS selection and the hierarchy of events linking transcription initiation with key chromatin modifications. We analysed TSS usage during zebrafish early embryonic development at high resolution using cap analysis of gene expression, and determined the positions of H3K4me3-marked promoter-associated nucleosomes. Here we show that the transition from the maternal to zygotic transcriptome is characterized by a switch between two fundamentally different modes of defining transcription initiation, which drive the dynamic change of TSS usage and promoter shape. A maternal-specific TSS selection, which requires an A/T-rich (W-box) motif, is replaced with a zygotic TSS selection grammar characterized by broader patterns of dinucleotide enrichments, precisely aligned with the first downstream (+1) nucleosome. The developmental dynamics of the H3K4me3-marked nucleosomes reveal their DNA-sequence-associated positioning at promoters before zygotic transcription and subsequent transcription-independent adjustment to the final position downstream of the zygotic TSS. The two TSS-defining grammars coexist, often physically overlapping, in core promoters of constitutively expressed genes to enable

  12. Independent coding of absolute duration and distance magnitudes in the prefrontal cortex.

    PubMed

    Marcos, Encarni; Tsujimoto, Satoshi; Genovesio, Aldo

    2017-01-01

    The estimation of space and time can interfere with each other, and neuroimaging studies have shown overlapping activation in the parietal and prefrontal cortical areas. We used duration and distance discrimination tasks to determine whether space and time share resources in prefrontal cortex (PF) neurons. Monkeys were required to report which of two sequentially presented stimuli, a red circle or a blue square, was longer (duration task) or farther (distance task). In a previous study, we showed that relative duration and distance are coded by different populations of neurons and that the only common representation is related to goal coding. Here, we examined the coding of absolute duration and distance. Our results support a model of independent coding of absolute duration and distance metrics by demonstrating that not only relative magnitude but also absolute magnitude is independently coded in the PF.

  13. An investigation of error characteristics and coding performance

    NASA Technical Reports Server (NTRS)

    Ebel, William J.; Ingels, Frank M.

    1993-01-01

    The first year's effort on NASA Grant NAG5-2006 was an investigation to characterize typical errors resulting from the EOS downlink. The analysis methods developed for this effort were used on test data from a March 1992 White Sands Terminal Test. The effectiveness of a concatenated coding scheme of a Reed-Solomon outer code and a convolutional inner code versus a Reed-Solomon-only scheme has been investigated, as has the effectiveness of a Periodic Convolutional Interleaver in dispersing errors of certain types. The work effort consisted of development of software that allows simulation studies with the appropriate coding schemes plus either simulated data with errors or actual data with errors. The software program is entitled Communication Link Error Analysis (CLEAN) and models downlink errors, forward error correcting schemes, and interleavers.
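    The error-dispersing role of an interleaver can be illustrated with a simple block interleaver (a simplification; the study used a Periodic Convolutional Interleaver, whose structure differs): a burst of channel errors becomes isolated symbol errors after de-interleaving, which suits a Reed-Solomon outer code.

```python
def block_interleave(symbols, rows, cols):
    """Write row-by-row, read column-by-column."""
    assert len(symbols) == rows * cols
    grid = [symbols[r * cols:(r + 1) * cols] for r in range(rows)]
    return [grid[r][c] for c in range(cols) for r in range(rows)]

def block_deinterleave(symbols, rows, cols):
    # The inverse operation is interleaving with swapped dimensions.
    return block_interleave(symbols, cols, rows)

data = list(range(12))
sent = block_interleave(data, 3, 4)
# A 3-symbol burst corrupts consecutive transmitted symbols...
corrupted = sent[:4] + ["X"] * 3 + sent[7:]
received = block_deinterleave(corrupted, 3, 4)
# ...but the errors land far apart after de-interleaving:
# [0, 1, 'X', 3, 4, 'X', 6, 7, 8, 'X', 10, 11]
```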

  14. Independent verification and validation testing of the FLASH computer code, Version 3.0

    SciTech Connect

    Martian, P.; Chung, J.N. (Dept. of Mechanical and Materials Engineering)

    1992-06-01

    Independent testing of the FLASH computer code, Version 3.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification tests and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to test: correctness of the FORTRAN coding, computational accuracy, and suitability for simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: blind testing, independent applications, and graduated difficulty of test cases. Both quantitative and qualitative testing was performed by evaluating relative root-mean-square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and correctness of the FORTRAN coding, and three validation tests were used to check the suitability for simulating actual conditions. These test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions, and the validation tests showed good qualitative agreement with the experimental data. Based on these results, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. All aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies.
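    The relative root-mean-square comparison mentioned above can be sketched as follows (one common normalization convention; the report may define the metric differently):

```python
import math

def relative_rms(numerical, analytical):
    """Relative root-mean-square difference between a code's output
    and an analytical solution, normalized by the RMS of the
    analytical values."""
    sq_err = sum((n - a) ** 2 for n, a in zip(numerical, analytical))
    sq_ref = sum(a ** 2 for a in analytical)
    return math.sqrt(sq_err / sq_ref)

# Hypothetical simulated vs. analytical heads at four observation points:
model = [0.98, 1.96, 3.05, 4.02]
exact = [1.0, 2.0, 3.0, 4.0]
relative_rms(model, exact)  # a small value -> close quantitative agreement
```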

  15. Independent validation testing of the FLAME computer code, Version 1.0

    SciTech Connect

    Martian, P.; Chung, J.N. (Dept. of Mechanical and Materials Engineering)

    1992-07-01

    Independent testing of the FLAME computer code, Version 1.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Validation tests (i.e., tests which compare field data to the computer-generated solutions) were used to determine the operational status of the FLAME computer code and were done on a qualitative basis through graphical comparisons of the experimental and numerical data. These tests were specifically designed to check: (1) correctness of the FORTRAN coding, (2) computational accuracy, and (3) suitability for simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: (1) independent applications, and (2) graduated difficulty of test cases. Three tests were used, ranging in complexity from simple one-dimensional steady-state flow-field problems under near-saturated conditions to two-dimensional transient flow problems with very dry initial conditions.

  16. Are interaural time and level differences represented by independent or integrated codes in the human auditory cortex?

    PubMed

    Edmonds, Barrie A; Krumbholz, Katrin

    2014-02-01

    Sound localization is important for orienting and focusing attention and for segregating sounds from different sources in the environment. In humans, horizontal sound localization mainly relies on interaural differences in sound arrival time and sound level. Despite their perceptual importance, the neural processing of interaural time and level differences (ITDs and ILDs) remains poorly understood. Animal studies suggest that, in the brainstem, ITDs and ILDs are processed independently by different specialized circuits. The aim of the current study was to investigate whether, at higher processing levels, they remain independent or are integrated into a common code of sound laterality. For that, we measured late auditory cortical potentials in response to changes in sound lateralization elicited by perceptually matched changes in ITD and/or ILD. The responses to the ITD and ILD changes exhibited significant morphological differences. At the same time, however, they originated from overlapping areas of the cortex and showed clear evidence for functional coupling. These results suggest that the auditory cortex contains an integrated code of sound laterality, but also retains independent information about ITD and ILD cues. This cue-related information might be used to assess how consistent the cues are, and thus, how likely they would have arisen from the same source.
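    The two cues themselves are easy to state computationally: ITD is a best-aligning lag between the two ear signals, ILD a level ratio in decibels. A minimal sketch (illustrative, not the stimulus-generation code used in the study):

```python
import math

def itd_samples(left, right, max_lag):
    """Interaural time difference as the lag (in samples) maximizing
    cross-correlation; positive means the sound reached the left ear
    first."""
    def xcorr(lag):
        if lag >= 0:
            pairs = zip(left, right[lag:])
        else:
            pairs = zip(left[-lag:], right)
        return sum(l * r for l, r in pairs)
    return max(range(-max_lag, max_lag + 1), key=xcorr)

def ild_db(left, right):
    """Interaural level difference as an RMS ratio in decibels."""
    rms = lambda s: math.sqrt(sum(x * x for x in s) / len(s))
    return 20 * math.log10(rms(left) / rms(right))

# The same short waveform, arriving two samples later at the right ear:
left = [0, 0, 1, 2, 1, 0, 0, 0, 0, 0]
right = [0, 0, 0, 0, 1, 2, 1, 0, 0, 0]
itd_samples(left, right, 4)          # 2 (left ear leads)
ild_db([2.0, 4.0, 2.0], [1.0, 2.0, 1.0])  # ~6 dB louder on the left
```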

  17. Proof-of-principle experiment of reference-frame-independent quantum key distribution with phase coding

    PubMed Central

    Liang, Wen-Ye; Wang, Shuang; Li, Hong-Wei; Yin, Zhen-Qiang; Chen, Wei; Yao, Yao; Huang, Jing-Zheng; Guo, Guang-Can; Han, Zheng-Fu

    2014-01-01

    We have demonstrated a proof-of-principle experiment of reference-frame-independent phase coding quantum key distribution (RFI-QKD) over an 80-km optical fiber. After considering the finite-key bound, we still achieve a distance of 50 km. In this scenario, the phases of the basis states are related by a slowly time-varying transformation. Furthermore, we developed and realized a new decoy state method for RFI-QKD systems with weak coherent sources to counteract the photon-number-splitting attack. With the help of a reference-frame-independent protocol and a Michelson interferometer with Faraday rotator mirrors, our system is rendered immune to the slow phase changes of the interferometer and the polarization disturbances of the channel, making the procedure very robust. PMID:24402550

  18. Proof-of-principle experiment of reference-frame-independent quantum key distribution with phase coding.

    PubMed

    Liang, Wen-Ye; Wang, Shuang; Li, Hong-Wei; Yin, Zhen-Qiang; Chen, Wei; Yao, Yao; Huang, Jing-Zheng; Guo, Guang-Can; Han, Zheng-Fu

    2014-01-09

    We have demonstrated a proof-of-principle experiment of reference-frame-independent phase coding quantum key distribution (RFI-QKD) over an 80-km optical fiber. After considering the finite-key bound, we still achieve a distance of 50 km. In this scenario, the phases of the basis states are related by a slowly time-varying transformation. Furthermore, we developed and realized a new decoy state method for RFI-QKD systems with weak coherent sources to counteract the photon-number-splitting attack. With the help of a reference-frame-independent protocol and a Michelson interferometer with Faraday rotator mirrors, our system is rendered immune to the slow phase changes of the interferometer and the polarization disturbances of the channel, making the procedure very robust.

  19. Proof-of-principle experiment of reference-frame-independent quantum key distribution with phase coding

    NASA Astrophysics Data System (ADS)

    Liang, Wen-Ye; Wang, Shuang; Li, Hong-Wei; Yin, Zhen-Qiang; Chen, Wei; Yao, Yao; Huang, Jing-Zheng; Guo, Guang-Can; Han, Zheng-Fu

    2014-01-01

    We have demonstrated a proof-of-principle experiment of reference-frame-independent phase coding quantum key distribution (RFI-QKD) over an 80-km optical fiber. After considering the finite-key bound, we still achieve a distance of 50 km. In this scenario, the phases of the basis states are related by a slowly time-varying transformation. Furthermore, we developed and realized a new decoy state method for RFI-QKD systems with weak coherent sources to counteract the photon-number-splitting attack. With the help of a reference-frame-independent protocol and a Michelson interferometer with Faraday rotator mirrors, our system is rendered immune to the slow phase changes of the interferometer and the polarization disturbances of the channel, making the procedure very robust.

  20. The investigation of bandwidth efficient coding and modulation techniques

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The New Mexico State University Center for Space Telemetering and Telecommunications systems has been, and is currently, engaged in the investigation of trellis-coded modulation (TCM) communication systems. In particular, TCM utilizing M-ary phase shift keying is being studied. The study of carrier synchronization in a TCM environment, or in MPSK systems in general, has been one of the two main thrusts of this grant. This study has involved both theoretical modelling and software simulation of the carrier synchronization problem.

  1. A 2.9 ps equivalent resolution interpolating time counter based on multiple independent coding lines

    NASA Astrophysics Data System (ADS)

    Szplet, R.; Jachna, Z.; Kwiatkowski, P.; Rozyc, K.

    2013-03-01

    We present the design, operation and test results of a time counter that has an equivalent resolution of 2.9 ps, a measurement uncertainty at the level of 6 ps, and a measurement range of 10 s. The time counter has been implemented in a general-purpose reprogrammable device (Xilinx Spartan-6). To obtain both high precision and wide measurement range, the counting of periods of a reference clock is combined with a two-stage interpolation within a single period of the clock signal. The interpolation involves a four-phase clock in the first interpolation stage (FIS) and an equivalent coding line (ECL) in the second interpolation stage (SIS). The ECL is created as a compound of independent discrete time coding lines (TCL). The number of TCLs used to create the virtual ECL has an effect on its resolution. We tested ECLs made from up to 16 TCLs, but the idea may be extended to a larger number of lines. In the presented time counter the coarse resolution of the counting method, equal to 2 ns (period of the 500 MHz reference clock), is first improved fourfold in the FIS and then by more than a further factor of 400 in the SIS. The proposed solution allows us to overcome the technological limitation in achievable resolution and improve the precision of conversion of integrated interpolators based on tapped delay lines.
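    The way several independent coding lines combine into a finer equivalent line can be sketched numerically (a toy model with idealized, evenly spread line offsets; real TCL offsets are irregular and are characterized by calibration):

```python
def quantize(t_ps, bin_ps, offset_ps):
    """One discrete time coding line (TCL): the integer code an
    interval produces, given this line's propagation offset."""
    return int((t_ps + offset_ps) // bin_ps)

def equivalent_code(t_ps, bin_ps, n_lines):
    """Average the codes of n_lines lines whose offsets are spread
    across one bin, emulating an equivalent coding line (ECL) with
    roughly bin_ps / n_lines resolution."""
    offsets = [i * bin_ps / n_lines for i in range(n_lines)]
    return sum(quantize(t_ps, bin_ps, o) for o in offsets) / n_lines

# A single line with 500 ps bins cannot separate 120 ps from 130 ps:
quantize(120, 500, 0) == quantize(130, 500, 0)   # both give code 0
# Sixteen offset lines can: the averaged codes differ.
equivalent_code(120, 500, 16)   # 0.1875
equivalent_code(130, 500, 16)   # 0.25
```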

  2. Independent assessment of TRAC and RELAP5 codes through separate effects tests

    SciTech Connect

    Saha, P.; Rohatgi, U.S.; Jo, J.H.; Neymotin, L.; Slovik, G.; Yuelys-Miksis, C.; Pu, J.

    1983-01-01

    Independent assessment of TRAC-PF1 (Version 7.0), TRAC-BD1 (Version 12.0) and RELAP5/MOD1 (Cycle 14) that was initiated at BNL in FY 1982, has been completed in FY 1983. As in the previous years, emphasis at Brookhaven has been in simulating various separate-effects tests with these advanced codes and identifying the areas where further thermal-hydraulic modeling improvements are needed. The following six categories of tests were simulated with the above codes: (1) critical flow tests (Moby-Dick nitrogen-water, BNL flashing flow, Marviken Test 24); (2) Counter-Current Flow Limiting (CCFL) tests (University of Houston, Dartmouth College single and parallel tube test); (3) level swell tests (G.E. large vessel test); (4) steam generator tests (B and W 19-tube model S.G. tests, FLECHT-SEASET U-tube S.G. tests); (5) natural circulation tests (FRIGG loop tests); and (6) post-CHF tests (Oak Ridge steady-state test).

  3. Independent code assessment at BNL in FY 1982. [TRAC-PF1; RELAP5/MOD1; TRAC-BD1

    SciTech Connect

    Saha, P.; Rohatgi, U.S.; Jo, J.H.; Neymotin, L.; Slovik, G.; Yuelys-Miksis, C.

    1982-01-01

    Independent assessment of the advanced codes such as TRAC and RELAP5 has continued at BNL through the Fiscal Year 1982. The simulation tests can be grouped into the following five categories: critical flow, counter-current flow limiting (CCFL) or flooding, level swell, steam generator thermal performance, and natural circulation. TRAC-PF1 (Version 7.0) and RELAP5/MOD1 (Cycle 14) codes were assessed by simulating all of the above experiments, whereas the TRAC-BD1 (Version 12.0) code was applied only to the CCFL tests. Results and conclusions of the BNL code assessment activity of FY 1982 are summarized below.

  4. Independent coding of target distance and direction in visuo-spatial working memory.

    PubMed

    Chieffi, S; Allport, D A

    1997-01-01

    The organization of manual reaching movements suggests considerable independence in the initial programming with respect to the direction and the distance of the intended movement. It was hypothesized that short-term memory for a visually-presented location within reaching space, in the absence of other allocentric reference points, might also be represented in a motoric code, showing similar independence in the encoding of direction and distance. This hypothesis was tested in two experiments, using adult human subjects who were required to remember the location of a briefly presented luminous spot. Stimuli were presented in the dark, thus providing purely egocentric spatial information. After the specified delay, subjects were instructed to point to the remembered location. In Exp. 1, temporal decay of location memory was studied, over a range of 4-30 s. The results showed that (a) memory for both the direction and the distance of the visual target location declined over time, at about the same rate for both parameters; however, (b) errors of distance were much greater in the left than in the right hemispace, whereas direction errors showed no such effect; (c) the distance and direction errors were essentially uncorrelated, at all delays. These findings suggest independent representation of these two parameters in working memory. In Exp. 2 the subjects were required to remember the locations of two visual stimuli presented sequentially, one after the other. Only after both stimuli had been presented did the subject receive a signal from the experimenter as to which one was to be pointed to. The results showed that the encoding of a second location selectively interfered with memory for the direction but not for the distance of the to-be-remembered target location. As in Exp. 1, direction and distance errors were again uncorrelated. The results of both experiments indicate that memory for egocentrically-specified visual locations can encode the direction and

  5. Board Governance of Independent Schools: A Framework for Investigation

    ERIC Educational Resources Information Center

    McCormick, John; Barnett, Kerry; Alavi, Seyyed Babak; Newcombe, Geoffrey

    2006-01-01

    Purpose: This paper develops a theoretical framework to guide future inquiry into board governance of independent schools. Design/methodology/approach: The authors' approach is to integrate literatures related to corporate and educational boards, motivation, leadership and group processes that are appropriate for conceptualizing independent school…

  6. Norepinephrine Modulates Coding of Complex Vocalizations in the Songbird Auditory Cortex Independent of Local Neuroestrogen Synthesis.

    PubMed

    Ikeda, Maaya Z; Jeon, Sung David; Cowell, Rosemary A; Remage-Healey, Luke

    2015-06-24

    The catecholamine norepinephrine plays a significant role in auditory processing. Most studies to date have examined the effects of norepinephrine on the neuronal response to relatively simple stimuli, such as tones and calls. It is less clear how norepinephrine shapes the detection of complex syntactical sounds, as well as the coding properties of sensory neurons. Songbirds provide an opportunity to understand how auditory neurons encode complex, learned vocalizations, and the potential role of norepinephrine in modulating the neuronal computations for acoustic communication. Here, we infused norepinephrine into the zebra finch auditory cortex and performed extracellular recordings to study the modulation of song representations in single neurons. Consistent with its proposed role in enhancing signal detection, norepinephrine decreased spontaneous activity and firing during stimuli, yet it significantly enhanced the auditory signal-to-noise ratio. These effects were all mimicked by clonidine, an α-2 receptor agonist. Moreover, a pattern classifier analysis indicated that norepinephrine enhanced the ability of single neurons to accurately encode complex auditory stimuli. Because neuroestrogens are also known to enhance auditory processing in the songbird brain, we tested the hypothesis that norepinephrine actions depend on local estrogen synthesis. Neither norepinephrine nor adrenergic receptor antagonist infusion into the auditory cortex had detectable effects on local estradiol levels. Moreover, pretreatment with fadrozole, a specific aromatase inhibitor, did not block norepinephrine's neuromodulatory effects. Together, these findings indicate that norepinephrine enhances signal detection and information encoding for complex auditory stimuli by suppressing spontaneous "noise" activity and that these actions are independent of local neuroestrogen synthesis.

  7. High performance computing aspects of a dimension independent semi-Lagrangian discontinuous Galerkin code

    NASA Astrophysics Data System (ADS)

    Einkemmer, Lukas

    2016-05-01

    The recently developed semi-Lagrangian discontinuous Galerkin approach is used to discretize hyperbolic partial differential equations (usually first order equations). Since these methods are conservative, local in space, and able to limit numerical diffusion, they are considered a promising alternative to more traditional semi-Lagrangian schemes (which are usually based on polynomial or spline interpolation). In this paper, we consider a parallel implementation of a semi-Lagrangian discontinuous Galerkin method for distributed memory systems (so-called clusters). Both strong and weak scaling studies are performed on the Vienna Scientific Cluster 2 (VSC-2). In the case of weak scaling we observe a parallel efficiency above 0.8 for both two and four dimensional problems and up to 8192 cores. Strong scaling results show good scalability to at least 512 cores (we consider problems that can be run on a single processor in reasonable time). In addition, we study the scaling of a two dimensional Vlasov-Poisson solver that is implemented using the framework provided. All of the simulations are conducted in the context of worst case communication overhead; i.e., in a setting where the CFL (Courant-Friedrichs-Lewy) number increases linearly with the problem size. The framework introduced in this paper facilitates a dimension independent implementation of scientific codes (based on C++ templates) using both an MPI and a hybrid approach to parallelization. We describe the essential ingredients of our implementation.
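
The efficiency figures quoted here follow from the standard definitions of strong and weak scaling. This sketch uses hypothetical timings (not measurements from the VSC-2 runs) to show how both quantities are computed:

```python
def strong_scaling_efficiency(t1, tp, p):
    """Strong scaling: fixed total problem size on p cores.
    Ideal speedup is p, so efficiency = t1 / (p * tp)."""
    return t1 / (p * tp)

def weak_scaling_efficiency(t1, tp):
    """Weak scaling: problem size grows with the core count.
    Ideally tp == t1, so efficiency = t1 / tp."""
    return t1 / tp

# Hypothetical timings in seconds; the 0.8 threshold mirrors the weak
# scaling efficiency reported in the abstract for up to 8192 cores.
t_serial = 1024.0
t_strong_512 = 2.5                 # fixed-size run on 512 cores
t_weak_base, t_weak_8192 = 10.0, 11.8

print(strong_scaling_efficiency(t_serial, t_strong_512, 512))  # 0.8
print(weak_scaling_efficiency(t_weak_base, t_weak_8192))       # ~0.85
```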

  8. The role of stress on the language-independence and code-switching phenomena.

    PubMed

    Javier, R A; Marcos, L R

    1989-09-01

    This investigation studied the extent to which stress affects the assumed functional separation of the coordinate bilingual's linguistic organization. Spanish/English bilinguals served as subjects in a GSR linguistic conditioning experiment using two intensities of buzzer sounds (stressful conditions) and two lists of words. One word from each list functioned as the conditioned stimulus. Generalization to semantically related, phonemically related, and unrelated words occurred in both languages and buzzer conditions. We found a differential impact of the buzzer on the functional separation of the languages, although not in the direction predicted. We concluded that stress produced code-switching and, hence, a primitivization of the subjects' cognitive and linguistic functioning. These findings are important in understanding the way stress affects the bilingual's languages at the linguistic and cognitive levels. They are also important in understanding the role of stress in language development and in the transfer of linguistic information.

  9. Characterization and validation of a Monte Carlo code for independent dose calculation in proton therapy treatments with pencil beam scanning

    NASA Astrophysics Data System (ADS)

    Fracchiolla, F.; Lorentini, S.; Widesott, L.; Schwarz, M.

    2015-11-01

    We propose a method of creating and validating a Monte Carlo (MC) model of a proton Pencil Beam Scanning (PBS) machine using only commissioning measurements, avoiding modeling of the nozzle. Measurements with a scintillating screen coupled with a CCD camera, an ionization chamber, and a Faraday cup were used to model the beam in TOPAS without using any machine parameter information except the virtual source distance from the isocenter. The model was then validated on simple Spread Out Bragg Peaks (SOBP) delivered in a water phantom and with six realistic clinical plans (many involving 3 or more fields) on an anthropomorphic phantom. In particular, the behavior of the moveable Range Shifter (RS) was investigated and a model of it is proposed. Gamma analysis (3%, 3 mm) was used to compare MC, TPS (XiO-ELEKTA), and measured 2D dose distributions (using radiochromic film). The MC model proposed here shows good results in the validation phase, both for simple irradiation geometries (SOBP in water) and for modulated treatment fields (on anthropomorphic phantoms). In particular, head lesions were investigated and both MC and TPS data were compared with measurements. For treatment plans with no RS, measurements always agreed very well with both MC and TPS (γ-passing rate (PR) > 95%). Treatment plans in which the RS was needed were also tested and validated. For these plans, MC results showed better agreement with measurements (γ-PR > 93%) than the TPS results (γ-PR < 88%). This work shows how to simplify the MC modeling of a PBS machine for proton therapy treatments without modeling any hardware components, and proposes a more reliable RS model than the one implemented in our TPS. The validation process has shown that this code is a valid candidate for a completely independent treatment plan dose calculation algorithm, making it an important future tool for the patient-specific QA verification process.
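
The (3%, 3 mm) comparison named in this abstract is the standard gamma-index test. A minimal 1D sketch of the criterion, not the clinical implementation used in the study, looks like this:

```python
import math

def gamma_index_1d(ref_pos, ref_dose, eval_pos, eval_dose,
                   dta_mm=3.0, dose_tol=0.03):
    """Global 1D gamma index with a (3%, 3 mm) criterion; positions in
    mm, dose tolerance relative to the reference maximum. A didactic
    sketch, not a clinical QA implementation."""
    d_max = max(ref_dose)
    gammas = []
    for xr, dr in zip(ref_pos, ref_dose):
        # Minimise the combined distance/dose metric over all eval points.
        g = min(math.sqrt(((xe - xr) / dta_mm) ** 2 +
                          ((de - dr) / (dose_tol * d_max)) ** 2)
                for xe, de in zip(eval_pos, eval_dose))
        gammas.append(g)
    return gammas

def passing_rate(gammas):
    """Percentage of points with gamma <= 1 (the gamma-PR quoted above)."""
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)
```

A point passes if some evaluated point lies within the combined distance-to-agreement/dose-difference ellipsoid; the passing rate is the figure reported as γ-PR in the abstract.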

  10. Characterization and validation of a Monte Carlo code for independent dose calculation in proton therapy treatments with pencil beam scanning.

    PubMed

    Fracchiolla, F; Lorentini, S; Widesott, L; Schwarz, M

    2015-11-07

    We propose a method of creating and validating a Monte Carlo (MC) model of a proton Pencil Beam Scanning (PBS) machine using only commissioning measurements, avoiding modeling of the nozzle. Measurements with a scintillating screen coupled with a CCD camera, an ionization chamber, and a Faraday cup were used to model the beam in TOPAS without using any machine parameter information except the virtual source distance from the isocenter. The model was then validated on simple Spread Out Bragg Peaks (SOBP) delivered in a water phantom and with six realistic clinical plans (many involving 3 or more fields) on an anthropomorphic phantom. In particular, the behavior of the moveable Range Shifter (RS) was investigated and a model of it is proposed. Gamma analysis (3%, 3 mm) was used to compare MC, TPS (XiO-ELEKTA), and measured 2D dose distributions (using radiochromic film). The MC model proposed here shows good results in the validation phase, both for simple irradiation geometries (SOBP in water) and for modulated treatment fields (on anthropomorphic phantoms). In particular, head lesions were investigated and both MC and TPS data were compared with measurements. For treatment plans with no RS, measurements always agreed very well with both MC and TPS (γ-passing rate (PR) > 95%). Treatment plans in which the RS was needed were also tested and validated. For these plans, MC results showed better agreement with measurements (γ-PR > 93%) than the TPS results (γ-PR < 88%). This work shows how to simplify the MC modeling of a PBS machine for proton therapy treatments without modeling any hardware components, and proposes a more reliable RS model than the one implemented in our TPS. The validation process has shown that this code is a valid candidate for a completely independent treatment plan dose calculation algorithm, making it an important future tool for the patient-specific QA verification process.

  11. Norepinephrine Modulates Coding of Complex Vocalizations in the Songbird Auditory Cortex Independent of Local Neuroestrogen Synthesis

    PubMed Central

    Ikeda, Maaya Z.; Jeon, Sung David; Cowell, Rosemary A.

    2015-01-01

    The catecholamine norepinephrine plays a significant role in auditory processing. Most studies to date have examined the effects of norepinephrine on the neuronal response to relatively simple stimuli, such as tones and calls. It is less clear how norepinephrine shapes the detection of complex syntactical sounds, as well as the coding properties of sensory neurons. Songbirds provide an opportunity to understand how auditory neurons encode complex, learned vocalizations, and the potential role of norepinephrine in modulating the neuronal computations for acoustic communication. Here, we infused norepinephrine into the zebra finch auditory cortex and performed extracellular recordings to study the modulation of song representations in single neurons. Consistent with its proposed role in enhancing signal detection, norepinephrine decreased spontaneous activity and firing during stimuli, yet it significantly enhanced the auditory signal-to-noise ratio. These effects were all mimicked by clonidine, an α-2 receptor agonist. Moreover, a pattern classifier analysis indicated that norepinephrine enhanced the ability of single neurons to accurately encode complex auditory stimuli. Because neuroestrogens are also known to enhance auditory processing in the songbird brain, we tested the hypothesis that norepinephrine actions depend on local estrogen synthesis. Neither norepinephrine nor adrenergic receptor antagonist infusion into the auditory cortex had detectable effects on local estradiol levels. Moreover, pretreatment with fadrozole, a specific aromatase inhibitor, did not block norepinephrine's neuromodulatory effects. Together, these findings indicate that norepinephrine enhances signal detection and information encoding for complex auditory stimuli by suppressing spontaneous “noise” activity and that these actions are independent of local neuroestrogen synthesis. PMID:26109659

  12. Investigation of Navier-Stokes Code Verification and Design Optimization

    NASA Technical Reports Server (NTRS)

    Vaidyanathan, Rajkumar

    2004-01-01

    With rapid progress in employing computational techniques for complex Navier-Stokes fluid flow problems, design optimization problems traditionally based on empirical formulations and experiments are now being addressed with the aid of computational fluid dynamics (CFD). To carry out an effective CFD-based optimization study, it is essential that the uncertainty and appropriate confidence limits of the CFD solutions be quantified over the chosen design space. The present dissertation investigates issues related to code verification, surrogate model-based optimization, and sensitivity evaluation. For Navier-Stokes (NS) CFD code verification, a least square extrapolation (LSE) method is assessed. This method projects numerically computed NS solutions from multiple, coarser base grids onto a finer grid and improves solution accuracy by minimizing the residual of the discretized NS equations over the projected grid. The dissertation focuses on the finite volume (FV) formulation. The interplay between these concepts and the outcome of LSE, and the effects of solution gradients and singularities, nonlinear physics, and coupling of flow variables on the effectiveness of LSE, are investigated. A CFD-based design optimization of a single element liquid rocket injector is conducted with surrogate models developed using response surface methodology (RSM) based on CFD solutions. The computational model consists of the NS equations, finite rate chemistry, and the k-ε turbulence closure. With the aid of these surrogate models, sensitivity and trade-off analyses are carried out for the injector design, whose geometry (hydrogen flow angle, hydrogen and oxygen flow areas, and oxygen post tip thickness) is optimized to attain desirable goals in performance (combustion length) and life/survivability (the maximum temperatures on the oxidizer post tip and injector face and a combustion chamber wall temperature). A preliminary multi-objective optimization is also carried out.
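
LSE extrapolates from multiple coarse-grid solutions; a simpler, standard companion check in CFD code verification is the observed order of accuracy computed from solutions on three systematically refined grids. The values below are hypothetical, not results from the dissertation:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy from a functional (e.g. drag) computed
    on three grids with constant refinement ratio r. This is the
    standard Richardson-type verification check, a simpler relative of
    the LSE approach described in the abstract."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

# Hypothetical values from a nominally second-order scheme, halving the
# mesh spacing at each level (coarse, medium, fine).
f1, f2, f3 = 1.16, 1.04, 1.01
print(observed_order(f1, f2, f3, 2.0))  # close to 2.0 for these values
```

If the observed order matches the formal order of the discretization, the code is solving the equations it claims to solve on these grids.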

  13. Coding tools investigation for next generation video coding based on HEVC

    NASA Astrophysics Data System (ADS)

    Chen, Jianle; Chen, Ying; Karczewicz, Marta; Li, Xiang; Liu, Hongbin; Zhang, Li; Zhao, Xin

    2015-09-01

    The state-of-the-art video coding standard, H.265/HEVC, was finalized in 2013 and achieves roughly 50% bit-rate saving compared to its predecessor, H.264/MPEG-4 AVC. This paper provides evidence that there is still potential for further coding efficiency improvements. A brief overview of HEVC is first given in the paper. Then, our improvements to each main module of HEVC are presented. For instance, the recursive quadtree block structure is extended to support larger coding units and transform units. The motion information prediction scheme is improved by advanced temporal motion vector prediction, which inherits the motion information of each small block within a large block from a temporal reference picture. Cross-component prediction with a linear prediction model improves intra prediction, and overlapped block motion compensation improves the efficiency of inter prediction. Furthermore, coding of both intra and inter prediction residuals is improved by an adaptive multiple transform technique. Finally, in addition to the deblocking filter and SAO, an adaptive loop filter is applied to further enhance reconstructed picture quality. This paper describes the above-mentioned techniques in detail and evaluates their coding performance based on the common test conditions used during HEVC development. The simulation results show that significant performance improvement over the HEVC standard can be achieved, especially for high resolution video material.
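
The recursive quadtree block structure mentioned above can be sketched as a recursion over square blocks. Here `should_split` stands in for the encoder's rate-distortion decision; this is an illustrative sketch, not HEVC-conformant logic:

```python
def quadtree_partition(x, y, size, min_size, should_split):
    """Recursively split a square coding block into four quadrants,
    mimicking the quadtree coding-unit structure described in the
    abstract. Returns the leaf blocks as (x, y, size) tuples."""
    if size <= min_size or not should_split(x, y, size):
        return [(x, y, size)]          # leaf coding unit
    half = size // 2
    leaves = []
    for dx in (0, half):
        for dy in (0, half):
            leaves += quadtree_partition(x + dx, y + dy, half,
                                         min_size, should_split)
    return leaves

# Example: split a 64x64 block down to 32x32 leaves.
cus = quadtree_partition(0, 0, 64, 8, lambda x, y, s: s > 32)
print(len(cus))  # 4
```

Extending the structure to larger coding and transform units, as the paper does, amounts to starting this recursion from a bigger root block.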

  14. Field Dependence/Independence Cognitive Style and Problem Posing: An Investigation with Sixth Grade Students

    ERIC Educational Resources Information Center

    Nicolaou, Aristoklis Andreas; Xistouri, Xenia

    2011-01-01

    Field dependence/independence cognitive style was found to relate to general academic achievement and specific areas of mathematics; in the majority of studies, field-independent students were found to be superior to field-dependent students. The present study investigated the relationship between field dependence/independence cognitive style and…

  15. A Social Cognitive Investigation of Australian Independent School Boards as Teams

    ERIC Educational Resources Information Center

    Krishnan, Aparna; Barnett, Kerry; McCormick, John; Newcombe, Geoffrey

    2016-01-01

    Purpose: The purpose of this paper is to investigate independent school Boards as teams using a social cognitive perspective. Specifically, the study investigated Board processes and the nature of relationships between Board member self-efficacy, Board collective efficacy and performance of independent school Boards in New South Wales, Australia.…

  16. Investigation of Bandwidth-Efficient Coding and Modulation Techniques

    NASA Technical Reports Server (NTRS)

    Osborne, William P.

    1992-01-01

    The technology necessary to improve the bandwidth efficiency of the space-to-ground communications network was studied, using the current capabilities of that network as a baseline. The study was aimed at making space payloads, for example the Hubble Space Telescope, more capable without the need to completely redesign the link. Particular emphasis was placed on the following questions: (1) what is required to convert an existing standard 4-ary phase shift keying communications link to one that can support, as a minimum, 8-ary phase shift keying with error correction applied; and (2) whether the existing equipment configurations, with additional signal processing equipment, can realize the higher order modulation and coding schemes.
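
The bandwidth arithmetic behind point (1) can be sketched directly. The rate-2/3 figure below is an illustrative assumption (the abstract does not name a specific code), chosen because it keeps the information rate of the original 4-ary PSK link:

```python
import math

def bits_per_symbol(m):
    """An M-ary PSK symbol carries log2(M) bits."""
    return math.log2(m)

def info_rate(m, code_rate):
    """Information bits per channel symbol after coding overhead."""
    return bits_per_symbol(m) * code_rate

# Uncoded 4-ary PSK versus 8-ary PSK with a hypothetical rate-2/3 code:
# the same 2 information bits per symbol fit in the same bandwidth,
# with the extra capacity of 8-PSK spent on error correction.
qpsk = info_rate(4, 1.0)
psk8 = info_rate(8, 2 / 3)
print(qpsk, psk8)  # both approximately 2 information bits per symbol
```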

  17. Investigations with methanobacteria and with evolution of the genetic code

    NASA Technical Reports Server (NTRS)

    Jukes, T. H.

    1986-01-01

    Mycoplasma capricolum was found by Osawa et al. to use UGA as the codon for tryptophan and to contain 75% A + T in its DNA. This change could have resulted from evolutionary pressure to replace C + G by A + T. Numerous studies have been reported of the evolution of proteins as measured by the amino acid replacements observed when homologous proteins, such as hemoglobins from various vertebrates, are compared. These replacements result from nucleotide substitutions in amino acid codons in the corresponding genes. Simultaneously, silent nucleotide substitutions take place that can be studied when sequences of the genes are compared. These silent evolutionary changes take place mostly in third positions of codons. Two types of nucleotide substitutions are recognized: pyrimidine-pyrimidine and purine-purine interchanges (transitions) and pyrimidine-purine interchanges (transversions). Silent transitions are favored when a corresponding transversion would produce an amino acid replacement. Conversely, silent transversions are favored by probability when transitions and transversions would both be silent. Extensive examples of these situations have been found in protein genes, and it is evident that transversions in silent positions predominate in family boxes in most of the examples studied. In associated research, a streptomycete from cow manure was found to produce an extracellular enzyme capable of lysing the pseudomurein-containing methanogen Methanobacterium formicicum.
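
The transition/transversion distinction used in this analysis is mechanical and easy to encode; a minimal sketch:

```python
PURINES = {"A", "G"}
PYRIMIDINES = {"C", "T"}

def substitution_type(base1, base2):
    """Classify a nucleotide substitution as in the abstract:
    purine<->purine or pyrimidine<->pyrimidine is a transition,
    purine<->pyrimidine is a transversion."""
    if base1 == base2:
        return "identical"
    same_class = ({base1, base2} <= PURINES or
                  {base1, base2} <= PYRIMIDINES)
    return "transition" if same_class else "transversion"

print(substitution_type("A", "G"))  # transition
print(substitution_type("C", "G"))  # transversion
```

Of the 12 possible substitutions, 4 are transitions and 8 are transversions, which is why an excess of silent transversions in family boxes is the probabilistic expectation the abstract describes.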

  18. Long non-coding RNA LOC283070 mediates the transition of LNCaP cells into androgen-independent cells possibly via CAMK1D

    PubMed Central

    Wang, Lina; Lin, Yani; Meng, Hui; Liu, Chunyan; Xue, Jing; Zhang, Qi; Li, Chaoyang; Zhang, Pengju; Cui, Fuai; Chen, Weiwen; Jiang, Anli

    2016-01-01

    Aims: The present study investigates the role of long non-coding RNAs (lncRNAs) in the development of androgen independence in prostate cancer and its underlying mechanism. Methods: We established an androgen-independent prostate carcinoma (AIPC) cell line, LNCaP-AI, from the androgen-dependent prostate carcinoma (ADPC) cell line LNCaP. The different expression profiles of lncRNAs and mRNAs between LNCaP and LNCaP-AI cells were investigated using microarray analysis. The expression of RNAs was determined using quantitative real-time polymerase chain reaction. Protein levels were measured using Western blotting. MTT assay was used to test cell viability. Tumor formation assay was performed in nude mice to detect tumor growth in vivo. Flow cytometry was performed to detect cell cycles. Transwell assay was employed to test cell migration and invasion. Results: According to bioinformatics prediction, lncRNA LOC283070 could play an important role in the transition of LNCaP cells into LNCaP-AI cells. LOC283070 was up-regulated in LNCaP-AI cells and frequently up-regulated in AIPC cell lines. Overexpression of LOC283070 in LNCaP cells accelerated cell proliferation and migration, even under androgen-independent conditions. Knockdown of LOC283070 inhibited LNCaP-AI cell proliferation and migration. Moreover, overexpression of LOC283070 promoted tumor growth in vivo in both normal and castrated mice. CAMK1D overexpression had an effect similar to that of LOC283070, and CAMK1D knockdown fully abrogated the effect of LOC283070 overexpression on the transition of LNCaP cells into androgen-independent cells. Conclusions: The present study shows that overexpression of LOC283070 mediates the transition of LNCaP cells into androgen-independent LNCaP-AI cells, possibly via CAMK1D. PMID:28077997

  19. Semi-device-independent randomness expansion with partially free random sources using 3→1 quantum random access code

    NASA Astrophysics Data System (ADS)

    Zhou, Yu-Qian; Gao, Fei; Li, Dan-Dan; Li, Xin-Hui; Wen, Qiao-Yan

    2016-09-01

    We have proved that new randomness can be certified with partially free sources using a 2→1 quantum random access code (QRAC) in the framework of semi-device-independent (SDI) protocols [Y.-Q. Zhou, H.-W. Li, Y.-K. Wang, D.-D. Li, F. Gao, and Q.-Y. Wen, Phys. Rev. A 92, 022331 (2015), 10.1103/PhysRevA.92.022331]. To improve the effectiveness of the randomness generation, here we propose SDI randomness expansion using the 3→1 QRAC and obtain the corresponding classical and quantum bounds of the two-dimensional quantum witness. Moreover, we derive the condition that must be satisfied by the partially free sources to successfully certify new randomness, and the analytic relationship between the certified randomness and the two-dimensional quantum witness violation.
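
For context, the standard success-probability bounds for 2→1 and 3→1 random access codes (known values from the QRAC literature; the paper's two-dimensional witness bounds are related but not reproduced here) can be computed directly:

```python
import math

def classical_bound(n):
    """Optimal classical n->1 RAC success probability for n = 2, 3,
    achieved by encoding the majority of the input bits."""
    assert n in (2, 3)
    return 3 / 4

def quantum_bound(n):
    """Optimal quantum n->1 QRAC success probability,
    1/2 + 1/(2*sqrt(n))."""
    return 0.5 + 0.5 / math.sqrt(n)

# The quantum-over-classical gap shrinks from 2->1 to 3->1, which is
# part of the trade-off in moving to the 3->1 protocol.
print(round(quantum_bound(2), 4))  # 0.8536
print(round(quantum_bound(3), 4))  # 0.7887
```

Any observed success probability above the classical bound certifies that the devices cannot be simulated classically, which is what allows randomness to be certified in the SDI setting.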

  20. Modality independence of order coding in working memory: Evidence from cross-modal order interference at recall.

    PubMed

    Vandierendonck, André

    2016-01-01

    Working memory researchers do not agree on whether order in serial recall is encoded by dedicated modality-specific systems or by a more general modality-independent system. Although previous research supports the existence of autonomous modality-specific systems, it has been shown that serial recognition memory is prone to cross-modal order interference by concurrent tasks. The present study used a serial recall task, which was performed in a single-task condition and in a dual-task condition with an embedded memory task in the retention interval. The modality of the serial task was either verbal or visuospatial, and the embedded tasks were in the other modality and required either serial or item recall. Care was taken to avoid modality overlaps during presentation and recall. In Experiment 1, visuospatial but not verbal serial recall was more impaired when the embedded task was an order than when it was an item task. Using a more difficult verbal serial recall task, verbal serial recall was also more impaired by another order recall task in Experiment 2. These findings are consistent with the hypothesis of modality-independent order coding. The implications for views on short-term recall and the multicomponent view of working memory are discussed.

  1. Investigating the Language and Literacy Skills Required for Independent Online Learning

    ERIC Educational Resources Information Center

    Silver-Pacuilla, Heidi

    2008-01-01

    This study was undertaken to investigate the threshold levels of literacy and language proficiency necessary for adult learners to use the Internet for independent learning. The report is triangulated around learning from large-scale surveys, learning from the literature, and learning from the field. Reported findings include: (1)…

  2. Tips for Young Scientists on the Junior Faculty/Independent Investigator Job Search.

    PubMed

    Martin, Kelsey C

    2017-02-22

    Landing a job as an assistant professor or independent investigator in neuroscience in an academic institution or research institute depends on the accomplishments young scientists make during years of training as graduate students and postdoctoral fellows. This training prepares scientists for an array of exciting job opportunities, one of which is as a faculty member or independent investigator. My goal in this NeuroView is to provide tips about the job search for young scientists who have decided to enter the junior faculty/independent investigator job market. Rather than serving as a "protocol for getting a job," these tips are aimed at providing information that will make each candidate better prepared for her or his job search.

  3. Investigation of the Use of Erasures in a Concatenated Coding Scheme

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.; Marriott, Philip J.

    1997-01-01

    A new method for declaring erasures in a concatenated coding scheme is investigated. This method is used with the rate 1/2 K = 7 convolutional code and the (255, 223) Reed Solomon code. Errors and erasures Reed Solomon decoding is used. The erasure method proposed uses a soft output Viterbi algorithm and information provided by decoded Reed Solomon codewords in a deinterleaving frame. The results show that a gain of 0.3 dB is possible using a minimum amount of decoding trials.
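
The benefit of declaring erasures follows from the Reed-Solomon decoding condition: an (n, k) code corrects any pattern satisfying 2·errors + erasures ≤ n − k. A sketch for the (255, 223) code in the abstract:

```python
def rs_decodable(n, k, errors, erasures):
    """Errors-and-erasures decoding of an (n, k) Reed-Solomon code
    succeeds when 2*errors + erasures <= n - k. For the (255, 223)
    code there are n - k = 32 redundant symbols."""
    return 2 * errors + erasures <= n - k

# Without erasure information at most 16 symbol errors per codeword are
# correctable; each correctly declared erasure (e.g. flagged from soft
# Viterbi output, as studied above) costs one redundant symbol instead
# of two.
print(rs_decodable(255, 223, 16, 0))   # True
print(rs_decodable(255, 223, 17, 0))   # False
print(rs_decodable(255, 223, 10, 12))  # True: 2*10 + 12 = 32
```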

  4. User's manual for three dimensional FDTD version A code for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Finite Difference Time Domain Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). This manual provides a description of the code and corresponding results for the default scattering problem. In addition to that description, the manual covers the operation, resource requirements, and Version A code capabilities, and includes a description of each subroutine, a brief discussion of the radar cross section computations, and a discussion of the scattering results.
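
For orientation, the core field-update loop that FDTD codes like this one build on can be written in a few lines. This is a generic 1D sketch in normalised units, not the distributed Version A implementation:

```python
import math

def fdtd_1d(n_cells=200, n_steps=100, src_cell=100):
    """Minimal 1D FDTD (Yee) update in free space. E and H are
    staggered in space and time; the Courant number c*dt/dx is 1
    (the exact 1D "magic step"), so fields are normalised."""
    ez = [0.0] * n_cells   # electric field
    hy = [0.0] * n_cells   # magnetic field
    for step in range(n_steps):
        for i in range(n_cells - 1):       # H update from the curl of E
            hy[i] += ez[i + 1] - ez[i]
        for i in range(1, n_cells):        # E update from the curl of H
            ez[i] += hy[i] - hy[i - 1]
        # Hard Gaussian source at the centre of the grid.
        ez[src_cell] = math.exp(-((step - 30.0) / 10.0) ** 2)
    return ez
```

The launched Gaussian pulse splits into two pulses travelling one cell per step; a full scattering code such as Version A adds material updates, absorbing boundaries, and near-to-far-field transforms on top of this kernel.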

  5. Approaches to Learning at Work: Investigating Work Motivation, Perceived Workload, and Choice Independence

    ERIC Educational Resources Information Center

    Kyndt, Eva; Raes, Elisabeth; Dochy, Filip; Janssens, Els

    2013-01-01

    Learning and development are taking up a central role in the human resource policies of organizations because of their crucial contribution to the competitiveness of those organizations. The present study investigates the relationship of work motivation, perceived workload, and choice independence with employees' approaches to learning at work.…

  6. Investigation of the Akt/Pkb Kinase in the Development of Hormone-Independent Prostate Cancer

    DTIC Science & Technology

    2007-02-01

    Only fragments of this record's abstract are available: a figure legend describing treatment with the synthetic androgen R1881 or 10 µM of the Calbiochem Akt inhibitor (Akt I), with results combined from 3 independent experiments; performing facility: UT Health Science Center at San Antonio, 7703 Floyd Curl Drive, San Antonio.

  7. Registered report: A coding-independent function of gene and pseudogene mRNAs regulates tumour biology.

    PubMed

    Cowley, Dale; Pandya, Kumar; Khan, Israr; Kerwin, John; Owen, Kate; Griner, Erin

    2015-09-03

    The Reproducibility Project: Cancer Biology seeks to address growing concerns about reproducibility in scientific research by conducting replications of selected experiments from a number of high-profile papers in the field of cancer biology. The papers, which were published between 2010 and 2012, were selected on the basis of citations and Altmetric scores (Errington et al., 2014). This Registered report describes the proposed replication plan of key experiments from 'A coding-independent function of gene and pseudogene mRNAs regulates tumour biology' by Poliseno et al. (2010), published in Nature in 2010. The key experiments to be replicated are reported in Figures 1D, 2F-H, and 4A. In these experiments, Poliseno and colleagues report that microRNAs miR-19b and miR-20a transcriptionally suppress both PTEN and PTENP1 in prostate cancer cells (Figure 1D; Poliseno et al., 2010). Decreased expression of PTEN and/or PTENP1 resulted in downregulated PTEN protein levels (Figure 2H), downregulation of both mRNAs (Figure 2G), and increased tumor cell proliferation (Figure 2F; Poliseno et al., 2010). Furthermore, overexpression of the PTEN 3' UTR enhanced PTENP1 mRNA abundance and limited tumor cell proliferation, providing additional evidence for the co-regulation of PTEN and PTENP1 (Figure 4A; Poliseno et al., 2010). The Reproducibility Project: Cancer Biology is a collaboration between the Center for Open Science and Science Exchange, and the results of the replications will be published in eLife.

  8. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional numerical electromagnetic scattering codes based upon the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied codes are two versions of our current two dimensional FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR, TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  9. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional electromagnetic scattering codes based on the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied codes are two versions of our current FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR, TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  10. User's manual for three dimensional FDTD version C code for scattering from frequency-independent dielectric and magnetic materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.
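The RCS computations such a manual describes rest on the standard far-field definition, sigma = 4*pi*r^2*|Es|^2/|Ei|^2, usually reported in dBsm. A minimal helper for that conversion might look like the following (a generic sketch of the textbook formula, not code from the Version C distribution):

```python
# Radar cross section (RCS) from far-zone field magnitudes, in dBsm.
# Generic illustration of the standard definition sigma = 4*pi*r^2*|Es|^2/|Ei|^2.
import math

def rcs_dbsm(e_scattered, e_incident, r):
    """RCS in dB relative to one square metre, given field magnitudes at range r."""
    sigma = 4 * math.pi * r**2 * (abs(e_scattered) ** 2 / abs(e_incident) ** 2)
    return 10 * math.log10(sigma)

# A range chosen so that equal field magnitudes give sigma = 1 m^2, i.e. 0 dBsm
zero_dbsm = rcs_dbsm(1.0, 1.0, 1 / math.sqrt(4 * math.pi))
```

In an FDTD code the far-zone fields themselves come from a near-to-far-field transformation; the helper above only handles the final normalization step.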

  11. User's manual for three dimensional FDTD version A code for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain (FDTD) Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain technique. The supplied code is one version of our current three dimensional FDTD code set. The manual provides a description of the code and the corresponding results for the default scattering problem. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version A code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONA.FOR), a section briefly discussing radar cross section (RCS) computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  12. User's manual for three dimensional FDTD version C code for scattering from frequency-independent dielectric and magnetic materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three-dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain (FDTD) technique. The supplied code is one version of our current three-dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing radar cross section computations, a section discussing some scattering results, a new problem checklist, references, and figure titles.

  13. Code-Switching in Iranian Elementary EFL Classrooms: An Exploratory Investigation

    ERIC Educational Resources Information Center

    Rezvani, Ehsan; Street, Hezar Jerib; Rasekh, Abbass Eslami

    2011-01-01

    This paper presents the results of a small-scale exploratory investigation of code-switching (CS) between English and Farsi by 4 Iranian English foreign language (EFL) teachers in elementary level EFL classrooms in a language school in Isfahan, Iran. Specifically, the present study aimed at exploring the syntactical identification of switches and…

  14. 78 FR 37571 - Certain Opaque Polymers; Institution of Investigation Pursuant to United States Code

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-21

    ... COMMISSION Certain Opaque Polymers; Institution of Investigation Pursuant to United States Code AGENCY: U.S... importation, and the sale within the United States after importation of certain opaque polymers by reason of... importation, or the sale within the United States after importation of certain opaque polymers that...

  15. A Coding System with Independent Annotations of Gesture Forms and Functions during Verbal Communication: Development of a Database of Speech and GEsture (DoSaGE).

    PubMed

    Kong, Anthony Pak-Hin; Law, Sam-Po; Kwan, Connie Ching-Yin; Lai, Christy; Lam, Vivian

    2015-03-01

    Gestures are commonly used together with spoken language in human communication. One major limitation of gesture investigations in the existing literature lies in the fact that the coding of forms and functions of gestures has not been clearly differentiated. This paper first described a recently developed Database of Speech and GEsture (DoSaGE) based on independent annotation of gesture forms and functions among 119 neurologically unimpaired right-handed native speakers of Cantonese (divided into three age and two education levels), and presented findings of an investigation examining how gesture use was related to age and linguistic performance. Consideration of these two factors, for which normative data are currently very limited or lacking in the literature, is relevant and necessary when one evaluates gesture employment among individuals with and without language impairment. Three speech tasks, including monologue of a personally important event, sequential description, and story-telling, were used for elicitation. The EUDICO Linguistic ANnotator (ELAN) software was used to independently annotate each participant's linguistic information of the transcript, forms of gestures used, and the function for each gesture. About one-third of the subjects did not use any co-verbal gestures. While the majority of gestures were non-content-carrying, which functioned mainly for reinforcing speech intonation or controlling speech flow, the content-carrying ones were used to enhance speech content. Furthermore, individuals who are younger or linguistically more proficient tended to use fewer gestures, suggesting that normal speakers gesture differently as a function of age and linguistic performance.

  16. A Coding System with Independent Annotations of Gesture Forms and Functions during Verbal Communication: Development of a Database of Speech and GEsture (DoSaGE)

    PubMed Central

    Kong, Anthony Pak-Hin; Law, Sam-Po; Kwan, Connie Ching-Yin; Lai, Christy; Lam, Vivian

    2014-01-01

    Gestures are commonly used together with spoken language in human communication. One major limitation of gesture investigations in the existing literature lies in the fact that the coding of forms and functions of gestures has not been clearly differentiated. This paper first described a recently developed Database of Speech and GEsture (DoSaGE) based on independent annotation of gesture forms and functions among 119 neurologically unimpaired right-handed native speakers of Cantonese (divided into three age and two education levels), and presented findings of an investigation examining how gesture use was related to age and linguistic performance. Consideration of these two factors, for which normative data are currently very limited or lacking in the literature, is relevant and necessary when one evaluates gesture employment among individuals with and without language impairment. Three speech tasks, including monologue of a personally important event, sequential description, and story-telling, were used for elicitation. The EUDICO Linguistic ANnotator (ELAN) software was used to independently annotate each participant’s linguistic information of the transcript, forms of gestures used, and the function for each gesture. About one-third of the subjects did not use any co-verbal gestures. While the majority of gestures were non-content-carrying, which functioned mainly for reinforcing speech intonation or controlling speech flow, the content-carrying ones were used to enhance speech content. Furthermore, individuals who are younger or linguistically more proficient tended to use fewer gestures, suggesting that normal speakers gesture differently as a function of age and linguistic performance. PMID:25667563

  17. Investigating the Magnetorotational Instability with Dedalus, an Open-Source Hydrodynamics Code

    SciTech Connect

    Burns, Keaton J. (UC Berkeley; SLAC)

    2012-08-31

    The magnetorotational instability is a fluid instability that causes the onset of turbulence in discs with poloidal magnetic fields. It is believed to be an important mechanism in the physics of accretion discs, namely in its ability to transport angular momentum outward. A similar instability arising in systems with a helical magnetic field may be easier to produce in laboratory experiments using liquid sodium, but the applicability of this phenomenon to astrophysical discs is unclear. To explore and compare the properties of these standard and helical magnetorotational instabilities (MRI and HMRI, respectively), magnetohydrodynamic (MHD) capabilities were added to Dedalus, an open-source hydrodynamics simulator. Dedalus is a Python-based pseudospectral code that uses external libraries and parallelization with the goal of achieving speeds competitive with codes implemented in lower-level languages. This paper will outline the MHD equations as implemented in Dedalus, the steps taken to improve the performance of the code, and the status of MRI investigations using Dedalus.
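The pseudospectral approach that codes like Dedalus use can be illustrated in one line: differentiation becomes multiplication by i*k in Fourier space, giving spectral (near machine-precision) accuracy for smooth periodic fields. This NumPy sketch is illustrative only and does not use the Dedalus API:

```python
# Spectral derivative on a periodic grid: d/dx in Fourier space is
# multiplication by i*k.  Illustrative sketch, not Dedalus code.
import numpy as np

N = 64
x = 2 * np.pi * np.arange(N) / N
u = np.sin(x)

k = np.fft.fftfreq(N, d=1.0 / N)                 # integer wavenumbers for a 2*pi domain
du = np.fft.ifft(1j * k * np.fft.fft(u)).real    # derivative via i*k multiplication

err = np.max(np.abs(du - np.cos(x)))             # error vs the exact derivative cos(x)
```

For sin(x) the spectral derivative matches cos(x) to round-off, which is why pseudospectral codes need far fewer grid points than finite-difference codes for smooth solutions.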

  18. An early underwater artificial vision model in ocean investigations via independent component analysis.

    PubMed

    Nian, Rui; Liu, Fang; He, Bo

    2013-07-16

    Underwater vision is one of the dominant senses and has shown great prospects in ocean investigations. In this paper, a hierarchical Independent Component Analysis (ICA) framework has been established to explore and understand the functional roles of the higher order statistical structures towards the visual stimulus in the underwater artificial vision system. The model is inspired by characteristics such as the modality, the redundancy reduction, the sparseness and the independence in the early human vision system, which seem, respectively, to capture the Gabor-like basis functions, the shape contours or the complicated textures in the multiple layer implementations. The simulation results have shown good performance in terms of the effectiveness and consistency of the proposed approach for the underwater images collected by autonomous underwater vehicles (AUVs).
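A standard preprocessing step in ICA pipelines such as the hierarchical framework described above is whitening, which implements the "redundancy reduction" idea: the data are linearly transformed so that components become uncorrelated with unit variance. A minimal sketch on synthetic data (not the authors' implementation) is:

```python
# Whitening (redundancy reduction) as used before ICA: decorrelate the data
# via an eigendecomposition of its covariance.  Synthetic two-channel example.
import numpy as np

rng = np.random.default_rng(0)
s = rng.standard_normal((2, 1000))              # two latent source signals
x = np.array([[2.0, 1.0], [1.0, 1.0]]) @ s      # linear mixture -> correlated channels

x = x - x.mean(axis=1, keepdims=True)           # centre each channel
vals, vecs = np.linalg.eigh(np.cov(x))          # eigendecompose the covariance
z = (np.diag(vals ** -0.5) @ vecs.T) @ x        # whitened data

cov_z = np.cov(z)                               # approximately the identity matrix
```

After whitening, ICA only has to find a rotation that maximizes statistical independence (e.g. sparseness of the recovered components), which is the step that yields Gabor-like basis functions for natural images.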

  19. An Early Underwater Artificial Vision Model in Ocean Investigations via Independent Component Analysis

    PubMed Central

    Nian, Rui; Liu, Fang; He, Bo

    2013-01-01

    Underwater vision is one of the dominant senses and has shown great prospects in ocean investigations. In this paper, a hierarchical Independent Component Analysis (ICA) framework has been established to explore and understand the functional roles of the higher order statistical structures towards the visual stimulus in the underwater artificial vision system. The model is inspired by characteristics such as the modality, the redundancy reduction, the sparseness and the independence in the early human vision system, which seem, respectively, to capture the Gabor-like basis functions, the shape contours or the complicated textures in the multiple layer implementations. The simulation results have shown good performance in terms of the effectiveness and consistency of the proposed approach for the underwater images collected by autonomous underwater vehicles (AUVs). PMID:23863855

  20. Coding for stable transmission of W-band radio-over-fiber system using direct-beating of two independent lasers.

    PubMed

    Yang, L G; Sung, J Y; Chow, C W; Yeh, C H; Cheng, K T; Shi, J W; Pan, C L

    2014-10-20

    We experimentally demonstrate a Manchester (MC) coding based W-band (75 - 110 GHz) radio-over-fiber (ROF) system that uses spectral shaping to reduce the low-frequency-component (LFC) signal distortion generated by two independent low-cost lasers. Hence, a low-cost, higher performance W-band ROF system is achieved. In this system, direct beating of two independent low-cost CW lasers without a frequency tracking circuit (FTC) is used to generate the millimeter-wave. Approaches such as delayed self-heterodyne interferometry and heterodyne beating are used to characterize the optical-beating-interference sub-terahertz signal (OBIS). Furthermore, W-band ROF systems using MC coding and NRZ-OOK are compared and discussed.

  1. Medical terminology coding systems and medicolegal death investigation data: searching for a standardized method of electronic coding at a statewide medical examiner's office.

    PubMed

    Lathrop, Sarah L; Davis, Wayland L; Nolte, Kurt B

    2009-01-01

    Medical examiner and coroner reports are a rich source of data for epidemiologic research. To maximize the utility of this information, medicolegal death investigation data need to be electronically coded. In order to determine the best option for coding, we evaluated four different options (Current Procedural Terminology [CPT], International Classification of Disease [ICD] coding, Systematized Nomenclature of Medicine Clinical Terms [SNOMED CT], and an in-house system), then conducted internal and external needs assessments to determine which system best met the needs of a centralized, statewide medical examiner's office. Although all four systems offer distinct advantages and disadvantages, SNOMED CT is the most accurate for coding pathologic diagnoses, with ICD-10 the best option for classifying the cause of death. For New Mexico's Office of the Medical Investigator, the most feasible coding option is an upgrade of an in-house coding system, followed by linkage to ICD codes for cause of death from the New Mexico Bureau of Vital Records and Health Statistics, and ideally, SNOMED classification of pathologic diagnoses.

  2. An investigation on the capabilities of the PENELOPE MC code in nanodosimetry

    SciTech Connect

    Bernal, M. A.; Liendo, J. A.

    2009-02-15

    The Monte Carlo (MC) method has been widely implemented in studies of radiation effects on human genetic material. Most of these works have used specific-purpose MC codes to simulate radiation transport in condensed media. PENELOPE is one of the general-purpose MC codes that has been used in many applications related to radiation dosimetry. Based on the fact that PENELOPE can carry out event-by-event coupled electron-photon transport simulations following these particles down to energies of the order of a few tens of eV, we have decided to investigate the capabilities of this code in the field of nanodosimetry. Single and double strand-break probabilities due to the direct impact of gamma rays originating from Co-60 and Cs-137 isotopes and characteristic x-rays from Al and C K-shells have been determined using PENELOPE. Indirect damage has not been accounted for in this study. A human genetic material geometrical model has been developed, taking into account five organizational levels. In an article by Friedland et al. [Radiat. Environ. Biophys. 38, 39-47 (1999)], a specific-purpose MC code and a very sophisticated DNA geometrical model were used. We have chosen that work as a reference to compare our results. Single and double strand-break probabilities obtained here underestimate those reported by Friedland and co-workers by 20%-76% and 50%-60%, respectively. However, we obtain RBE values for Cs-137, Al-K, and C-K radiations in agreement with those reported in previous works [Radiat. Environ. Biophys. 38, 39-47 (1999)] and [Phys. Med. Biol. 53, 233-244 (2008)]. Some enhancements can be incorporated into the PENELOPE code to improve its results in the nanodosimetry field.

  3. Two mitochondrial genomes from the families Bethylidae and Mutillidae: independent rearrangement of protein-coding genes and higher-level phylogeny of the Hymenoptera.

    PubMed

    Wei, Shu-Jun; Li, Qian; van Achterberg, Kees; Chen, Xue-Xin

    2014-08-01

    In animal mitochondrial genomes, gene arrangements are usually conserved across major lineages but might be rearranged within derived groups, and might provide valuable phylogenetic characters. Here, we sequenced the mitochondrial genomes of Cephalonomia gallicola (Chrysidoidea: Bethylidae) and Wallacidia oculata (Vespoidea: Mutillidae). In Cephalonomia at least 11 tRNA and 2 protein-coding genes were rearranged, which is the first report of protein-coding gene rearrangements in the Aculeata. In the Hymenoptera, three types of protein-coding gene rearrangement events occur, i.e. reversal, transposition and reverse transposition. Venturia (Ichneumonidae) had the greatest number of common intervals with the ancestral gene arrangement pattern, whereas Philotrypesis (Agaonidae) had the fewest. The most similar rearrangement patterns are shared between Nasonia (Pteromalidae) and Philotrypesis, whereas the most differentiated rearrangements occur between Cotesia (Braconidae) and Philotrypesis. It is clear that protein-coding gene rearrangements in the Hymenoptera are evolutionarily independent across the major lineages but are conserved within groups such as the Chalcidoidea. Phylogenetic analyses supported the sister-group relationship of Orrussoidea and Apocrita, Ichneumonoidea and Aculeata, Vespidae and Apoidea, and the paraphyly of Vespoidea. The Evaniomorpha and phylogenetic relationships within Aculeata remain controversial, with discrepancy between analyses using protein-coding and RNA genes.
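The three rearrangement event types named in the abstract (reversal, transposition, reverse transposition) can be modeled as operations on a signed gene-order permutation. The sketch below is a generic illustration; the gene names are common mitochondrial genes chosen for the example, not the paper's data:

```python
# Mitochondrial gene rearrangements on a signed gene order: a reversal flips a
# segment and its strand signs; a transposition moves a segment elsewhere.
# (A reverse transposition is a transposition of a reversed segment.)

def reversal(order, i, j):
    """Reverse segment order[i:j] and flip each gene's strand sign."""
    return order[:i] + [(g, -s) for g, s in reversed(order[i:j])] + order[j:]

def transposition(order, i, j, k):
    """Move segment order[i:j] so that it ends up starting at position k (k >= j)."""
    seg = order[i:j]
    rest = order[:i] + order[j:]
    insert_at = k - (j - i)
    return rest[:insert_at] + seg + rest[insert_at:]

# Illustrative gene order (name, strand): +1 = majority strand, -1 = minority
genes = [("cox1", +1), ("cox2", +1), ("atp8", +1), ("atp6", +1), ("cox3", +1)]
rev = reversal(genes, 1, 3)            # inverts the cox2..atp8 segment
trans = transposition(genes, 0, 2, 5)  # moves cox1..cox2 to the end
```

Comparing how many such operations separate two genomes (or counting shared "common intervals", as the abstract does) gives a distance measure usable as a phylogenetic character.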

  4. Investigation of in-band transmission of both spectral amplitude coding/optical code division multiple-access and wavelength division multiplexing signals

    NASA Astrophysics Data System (ADS)

    Ashour, Isaac A. M.; Shaari, Sahbudin; Shalaby, Hossam M. H.; Menon, P. Susthitha

    2011-06-01

    The transmission of both optical code division multiple-access (OCDMA) and wavelength division multiplexing (WDM) users on the same band is investigated. Code pulses of a spectral amplitude coding (SAC) OCDMA system are overlaid onto a multichannel WDM system. Notch filters are used to suppress the WDM interference signals for detection of the optical broadband CDMA signals. Modified quadratic congruence (MQC) codes are used as the signature codes for the SAC/OCDMA system. The proposed system is simulated and its performance is evaluated in terms of both bit-error rate and Q-factor. In addition, an eavesdropper's probability of error-free code detection is evaluated. Our results are compared to traditional nonhybrid systems. It is concluded that the proposed hybrid scheme still achieves acceptable performance and provides enhanced data confidentiality compared to the scheme with SAC/OCDMA only. It is also shown that the performance of the proposed system is limited by interference from the WDM signals. Furthermore, the simulation illustrates the tradeoff between performance and confidentiality for authorized users.
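When both bit-error rate and Q-factor are reported, as in this study, they are conventionally linked (under a Gaussian-noise assumption) by the textbook relation BER = ½ erfc(Q/√2). A minimal conversion helper (a standard formula, not taken from the paper) is:

```python
# Gaussian-noise relation between Q-factor and bit-error rate:
#     BER = 0.5 * erfc(Q / sqrt(2))
# Q ~ 6 corresponds to the commonly quoted BER of ~1e-9.
import math

def q_to_ber(q):
    """Bit-error rate implied by a given Q-factor under Gaussian noise."""
    return 0.5 * math.erfc(q / math.sqrt(2))

ber_q6 = q_to_ber(6.0)   # on the order of 1e-9
```

This is why optical-link papers quote the two metrics interchangeably: each Q value maps to a unique BER under the Gaussian assumption.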

  5. Your ticket to independence: a guide to getting your first principal investigator position.

    PubMed

    Káradóttir, Ragnhildur Thóra; Letzkus, Johannes J; Mameli, Manuel; Ribeiro, Carlos

    2015-10-01

    The transition to scientific independence as a principal investigator (PI) can seem like a daunting and mysterious process to postdocs and students - something that many aspire to while at the same time wondering how to achieve this goal and what being a PI really entails. The FENS Kavli Network of Excellence (FKNE) is a group of young faculty who have recently completed this step in various fields of neuroscience across Europe. In a series of opinion pieces from FKNE scholars, we aim to demystify this process and to offer the next generation of up-and-coming PIs some advice and personal perspectives on the transition to independence, starting here with guidance on how to get hired to your first PI position. Rather than providing an exhaustive overview of all facets of the hiring process, we focus on a few key aspects that we have learned to appreciate in the quest for our own labs: What makes a research programme exciting and successful? How can you identify great places to apply to and make sure your application stands out? What are the key objectives for the job talk and the interview? How do you negotiate your position? And finally, how do you decide on a host institute that lets you develop both scientifically and personally in your new role as head of a lab?

  6. Semantic association investigated with fMRI and independent component analysis

    PubMed Central

    Kim, Kwang Ki; Karunanayaka, Prasanna; Privitera, Michael D.; Holland, Scott K.; Szaflarski, Jerzy P.

    2010-01-01

    Semantic association, an essential element of human language, enables discourse and inference. Neuroimaging studies have revealed localization and lateralization of semantic circuitry making substantial contributions to cognitive neuroscience. However, due to methodological limitations, these investigations have only identified individual functional components rather than capturing the behavior of the entire network. To overcome these limitations, we have implemented group independent component analysis (ICA) to investigate the cognitive modules used by healthy adults performing an fMRI semantic decision task. When compared to the results of a standard GLM analysis, ICA detected several additional brain regions subserving semantic decision. Eight task-related group ICA maps were identified including left inferior frontal gyrus (BA44/45), middle posterior temporal gyrus (BA39/22), angular gyrus/inferior parietal lobule (BA39/40), posterior cingulate (BA30), bilateral lingual gyrus (BA18/23), inferior frontal gyrus (L>R, BA47), hippocampus with parahippocampal gyrus (L>R, BA35/36) and anterior cingulate (BA32/24). While most of the components were represented bilaterally, we found a single, highly left-lateralized component that included the inferior frontal gyrus and the medial and superior temporal gyri, the angular and supramarginal gyri and the inferior parietal cortex. The presence of these spatially independent ICA components implies functional connectivity and can be equated with their modularity. These results are analyzed and presented in the framework of a biologically plausible theoretical model in preparation for similar analyses in patients with right- or left-hemispheric epilepsies. PMID:21296027

  7. Independent evaluation of conflicting microspherule results from different investigations of the Younger Dryas impact hypothesis.

    PubMed

    LeCompte, Malcolm A; Goodyear, Albert C; Demitroff, Mark N; Batchelor, Dale; Vogel, Edward K; Mooney, Charles; Rock, Barrett N; Seidel, Alfred W

    2012-10-30

    Firestone et al. sampled sedimentary sequences at many sites across North America, Europe, and Asia [Firestone RB, et al. (2007) Proc Natl Acad Sci USA 104:16016-16021]. In sediments dated to the Younger Dryas onset or Boundary (YDB) approximately 12,900 calendar years ago, Firestone et al. reported discovery of markers, including nanodiamonds, aciniform soot, high-temperature melt-glass, and magnetic microspherules attributed to cosmic impacts/airbursts. The microspherules were explained as either cosmic material ablation or terrestrial ejecta from a hypothesized North American impact that initiated the abrupt Younger Dryas cooling, contributed to megafaunal extinctions, and triggered human cultural shifts and population declines. A number of independent groups have confirmed the presence of YDB spherules, but two have not. One of them [Surovell TA, et al. (2009) Proc Natl Acad Sci USA 106:18155-18158] collected and analyzed samples from seven YDB sites, purportedly using the same protocol as Firestone et al., but did not find a single spherule in YDB sediments at two previously reported sites. To examine this discrepancy, we conducted an independent blind investigation of two sites common to both studies, and a third site investigated only by Surovell et al. We found abundant YDB microspherules at all three widely separated sites consistent with the results of Firestone et al. and conclude that the analytical protocol employed by Surovell et al. deviated significantly from that of Firestone et al. Morphological and geochemical analyses of YDB spherules suggest they are not cosmic, volcanic, authigenic, or anthropogenic in origin. Instead, they appear to have formed from abrupt melting and quenching of terrestrial materials.

  8. INVESTIGATING DIFFERENCES IN BRAIN FUNCTIONAL NETWORKS USING HIERARCHICAL COVARIATE-ADJUSTED INDEPENDENT COMPONENT ANALYSIS

    PubMed Central

    Shi, Ran

    2016-01-01

    Human brains perform tasks via complex functional networks consisting of separated brain regions. A popular approach to characterize brain functional networks in fMRI studies is independent component analysis (ICA), which is a powerful method to reconstruct latent source signals from their linear mixtures. In many fMRI studies, an important goal is to investigate how brain functional networks change according to specific clinical and demographic variabilities. Existing ICA methods, however, cannot directly incorporate covariate effects in ICA decomposition. Heuristic post-ICA analysis to address this need can be inaccurate and inefficient. In this paper, we propose a hierarchical covariate-adjusted ICA (hc-ICA) model that provides a formal statistical framework for estimating covariate effects and testing differences between brain functional networks. Our method provides a more reliable and powerful statistical tool for evaluating group differences in brain functional networks while appropriately controlling for potential confounding factors. We present an analytically tractable EM algorithm to obtain maximum likelihood estimates of our model. We also develop a subspace-based approximate EM that runs significantly faster while retaining high accuracy. To test the differences in functional networks, we introduce a voxel-wise approximate inference procedure which eliminates the need of computationally expensive covariance matrix estimation and inversion. We demonstrate the advantages of our methods over the existing method via simulation studies. We apply our method to an fMRI study to investigate differences in brain functional networks associated with post-traumatic stress disorder (PTSD).

  9. ALS beamlines for independent investigators: A summary of the capabilities and characteristics of beamlines at the ALS

    SciTech Connect

    Not Available

    1992-08-01

    There are two modes of conducting research at the ALS: to work as a member of a participating research team (PRT), or to work as an independent investigator. PRTs are responsible for building beamlines, end stations, and, in some cases, insertion devices. Thus, PRT members have privileged access to the ALS. Independent investigators will use beamline facilities made available by PRTs. The purpose of this handbook is to describe these facilities.

  10. Flight Investigation of Prescribed Simultaneous Independent Surface Excitations for Real-Time Parameter Identification

    NASA Technical Reports Server (NTRS)

    Moes, Timothy R.; Smith, Mark S.; Morelli, Eugene A.

    2003-01-01

    Near real-time stability and control derivative extraction is required to support flight demonstration of Intelligent Flight Control System (IFCS) concepts being developed by NASA, academia, and industry. Traditionally, flight maneuvers would be designed and flown to obtain stability and control derivative estimates using a postflight analysis technique. The goal of the IFCS concept is to be able to modify the control laws in real time for an aircraft that has been damaged in flight. In some IFCS implementations, real-time parameter identification (PID) of the stability and control derivatives of the damaged aircraft is necessary for successfully reconfiguring the control system. This report investigates the usefulness of Prescribed Simultaneous Independent Surface Excitations (PreSISE) to provide data for rapidly obtaining estimates of the stability and control derivatives. Flight test data were analyzed using both equation-error and output-error PID techniques. The equation-error PID technique is known as Fourier Transform Regression (FTR) and is a frequency-domain real-time implementation. Selected results were compared with a time-domain output-error technique. The real-time equation-error technique combined with the PreSISE maneuvers provided excellent derivative estimation in the longitudinal axis. However, the PreSISE maneuvers as presently defined were not adequate for accurate estimation of the lateral-directional derivatives.
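Equation-error parameter identification reduces to linear regression: the measured aerodynamic moment is modeled as a linear combination of measured states and surface deflections, and the stability and control derivatives are the least-squares coefficients. The report's FTR method does this in the frequency domain; the sketch below is a time-domain toy analogue on synthetic data, with hypothetical "true" derivative values (not NASA's implementation or data):

```python
# Equation-error parameter ID as least squares: regress the modeled
# pitching-moment coefficient onto measured states and control deflections.
# Synthetic data; derivative values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 500
alpha = rng.standard_normal(n)        # angle of attack
q = rng.standard_normal(n)            # pitch rate
delta_e = rng.standard_normal(n)      # elevator deflection (the excitation input)

# Assumed "true" stability and control derivatives (illustrative only)
Cm_alpha, Cm_q, Cm_de = -0.8, -12.0, -1.5
cm = (Cm_alpha * alpha + Cm_q * q + Cm_de * delta_e
      + 0.01 * rng.standard_normal(n))          # measured moment + sensor noise

X = np.column_stack([alpha, q, delta_e])        # regressor matrix
est, *_ = np.linalg.lstsq(X, cm, rcond=None)    # least-squares derivative estimates
```

The role of excitation maneuvers like PreSISE is to make the columns of the regressor matrix well separated (persistently exciting), so the least-squares problem is well conditioned; when the inputs are correlated, individual derivatives cannot be resolved accurately.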

  11. Coding variants at hexa-allelic amino acid 13 of HLA-DRB1 explain independent SNP associations with follicular lymphoma risk.

    PubMed

    Foo, Jia Nee; Smedby, Karin E; Akers, Nicholas K; Berglund, Mattias; Irwan, Ishak D; Jia, Xiaoming; Li, Yi; Conde, Lucia; Darabi, Hatef; Bracci, Paige M; Melbye, Mads; Adami, Hans-Olov; Glimelius, Bengt; Khor, Chiea Chuen; Hjalgrim, Henrik; Padyukov, Leonid; Humphreys, Keith; Enblad, Gunilla; Skibola, Christine F; de Bakker, Paul I W; Liu, Jianjun

    2013-07-11

    Non-Hodgkin lymphoma represents a diverse group of blood malignancies, of which follicular lymphoma (FL) is a common subtype. Previous genome-wide association studies (GWASs) have identified in the human leukocyte antigen (HLA) class II region multiple independent SNPs that are significantly associated with FL risk. To dissect these signals and determine whether coding variants in HLA genes are responsible for the associations, we conducted imputation, HLA typing, and sequencing in three independent populations for a total of 689 cases and 2,446 controls. We identified a hexa-allelic amino acid polymorphism at position 13 of the HLA-DR beta chain that showed the strongest association with FL within the major histocompatibility complex (MHC) region (multiallelic p = 2.3 × 10⁻¹⁵). Out of six possible amino acids that occurred at that position within the population, we classified two as high risk (Tyr and Phe), two as low risk (Ser and Arg), and two as moderate risk (His and Gly). There was a 4.2-fold difference in risk (95% confidence interval = 2.9-6.1) between subjects carrying two alleles encoding high-risk amino acids and those carrying two alleles encoding low-risk amino acids (p = 1.01 × 10⁻¹⁴). This coding variant might explain the complex SNP associations identified by GWASs and suggests a common HLA-DR antigen-driven mechanism for the pathogenesis of FL and rheumatoid arthritis.

  12. Detailed investigation of Long-Period activity at Campi Flegrei by Convolutive Independent Component Analysis

    NASA Astrophysics Data System (ADS)

    Capuano, P.; De Lauro, E.; De Martino, S.; Falanga, M.

    2016-04-01

    This work is devoted to the analysis of seismic signals continuously recorded at Campi Flegrei Caldera (Italy) during the entire year 2006. The radiation pattern associated with the Long-Period energy release is investigated. We adopt an innovative Independent Component Analysis algorithm for convolutive seismic series adapted and improved to give automatic procedures for detecting seismic events often buried in the high-level ambient noise. The extracted waveforms characterized by an improved signal-to-noise ratio allows the recognition of Long-Period precursors, evidencing that the seismic activity accompanying the mini-uplift crisis (in 2006), which climaxed in the three days from 26-28 October, had already started at the beginning of the month of October and lasted until mid of November. Hence, a more complete seismic catalog is then provided which can be used to properly quantify the seismic energy release. To better ground our results, we first check the robustness of the method by comparing it with other blind source separation methods based on higher order statistics; secondly, we reconstruct the radiation patterns of the extracted Long-Period events in order to link the individuated signals directly to the sources. We take advantage from Convolutive Independent Component Analysis that provides basic signals along the three directions of motion so that a direct polarization analysis can be performed with no other filtering procedures. We show that the extracted signals are mainly composed of P waves with radial polarization pointing to the seismic source of the main LP swarm, i.e. a small area in the Solfatara, also in the case of the small-events, that both precede and follow the main activity. From a dynamical point of view, they can be described by two degrees of freedom, indicating a low-level of complexity associated with the vibrations from a superficial hydrothermal system. 
Our results allow us to move towards a full description of the complexity of
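    The separation step at the heart of such an analysis can be illustrated with a minimal instantaneous ICA sketch. This is only a sketch under stated assumptions: the study itself uses a convolutive ICA variant adapted for seismic series, and the synthetic signals, mixing coefficients, and kurtosis contrast below are illustrative choices, not the authors' data or algorithm. After whitening, unmixing reduces to finding the rotation of the whitened pair whose output is maximally non-Gaussian.

```python
import math
import random

random.seed(7)

# Two synthetic sources, loosely mimicking a periodic LP-like waveform buried
# in impulsive ambient noise (illustrative; not the Campi Flegrei data).
n = 2000
s1 = [math.sin(2 * math.pi * 0.01 * i) for i in range(n)]   # periodic (sub-Gaussian)
s2 = [random.gauss(0, 1) ** 3 for _ in range(n)]            # heavy-tailed (super-Gaussian)

# Instantaneous linear mixing; the convolutive case generalizes this step.
x1 = [0.8 * a + 0.6 * b for a, b in zip(s1, s2)]
x2 = [0.4 * a - 0.9 * b for a, b in zip(s1, s2)]

def center(v):
    m = sum(v) / len(v)
    return [u - m for u in v]

x1, x2 = center(x1), center(x2)

# Whiten by diagonalizing the 2x2 sample covariance analytically.
c11 = sum(a * a for a in x1) / n
c22 = sum(b * b for b in x2) / n
c12 = sum(a * b for a, b in zip(x1, x2)) / n
phi = 0.5 * math.atan2(2 * c12, c11 - c22)                  # eigenvector angle
l1 = c11 * math.cos(phi) ** 2 + c22 * math.sin(phi) ** 2 + c12 * math.sin(2 * phi)
l2 = c11 * math.sin(phi) ** 2 + c22 * math.cos(phi) ** 2 - c12 * math.sin(2 * phi)
z1 = [(math.cos(phi) * a + math.sin(phi) * b) / math.sqrt(l1) for a, b in zip(x1, x2)]
z2 = [(-math.sin(phi) * a + math.cos(phi) * b) / math.sqrt(l2) for a, b in zip(x1, x2)]

def excess_kurtosis(v):
    m2 = sum(u ** 2 for u in v) / len(v)
    m4 = sum(u ** 4 for u in v) / len(v)
    return m4 / m2 ** 2 - 3.0

# ICA step: scan rotations of the whitened pair, keeping the direction whose
# output has the largest |excess kurtosis| (the non-Gaussianity contrast).
best = max((k * math.pi / 360 for k in range(360)),
           key=lambda th: abs(excess_kurtosis(
               [math.cos(th) * a + math.sin(th) * b for a, b in zip(z1, z2)])))
y = [math.cos(best) * a + math.sin(best) * b for a, b in zip(z1, z2)]
# y recovers the impulsive source s2 up to sign and scale.
```

    In the whitened coordinates any unmixing is an orthogonal rotation, which is why a one-parameter angle scan suffices in the 2x2 case; full convolutive ICA replaces the scalar mixing coefficients with filters and works frequency-by-frequency.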

  13. Establishing evidence of contact transfer in criminal investigation by a novel 'peptide coding' reagent.

    PubMed

    Gooch, James; Koh, Clarissa; Daniel, Barbara; Abbate, Vincenzo; Frascione, Nunzianda

    2015-11-01

    Forensic investigators are often faced with the challenge of forming a logical association between a suspect, object or location and a particular crime. This article documents the development of a novel reagent that may be used to establish evidence of physical contact between items and individuals as a result of criminal activity. Consisting of a fluorescent compound suspended within an oil-based medium, this reagent utilises the addition of short customisable peptide molecules of a specific known sequence as unique owner-registered 'codes'. This product may be applied onto goods or premises of criminal interest and subsequently transferred onto objects that contact target surfaces. Visualisation of the reagent is then achieved via fluorophore excitation, subsequently allowing rapid peptide recovery and analysis. Simple liquid-liquid extraction methods were devised to rapidly isolate the peptide from other reagent components prior to analysis by ESI-MS.

  14. Investigating the use of quick response codes in the gross anatomy laboratory.

    PubMed

    Traser, Courtney J; Hoffman, Leslie A; Seifert, Mark F; Wilson, Adam B

    2015-01-01

    The use of quick response (QR) codes within undergraduate university courses is on the rise, yet literature concerning their use in medical education is scant. This study examined student perceptions on the usefulness of QR codes as learning aids in a medical gross anatomy course, statistically analyzed whether this learning aid impacted student performance, and evaluated whether performance could be explained by the frequency of QR code usage. Question prompts and QR codes tagged on cadaveric specimens and models were available for four weeks as learning aids to medical (n = 155) and doctor of physical therapy (n = 39) students. Each QR code provided answers to posed questions in the form of embedded text or hyperlinked web pages. Students' perceptions were gathered using a formative questionnaire and practical examination scores were used to assess potential gains in student achievement. Overall, students responded positively to the use of QR codes in the gross anatomy laboratory as 89% (57/64) agreed the codes augmented their learning of anatomy. The users' most noticeable objection to using QR codes was the reluctance to bring their smartphones into the gross anatomy laboratory. A comparison between the performance of QR code users and non-users was found to be nonsignificant (P = 0.113), and no significant gains in performance (P = 0.302) were observed after the intervention. Learners welcomed the implementation of QR code technology in the gross anatomy laboratory, yet this intervention had no apparent effect on practical examination performance.

  15. Investigating the Use of Quick Response Codes in the Gross Anatomy Laboratory

    ERIC Educational Resources Information Center

    Traser, Courtney J.; Hoffman, Leslie A.; Seifert, Mark F.; Wilson, Adam B.

    2015-01-01

    The use of quick response (QR) codes within undergraduate university courses is on the rise, yet literature concerning their use in medical education is scant. This study examined student perceptions on the usefulness of QR codes as learning aids in a medical gross anatomy course, statistically analyzed whether this learning aid impacted student…

  16. Investigation of Cool and Hot Executive Function in ODD/CD Independently of ADHD

    ERIC Educational Resources Information Center

    Hobson, Christopher W.; Scott, Stephen; Rubia, Katya

    2011-01-01

    Background: Children with oppositional defiant disorder/conduct disorder (ODD/CD) have shown deficits in "cool" abstract-cognitive, and "hot" reward-related executive function (EF) tasks. However, it is currently unclear to what extent ODD/CD is associated with neuropsychological deficits, independently of attention deficit hyperactivity disorder…

  17. Preliminary numerical investigation of bandwidth effects on CBET using the LPSE-CBET code

    NASA Astrophysics Data System (ADS)

    Bates, Jason; Myatt, Jason; Shaw, John; Weaver, James; Obenschain, Keith; Lehmberg, Robert; Obenschain, Steve

    2016-10-01

    Cross beam energy transfer (CBET) is a significant energy-loss mechanism for direct-drive implosions on the OMEGA laser facility. Recently, a working group that includes participants from the Laboratory for Laser Energetics (LLE) at the University of Rochester and the U.S. Naval Research Laboratory (NRL) was formed to investigate strategies for ameliorating the deleterious effects of CBET. As part of this collaboration, the wave-based code LPSE-CBET developed at LLE has been made available to researchers at NRL and is being used to study the feasibility of suppressing CBET through the enhancement of laser bandwidth by stimulated rotational Raman scattering (SRRS). In this poster, we present some preliminary results on this subject. In particular, we discuss initial efforts to evaluate mitigation levels of 4 discrete Stokes lines from SRRS in air and compare our findings with ray-based simulation results of wavelength-shifted (-6 Å, 0, +6 Å) driver lines on OMEGA. Work Supported by DoE/NNSA.

  18. Investigation of Inconsistent ENDF/B-VII.1 Independent and Cumulative Fission Product Yields with Proposed Revisions

    SciTech Connect

    Pigni, M.T.; Francis, M.W.; Gauld, I.C.

    2015-01-15

    A recent implementation of ENDF/B-VII.1 independent fission product yields and nuclear decay data identified inconsistencies in the data caused by the use of updated nuclear schemes in the decay sub-library that are not reflected in legacy fission product yield data. Recent changes in the decay data sub-library, particularly the delayed neutron branching fractions, result in calculated fission product concentrations that do not agree with the cumulative fission yields in the library as well as with experimental measurements. To address these issues, a comprehensive set of independent fission product yields was generated for thermal and fission spectrum neutron-induced fission for 235,238U and 239,241Pu in order to provide a preliminary assessment of the updated fission product yield data consistency. These updated independent fission product yields were utilized in the ORIGEN code to compare the calculated fission product inventories with experimentally measured inventories, with particular attention given to the noble gases. Another important outcome of this work is the development of fission product yield covariance data necessary for fission product uncertainty quantification. The evaluation methodology combines a sequential Bayesian method to guarantee consistency between independent and cumulative yields along with the physical constraints on the independent yields. This work was motivated by the need to improve the performance of the ENDF/B-VII.1 library for stable and long-lived fission products. The revised fission product yields and the new covariance data are proposed as a revision to the fission yield data currently in ENDF/B-VII.1.
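    The consistency requirement between independent and cumulative yields can be illustrated with a toy decay chain. All nuclide names, yields, and branching fractions below are invented for illustration, not evaluated ENDF/B data: the point is only that a cumulative yield must equal the independent yield plus the branching-weighted feed from all precursors, so revising a branching fraction (e.g., a delayed-neutron Pn value) in the decay sub-library silently invalidates legacy cumulative yields.

```python
# Toy beta-decay chain A -> B -> C, with an assumed 2% delayed-neutron branch
# from B to an alternative daughter Cn. All numbers are illustrative only.
independent = {"A": 0.020, "B": 0.015, "C": 0.004, "Cn": 0.000}
branching = {                       # parent -> {daughter: branching fraction}
    "A": {"B": 1.0},
    "B": {"C": 0.98, "Cn": 0.02},
    "C": {},
    "Cn": {},
}

def cumulative_yields(independent, branching, order):
    """Cumulative yield = independent yield + branching-weighted precursor feed.

    `order` must list parents before daughters (topological order).
    """
    cy = {}
    for nuc in order:
        feed = sum(cy[p] * d.get(nuc, 0.0) for p, d in branching.items() if p in cy)
        cy[nuc] = independent[nuc] + feed
    return cy

cy = cumulative_yields(independent, branching, ["A", "B", "C", "Cn"])
# cy["C"] = 0.004 + 0.98 * (0.020 + 0.015) = 0.0383
```

    Changing the 0.98/0.02 split while keeping the stored cumulative yields fixed is exactly the kind of inconsistency the evaluation's sequential Bayesian adjustment is designed to remove.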

  19. Investigation of inconsistent ENDF/B-VII.1 independent and cumulative fission product yields with proposed revisions

    SciTech Connect

    Pigni, Marco T; Francis, Matthew W; Gauld, Ian C

    2015-01-01

    A recent implementation of ENDF/B-VII.1 independent fission product yields and nuclear decay data identified inconsistencies in the data caused by the use of updated nuclear schemes in the decay sub-library that are not reflected in legacy fission product yield data. Recent changes in the decay data sub-library, particularly the delayed neutron branching fractions, result in calculated fission product concentrations that are incompatible with the cumulative fission yields in the library, and also with experimental measurements. A comprehensive set of independent fission product yields was generated for thermal and fission spectrum neutron-induced fission for 235,238U and 239,241Pu in order to provide a preliminary assessment of the updated fission product yield data consistency. These updated independent fission product yields were utilized in the ORIGEN code to compare the calculated fission product inventories with experimentally measured inventories, with particular attention given to the noble gases. An important outcome of this work is the development of fission product yield covariance data necessary for fission product uncertainty quantification. The evaluation methodology combines a sequential Bayesian method to guarantee consistency between independent and cumulative yields along with the physical constraints on the independent yields. This work was motivated by the need to improve the performance of the ENDF/B-VII.1 library for stable and long-lived fission products, given the inconsistency between the ENDF/B-VII.1 fission product yield and decay data sub-libraries. The revised fission product yields and the new covariance data are proposed as a revision to the fission yield data currently in ENDF/B-VII.1.

  20. Investigation of Inconsistent ENDF/B-VII.1 Independent and Cumulative Fission Product Yields with Proposed Revisions

    NASA Astrophysics Data System (ADS)

    Pigni, M. T.; Francis, M. W.; Gauld, I. C.

    2015-01-01

    A recent implementation of ENDF/B-VII.1 independent fission product yields and nuclear decay data identified inconsistencies in the data caused by the use of updated nuclear schemes in the decay sub-library that are not reflected in legacy fission product yield data. Recent changes in the decay data sub-library, particularly the delayed neutron branching fractions, result in calculated fission product concentrations that do not agree with the cumulative fission yields in the library as well as with experimental measurements. To address these issues, a comprehensive set of independent fission product yields was generated for thermal and fission spectrum neutron-induced fission for 235,238U and 239,241Pu in order to provide a preliminary assessment of the updated fission product yield data consistency. These updated independent fission product yields were utilized in the ORIGEN code to compare the calculated fission product inventories with experimentally measured inventories, with particular attention given to the noble gases. Another important outcome of this work is the development of fission product yield covariance data necessary for fission product uncertainty quantification. The evaluation methodology combines a sequential Bayesian method to guarantee consistency between independent and cumulative yields along with the physical constraints on the independent yields. This work was motivated to improve the performance of the ENDF/B-VII.1 library for stable and long-lived fission products. The revised fission product yields and the new covariance data are proposed as a revision to the fission yield data currently in ENDF/B-VII.1.

  1. AN INVESTIGATION OF NON-INDEPENDENCE OF COMPONENTS OF SCORES ON MULTIPLE-CHOICE TESTS. FINAL REPORT.

    ERIC Educational Resources Information Center

    Zimmerman, Donald W.; Burkheimer, Graham J., Jr.

    Investigation is continued into various effects of non-independent error introduced into multiple-choice test scores as a result of chance guessing success. A model is developed in which the concept of theoretical components of scores is not introduced and in which, therefore, no assumptions regarding any relationship between such components need…

  2. Investigation of Long Non-coding RNA Expression Profiles in the Substantia Nigra of Parkinson's Disease.

    PubMed

    Ni, Yaohui; Huang, Hua; Chen, Yaqin; Cao, Maohong; Zhou, Hongzhi; Zhang, Yuanyuan

    2017-03-01

    Genetics is considered an important risk factor in the pathological changes of Parkinson's disease (PD). The substantia nigra (SN) is thought to be the most vulnerable area in this process. In recent decades, however, few related long non-coding RNAs (lncRNAs) in the SN of PD patients have been identified, and the functions of those lncRNAs have been studied even less. In this study, we sought to investigate the lncRNA expression profiles and their potential functions in the SN of PD patients. We screened lncRNA expression profiles in the SN of PD patients using the lncRNA mining approach from the ArrayExpress database, which included GSE20295. The dataset comprised 11 PD and 14 normal tissue samples. We identified 87 lncRNAs that were altered significantly in the SN during the occurrence of PD. Among these lncRNAs, lncRNA AL049437 and lncRNA AK021630 varied most dramatically: AL049437 was up-regulated in the PD samples, while AK021630 was down-regulated. Based on these results, we focused on the potential roles of the two lncRNAs in the pathogenesis of PD by knocking down the expression of AL049437 or AK021630 in the human neuroblastoma SH-SY5Y cell line. Results indicated that the reduction in AL049437 level increased cell viability, mitochondrial transmembrane potential (Δψm), mitochondrial mass, and tyrosine hydroxylase (TyrH) secretion. By contrast, the knockdown of AK021630 resulted in the opposite effect. Based on these results, we speculated that lncRNA AL049437 likely contributed to the risk of PD, while lncRNA AK021630 likely inhibited the occurrence of PD.

  3. A Parallel Independent Component Analysis Approach to Investigate Genomic Influence on Brain Function

    PubMed Central

    Liu, Jingyu; Demirci, Oguz; Calhoun, Vince D.

    2009-01-01

    Relationships between genomic data and functional brain images are of great interest but require new analysis approaches to integrate the high-dimensional data types. This letter presents an extension of a technique called parallel independent component analysis (paraICA), which enables the joint analysis of multiple modalities including interconnections between them. We extend our earlier work by allowing for multiple interconnections and by providing important overfitting controls. Performance was assessed by simulations under different conditions, and indicated reliable results can be extracted by properly balancing overfitting and underfitting. An application to functional magnetic resonance images and single nucleotide polymorphism array produced interesting findings. PMID:19834575

  4. Role asymmetry and code transmission in signaling games: an experimental and computational investigation.

    PubMed

    Moreno, Maggie; Baggio, Giosuè

    2015-07-01

    In signaling games, a sender has private access to a state of affairs and uses a signal to inform a receiver about that state. If no common association of signals and states is initially available, sender and receiver must coordinate to develop one. How do players divide coordination labor? We show experimentally that, if players switch roles at each communication round, coordination labor is shared. However, in games with fixed roles, coordination labor is divided: Receivers adjust their mappings more frequently, whereas senders maintain the initial code, which is transmitted to receivers and becomes the common code. In a series of computer simulations, player and role asymmetry as observed experimentally were accounted for by a model in which the receiver in the first signaling round has a higher chance of adjusting its code than its partner. From this basic division of labor among players, certain properties of role asymmetry, in particular correlations with game complexity, are seen to follow.
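    The division-of-labor finding lends itself to a small simulation sketch. This is a minimal Lewis-style signaling game under assumptions of my own (three states, a hard-coded injective sender code, and hypothetical revision probabilities), not the authors' actual computational model: with fixed roles, a receiver that revises failed mappings far more often than the sender converges on the sender's initial code.

```python
import random

random.seed(3)
N = 3                                    # states = signals = actions

def play(sender_adjust, receiver_adjust, rounds=300):
    """Fixed-role signaling game with asymmetric revision probabilities."""
    sender = {0: 1, 1: 2, 2: 0}          # sender's initial (injective) code
    receiver = {m: random.randrange(N) for m in range(N)}
    for _ in range(rounds):
        state = random.randrange(N)      # sender privately observes a state
        signal = sender[state]
        if receiver[signal] != state:    # coordination failure this round
            if random.random() < receiver_adjust:
                receiver[signal] = state             # receiver adapts often
            if random.random() < sender_adjust:
                sender[state] = random.randrange(N)  # sender rarely revises
    return sender, receiver

# Receivers revise failed mappings; here the sender keeps its initial code.
sender, receiver = play(sender_adjust=0.0, receiver_adjust=0.9)
hits = sum(receiver[sender[s]] == s for s in range(N))
# The receiver has converged on the sender's code: hits == 3
```

    Raising `sender_adjust` toward `receiver_adjust` models the role-switching condition, in which neither player's initial code is privileged and coordination labor is shared.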

  5. Write to Read: Investigating the Reading-Writing Relationship of Code-Level Early Literacy Skills

    ERIC Educational Resources Information Center

    Jones, Cindy D.; Reutzel, D. Ray

    2015-01-01

    The purpose of this study was to examine whether the code-related features used in current methods of writing instruction in kindergarten classrooms transfer to reading outcomes for kindergarten students. We randomly assigned kindergarten students to 3 instructional groups: a writing workshop group, an interactive writing group, and a control group.…

  6. Investigating the semantic interoperability of laboratory data exchanged using LOINC codes in three large institutions.

    PubMed

    Lin, Ming-Chin; Vreeman, Daniel J; Huff, Stanley M

    2011-01-01

    LOINC codes are seeing increased use in many organizations. In this study, we examined the barriers to semantic interoperability that still exist in electronic data exchange of laboratory results even when LOINC codes are being used as the observation identifiers. We analyzed semantic interoperability of laboratory data exchanged using LOINC codes in three large institutions. To simplify the analytic process, we divided the laboratory data into quantitative and non-quantitative tests. The analysis revealed many inconsistencies even when LOINC codes are used to exchange laboratory data. For quantitative tests, the most frequent problems were inconsistencies in the use of units of measure: variations in the strings used to represent units (unrecognized synonyms), use of units that result in different magnitudes of the numeric quantity, and missing units of measure. For non-quantitative tests, the most frequent problems were acronyms/synonyms, different classes of elements in enumerated lists, and the use of free text. Our findings highlight the limitations of interoperability in current laboratory reporting.
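    The quantitative-unit problems this study documents (synonymous unit strings, magnitude differences, missing units) are commonly handled with a normalization layer in front of comparison logic. A minimal sketch follows; the unit table, scale factors, and reference unit are invented for illustration and are not a real interface's mappings:

```python
# Hypothetical unit-normalization table: reported unit strings mapped to a
# canonical label and a scale factor into a common reference unit (mg/dL).
UNIT_TABLE = {
    "mg/dl": ("mg/dL", 1.0),
    "mg %":  ("mg/dL", 1.0),      # legacy synonym for mg/dL
    "g/l":   ("g/L", 100.0),      # 1 g/L = 100 mg/dL (magnitude difference)
}

def to_reference(value, unit):
    """Convert a reported value to the reference unit; None flags an unmapped unit."""
    entry = UNIT_TABLE.get(unit.strip().lower())
    if entry is None:
        return None               # missing or unrecognized unit: route to review
    return value * entry[1]

# The same glucose result reported three different ways by three senders:
reports = [(90.0, "mg/dL"), (90.0, "mg %"), (0.9, "g/L")]
normalized = [to_reference(v, u) for v, u in reports]   # all comparable now
```

    Returning `None` rather than guessing mirrors the paper's finding that missing units are an interoperability failure in their own right and should be surfaced, not silently defaulted.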

  7. Culture-Dependent and -Independent Methods To Investigate the Microbial Ecology of Italian Fermented Sausages

    PubMed Central

    Rantsiou, Kalliopi; Urso, Rosalinda; Iacumin, Lucilla; Cantoni, Carlo; Cattaneo, Patrizia; Comi, Giuseppe; Cocolin, Luca

    2005-01-01

    In this study, the microbial ecology of three naturally fermented sausages produced in northeast Italy was studied by culture-dependent and -independent methods. By plating analysis, the predominance of lactic acid bacteria populations was pointed out, as well as the importance of coagulase-negative cocci. In one fermentation, the fecal enterococci also reached significant counts, highlighting their contribution to the particular transformation process. Yeast counts were higher than the detection limit (>100 CFU/g) in only one fermented sausage. Analysis of the denaturing gradient gel electrophoresis (DGGE) patterns and sequencing of the bands allowed profiling of the microbial populations present in the sausages during fermentation. The bacterial ecology was mainly characterized by the stable presence of Lactobacillus curvatus and Lactobacillus sakei, but Lactobacillus paracasei was also repeatedly detected. An important piece of evidence was the presence of Lactococcus garvieae, which clearly contributed in two fermentations. Several species of Staphylococcus were also detected. Regarding other bacterial groups, Bacillus sp., Ruminococcus sp., and Macrococcus caseolyticus were also identified at the beginning of the transformations. In addition, yeast species belonging to Debaryomyces hansenii, several Candida species, and Williopsis saturnus were observed in the DGGE gels. Finally, cluster analysis of the bacterial and yeast DGGE profiles highlighted the uniqueness of the fermentation processes studied. PMID:15812029

  8. Nye County nuclear waste repository project office independent scientific investigations program. Summary annual report, May 1996--April 1997

    SciTech Connect

    1997-05-01

    This annual summary report, prepared by Multimedia Environmental Technology, Inc. (MET) on behalf of the Nye County Nuclear Waste Project Office, summarizes the activities that were performed during the period from May 1, 1996 to April 30, 1997. These activities were conducted in support of the Independent Scientific Investigation Program (ISIP) of Nye County at the Yucca Mountain Site (YMS). The Nye County NWRPO is responsible for protecting the health and safety of the Nye County residents. NWRPO's on-site representative is responsible for designing and implementing the ISIP. Major objectives of the ISIP include: (1) investigating key issues related to conceptual design and performance of the repository that can have major impact on human health, safety, and the environment; (2) identifying areas not being addressed adequately by DOE. Nye County has identified several key scientific issues of concern that may affect repository design and performance which were not being adequately addressed by DOE. Nye County has been conducting its own independent study to evaluate the significance of these issues.

  9. Nye County Nuclear Waste Repository Project Office independent scientific investigations program annual report, May 1997--April 1998

    SciTech Connect

    1998-07-01

    This annual summary report, prepared by the Nye County Nuclear Waste Repository Project Office (NWRPO), summarizes the activities that were performed during the period from May 1, 1997 to April 30, 1998. These activities were conducted in support of the Independent Scientific Investigation Program (ISIP) of Nye County at the Yucca Mountain Site (YMS). The Nye County NWRPO is responsible for protecting the health and safety of the Nye County residents. NWRPO's on-site representative is responsible for designing and implementing the Independent Scientific Investigation Program (ISIP). Major objectives of the ISIP include: Investigating key issues related to conceptual design and performance of the repository that can have major impact on human health, safety, and the environment; identifying areas not being addressed adequately by the Department of Energy (DOE). Nye County has identified several key scientific issues of concern that may affect repository design and performance which were not being adequately addressed by DOE. Nye County has been conducting its own independent study to evaluate the significance of these issues. This report summarizes the results of monitoring from two boreholes and the Exploratory Studies Facility (ESF) tunnel that have been instrumented by Nye County since March and April of 1995. The preliminary data and interpretations presented in this report do not constitute and should not be considered as the official position of Nye County. The ISIP presently includes borehole and tunnel instrumentation, monitoring, data analysis, and numerical modeling activities to address the concerns of Nye County.

  10. A Monte Carlo Investigation of the Analysis of Variance Applied to Non-Independent Bernoulli Variates.

    ERIC Educational Resources Information Center

    Draper, John F., Jr.

    The applicability of the Analysis of Variance, ANOVA, procedures to the analysis of dichotomous repeated measure data is described. The design models for which data were simulated in this investigation were chosen to represent simple cases of two experimental situations: situation one, in which subjects' responses to a single randomly selected set…
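    The general shape of such a Monte Carlo study can be sketched as follows. The group sizes, response probability, replication count, and the approximate F critical value below are assumptions for illustration, not the report's actual design: simulate dichotomous (Bernoulli) responses under the null hypothesis and record how often the one-way ANOVA F statistic exceeds its nominal 5% critical value.

```python
import random

random.seed(11)

K, N_PER, SIMS = 3, 20, 2000        # groups, subjects per group, replications
F_CRIT = 3.16                       # approx. upper 5% point of F(2, 57) (assumed)

def one_way_F(groups):
    """One-way ANOVA F statistic: between-group over within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    msb = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means)) / (k - 1)
    msw = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means)) / (n - k)
    return msb / msw if msw > 0 else float("inf")

rejections = 0
for _ in range(SIMS):
    # Null hypothesis true: every group draws independent Bernoulli(0.5) scores.
    groups = [[float(random.random() < 0.5) for _ in range(N_PER)] for _ in range(K)]
    if one_way_F(groups) > F_CRIT:
        rejections += 1

rate = rejections / SIMS            # empirical Type I error rate, nominal 0.05
```

    Replacing the independent draws with correlated within-subject responses, as in the simulated guessing scenarios, is what shifts the empirical rate away from the nominal level.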

  11. An Investigation of Two Acoustic Propagation Codes for Three-Dimensional Geometries

    NASA Technical Reports Server (NTRS)

    Nark, D. M.; Watson, W. R.; Jones, M. G.

    2005-01-01

    The ability to predict fan noise within complex three-dimensional aircraft engine nacelle geometries is a valuable tool in studying low-noise designs. Recent years have seen the development of aeroacoustic propagation codes using various levels of approximation to obtain such a capability. In light of this, it is beneficial to pursue a design paradigm that incorporates the strengths of the various tools. The development of a quasi-3D methodology (Q3D-FEM) at NASA Langley has brought these ideas to mind in relation to the framework of the CDUCT-LaRC acoustic propagation and radiation tool. As more extensive three dimensional codes become available, it would seem appropriate to incorporate these tools into a framework similar to CDUCT-LaRC and use them in a complementary manner. This work focuses on such an approach in beginning the steps toward a systematic assessment of the errors, and hence the trade-offs, involved in the use of these codes. To illustrate this point, CDUCT-LaRC was used to study benchmark hardwall duct problems to quantify errors caused by wave propagation in directions far removed from that defined by the parabolic approximation. Configurations incorporating acoustic treatment were also studied with CDUCT-LaRC and Q3D-FEM. The cases presented show that acoustic treatment diminishes the effects of CDUCT-LaRC phase error as the solutions are attenuated. The results of the Q3D-FEM were very promising and matched the analytic solution very well. Overall, these tests were meant to serve as a step toward the systematic study of errors inherent in the propagation module of CDUCT-LaRC, as well as an initial test of the higher fidelity Q3D-FEM code.

  12. Magnetic reconnection in multispecies plasmas investigated by a kinetic fluid code

    NASA Astrophysics Data System (ADS)

    Dong, Chuanfei; Wang, Liang; Bhattacharjee, Amitava; Hakim, Ammar; Huang, Yi-Min; Germaschewski, Kai

    2016-10-01

    We study the reconnection process in multispecies plasmas using Gkeyll, a kinetic fluid code that solves the continuity, momentum, and energy equations of each species together with the full Maxwell equations; no generalized Ohm's law approximation is required. We examine reconnection in a plasma consisting of electrons, protons, and oxygen ions. If time allows, we also plan to show some preliminary results on magnetic reconnection in dusty plasmas with negatively charged dust.

  13. Clinical investigation of TROP-2 as an independent biomarker and potential therapeutic target in colon cancer.

    PubMed

    Zhao, Peng; Yu, Hai-Zheng; Cai, Jian-Hui

    2015-09-01

    Colon cancer is associated with a severe demographic and economic burden worldwide. The pathogenesis of colon cancer is highly complex and involves sequential genetic and epigenetic mechanisms. Despite extensive investigation, the pathogenesis of colon cancer remains to be elucidated. As the third most common type of cancer worldwide, the treatment options for colon cancer are currently limited. Human trophoblast cell‑surface marker (TROP‑2) is a cell‑surface transmembrane glycoprotein overexpressed by several types of epithelial carcinoma. In addition, TROP‑2 has been demonstrated to be associated with tumorigenesis and invasiveness in solid tumor types. The aim of the present study was to investigate the protein expression of TROP‑2 in colon cancer tissues, and further explore the association between the expression of TROP‑2 and clinicopathological features of patients with colon cancer. The expression and localization of the TROP‑2 protein were examined using western blot analysis and immunofluorescence staining. Finally, TROP‑2 expression was correlated with conventional clinicopathological features of colon cancer using a χ2 test. The results revealed that TROP‑2 protein was expressed at high levels in the colon cancer tissues, which was associated with the development and pathological process of colon cancer. Therefore, TROP‑2 may be used as a biomarker to determine the clinical prognosis, and as a potential therapeutic target in colon cancer.

  14. Investigation of low temperature solid oxide fuel cells for air-independent UUV applications

    NASA Astrophysics Data System (ADS)

    Moton, Jennie Mariko

    Unmanned underwater vehicles (UUVs) will benefit greatly from high energy density (> 500 Wh/L) power systems utilizing high-energy-density fuels and air-independent oxidizers. Current battery-based systems have limited energy densities (< 400 Wh/L), which motivates development of alternative power systems such as solid oxide fuel cells (SOFCs). SOFC-based power systems have the potential to achieve the required UUV energy densities, and the current study explores how SOFCs based on gadolinia-doped ceria (GDC) electrolytes with operating temperatures of 650°C and lower may operate in the unique environments of a promising UUV power plant. The plant would contain a H2O2 decomposition reactor to supply humidified O2 to the SOFC cathode and an exothermic aluminum/H2O combustor to provide heated humidified H2 fuel to the anode. To characterize low-temperature SOFC performance with these unique O2 and H2 sources, SOFC button cells based on nickel/GDC (Gd0.1Ce0.9O1.95) anodes, GDC electrolytes, and lanthanum strontium cobalt ferrite (La0.6Sr0.4Co0.2Fe0.8O3-δ or LSCF)/GDC cathodes were fabricated and tested for performance and stability with humidity on both the anode and the cathode. Cells were also tested with various reactant concentrations of H2 and O2 to simulate gas depletion down the channel of an SOFC stack. Results showed that anode performance depended primarily on fuel concentration and less on the associated increase in product H2O. O2 depletion with humidified cathode flows also caused significant loss in cell current density at a given voltage. With humidified flows on either the anode or cathode, stability tests of the button cells at 650 °C showed that stable voltage is maintained at low operating current (0.17 A/cm2) at up to 50 % by mole H2O, but at higher current densities (0.34 A/cm2), irreversible voltage degradation occurred at rates of 0.8-3.7 mV/hour depending on exposure time. From these button cell results, estimated average

  15. Investigation of the Fission Product Release From Molten Pools Under Oxidizing Conditions With the Code RELOS

    SciTech Connect

    Kleinhietpass, Ingo D.; Unger, Hermann; Wagner, Hermann-Josef; Koch, Marco K.

    2006-07-01

    System codes for modeling and calculating core behavior during severe accidents in nuclear power plants are under development worldwide. Modeling of radionuclide release and transport in the case of beyond-design-basis accidents is an integrated feature of the deterministic safety analysis of nuclear power plants. Following a hypothetical, uncontrolled temperature escalation in the core of light water reactors, significant parts of the core structures may degrade and melt down under formation of molten pools, leading to an accumulation of large amounts of radioactive materials. The possible release of radionuclides from the molten pool provides a potential contribution to the aerosol source term in the late phase of core degradation accidents. The relevance of the amount of oxygen transferred from the gas atmosphere into the molten pool to the speciation of a radionuclide and its release depends strongly on the initial oxygen inventory. Particularly for a low oxygen potential in the melt, as is the case under stratification when a metallic phase forms the upper layer and, respectively, when the oxidation has proceeded so far that zirconium was completely oxidized, a significant influence of atmospheric oxygen on the speciation and the release of some radionuclides has to be anticipated. The code RELOS (Release of Low Volatile Fission Products from Molten Surfaces) is under development at the Department of Energy Systems and Energy Economics (formerly Department of Nuclear and New Energy Systems) of the Ruhr-University Bochum. It is based on a mechanistic model to describe the diffusive and convective transport of fission products from the surface of a molten pool into a cooler gas atmosphere. This paper presents the code RELOS, i.e., the features and abilities of the latest code version V2.3, the new model improvements of V2.4, and the calculated results evaluating the implemented models which deal with the oxygen transfer from the

  16. Safety Related Investigations of the VVER-1000 Reactor Type by the Coupled Code System TRACE/PARCS

    NASA Astrophysics Data System (ADS)

    Jaeger, Wadim; Espinoza, Victor Hugo Sánchez; Lischke, Wolfgang

    This study was performed at the Institute of Reactor Safety at the Forschungszentrum Karlsruhe. It is embedded in the ongoing investigations of the international code assessment and maintenance program (CAMP) for qualification and validation of system codes like TRACE(1) and PARCS(2). The reactor type chosen to validate these two codes was the Russian-designed VVER-1000, because the OECD/NEA VVER-1000 Coolant Transient Benchmark Phase 2(3) includes detailed information on the Bulgarian nuclear power plant (NPP) Kozloduy unit 6. The post-test investigations of a coolant mixing experiment have shown that the predicted parameters (coolant temperature, pressure drop, etc.) are in good agreement with the measured data. The coolant mixing pattern, especially in the downcomer, has also been reproduced quite well by TRACE. The coupled code system TRACE/PARCS, which was applied to a postulated main steam line break (MSLB), provided good results compared to reference values and to those of other participants in the benchmark. The results show that the developed three-dimensional nodalization of the reactor pressure vessel (RPV) is appropriate to describe the coolant mixing phenomena in the downcomer and the lower plenum of a VVER-1000 reactor. This phenomenon is a key issue for investigations of the MSLB transient, where the thermal hydraulics and the core neutronics are strongly linked. The simulation of the RPV and core behavior for postulated transients using the validated 3D TRACE RPV model, taking into account boundary conditions at the vessel inlet and outlet, indicates that the results are physically sound and in good agreement with other participants' results.

  17. Training camp: The quest to become a new National Institutes of Health (NIH)-funded independent investigator

    NASA Astrophysics Data System (ADS)

    Sklare, Daniel A.

    2003-04-01

    This presentation will provide information on the research training and career development programs of the National Institute on Deafness and Other Communication Disorders (NIDCD). The predoctoral and postdoctoral fellowship (F30, F31, F32) programs and the research career development awards for clinically trained individuals (K08/K23) and for individuals trained in the quantitative sciences and in engineering (K25) will be highlighted. In addition, the role of the NIDCD Small Grant (R03) in transitioning postdoctoral-level investigators to research independence will be underscored.

  18. Flight investigation of cockpit-displayed traffic information utilizing coded symbology in an advanced operational environment

    NASA Technical Reports Server (NTRS)

    Abbott, T. S.; Moen, G. C.; Person, L. H., Jr.; Keyser, G. L., Jr.; Yenni, K. R.; Garren, J. F., Jr.

    1980-01-01

    Traffic symbology was encoded to provide additional information concerning the traffic, which was displayed on the pilot's electronic horizontal situation indicators (EHSI). A research airplane representing an advanced operational environment was used to assess the benefit of coded traffic symbology in a realistic work-load environment. Traffic scenarios, involving both conflict-free and conflict situations, were employed. Subjective pilot commentary was obtained through the use of a questionnaire and extensive pilot debriefings. These results grouped conveniently under two categories: display factors and task performance. A major item under the display factor category was the problem of display clutter. The primary contributors to clutter were the use of large map-scale factors, the use of traffic data blocks, and the presentation of more than a few airplanes. In terms of task performance, the cockpit-displayed traffic information was found to provide excellent overall situation awareness. Additionally, mile separation prescribed during these tests.

  19. Dimensionality of ICA in resting-state fMRI investigated by feature optimized classification of independent components with SVM.

    PubMed

    Wang, Yanlu; Li, Tie-Qiang

    2015-01-01

    Different machine learning algorithms have recently been used for assisting automated classification of independent component analysis (ICA) results from resting-state fMRI data. The success of this approach relies on identification of artifact components and meaningful functional networks. A limiting factor of ICA is the uncertainty of the number of independent components (NIC). We aim to develop a framework based on support vector machines (SVM) and optimized feature selection for automated classification of independent components (ICs) and use the framework to investigate the effects of the input NIC on the ICA results. Seven different resting-state fMRI datasets were studied. Eighteen features were devised by mimicking the empirical criteria for manual evaluation. The five most significant (p < 0.01) features were identified by general linear modeling and used to generate a classification model for the framework. This feature-optimized classification of ICs with SVM (FOCIS) framework was used to classify both group and single-subject ICA results. The classification results obtained using FOCIS and the previously published FSL-FIX were compared against manually evaluated results. On average, the false negative rates in identifying artifact-contaminated ICs for FOCIS and FSL-FIX were 98.27% and 92.34%, respectively. The number of artifact and functional network components increased almost linearly with the input NIC. Through tracking, we demonstrate that incrementing NIC affects most ICs when NIC < 33, whereas only a few ICs are affected by direct splitting when NIC is incremented beyond 40. For a given IC, its changes with increasing NIC are individually specific, irrespective of whether the component is a potential resting-state functional network or an artifact component. Using FOCIS, we investigated experimentally the ICA dimensionality of resting-state fMRI datasets and found that the input NIC can critically affect the ICA results of resting-state fMRI data.
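The FOCIS recipe summarized above — derive per-IC features, keep the most discriminative ones, then train a classifier — can be sketched on synthetic data. Everything below is illustrative, not the paper's actual pipeline: the features are random stand-ins, the selection uses point-biserial correlation in place of the GLM, and a nearest-centroid rule stands in for the SVM to keep the sketch dependency-free.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "IC features": artifact ICs (label 1) differ from network ICs
# (label 0) in the first five features only; the remaining 13 are noise.
n_ics, n_feat = 200, 18
labels = rng.integers(0, 2, n_ics)
X = rng.normal(size=(n_ics, n_feat))
X[labels == 1, :5] += 1.5          # informative features: indices 0-4

# Feature selection: rank features by point-biserial correlation with labels
r = np.array([np.corrcoef(X[:, j], labels)[0, 1] for j in range(n_feat)])
top5 = np.argsort(-np.abs(r))[:5]   # keep the 5 most discriminative

# Minimal classifier (nearest centroid) trained on the selected features
train, test = np.arange(0, 150), np.arange(150, n_ics)
Xt = X[:, top5]
c0 = Xt[train][labels[train] == 0].mean(axis=0)
c1 = Xt[train][labels[train] == 1].mean(axis=0)
pred = (np.linalg.norm(Xt[test] - c1, axis=1)
        < np.linalg.norm(Xt[test] - c0, axis=1)).astype(int)
accuracy = (pred == labels[test]).mean()
print(f"selected features: {sorted(top5.tolist())}, test accuracy: {accuracy:.2f}")
```

With well-separated synthetic classes the selection recovers the informative features and the held-out accuracy is high; on real ICs the interesting part is, as the abstract notes, how the feature significances and class boundaries shift with the input NIC.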

  20. Theoretical models and simulation codes to investigate bystander effects and cellular communication at low doses

    NASA Astrophysics Data System (ADS)

    Ballarini, F.; Alloni, D.; Facoetti, A.; Mairani, A.; Nano, R.; Ottolenghi, A.

    Astronauts in space are continuously exposed to low doses of ionizing radiation from Galactic Cosmic Rays. During the last ten years, the effects of low radiation doses have been widely re-discussed following a large number of observations on so-called non-targeted effects, in particular bystander effects. The latter consist of induction of cytogenetic damage in cells not directly traversed by radiation, most likely as a response to molecular messengers released by directly irradiated cells. Bystander effects, which are observed both for lethal endpoints (e.g. clonogenic inactivation and apoptosis) and for non-lethal ones (e.g. mutations and neoplastic transformation), tend to show non-linear dose responses. This might have significant consequences in terms of low-dose risk, which is generally calculated on the basis of the Linear No Threshold hypothesis. Although the mechanisms underlying bystander effects are still largely unknown, it is now clear that two types of cellular communication (i.e. via gap junctions and/or release of molecular messengers into the extracellular environment) play a fundamental role. Theoretical models and simulation codes can be of help in elucidating such mechanisms. In the present paper we will review different available modelling approaches, including one that is being developed at the University of Pavia. The focus will be on the different assumptions adopted by the various authors and on the implications of such assumptions in terms of non-targeted radiobiological damage and, more generally, low-dose

  1. Preliminary investigations on 3D PIC simulation of DPHC structure using NEPTUNE3D code

    NASA Astrophysics Data System (ADS)

    Zhao, Hailong; Dong, Ye; Zhou, Haijing; Zou, Wenkang; Wang, Qiang

    2016-10-01

    A cubic region (34 cm × 34 cm × 18 cm) including the double post-hole convolute (DPHC) structure was chosen to perform a series of fully 3D PIC simulations using the NEPTUNE3D code; massive data (~200 GB) could be acquired and solved in less than 5 hours. Cold-chamber tests were performed in which only cathode electron emission was considered, without temperature rise or ion emission; current loss efficiency was estimated by comparing output magnetic field profiles with and without electron emission. PIC simulation results showed three stages of the current transfer process with electron emission in the DPHC structure; the maximum (~20%) current loss was 437 kA at 15 ns, while only 0.46%-0.48% was lost when the driving current reached its peak. The DPHC structure proved to serve valuable functions during the energy transfer process in the PTS facility, and NEPTUNE3D provided tools to explore this sophisticated physics. Project supported by the National Natural Science Foundation of China, Grant Nos. 11571293 and 11505172.

  2. Investigations of high-speed optical transmission systems employing Absolute Added Correlative Coding (AACC)

    NASA Astrophysics Data System (ADS)

    Dong-Nhat, Nguyen; Elsherif, Mohamed A.; Malekmohammadi, Amin

    2016-07-01

    A novel multilevel modulation format based on partial-response signaling, called Absolute Added Correlative Coding (AACC), is proposed and numerically demonstrated for high-speed fiber-optic communication systems. A bit error rate (BER) estimation model for the proposed multilevel format has also been developed. The performance of AACC is examined and compared against other prevailing on-off-keying and multilevel modulation formats, e.g., non-return-to-zero (NRZ), 50% return-to-zero (RZ), 67% carrier-suppressed return-to-zero (CS-RZ), duobinary and four-level pulse-amplitude modulation (4-PAM), in terms of receiver sensitivity, spectral efficiency and dispersion tolerance. The calculated receiver sensitivity at a BER of 10^-9 and the chromatic dispersion tolerance of the proposed system are ~-28.3 dBm and ~336 ps/nm, respectively. The receiver sensitivity of AACC is shown to be 7.8 dB better than that of 4-PAM in the back-to-back scenario. The comparison results also show a clear advantage of AACC in achieving longer fiber transmission distances, due to the higher dispersion tolerance, in optical access networks.
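The receiver sensitivities above are quoted at a BER of 10^-9. For Gaussian noise, the standard link between the decision Q-factor and BER is BER = ½·erfc(Q/√2), the origin of the familiar "Q ≈ 6 ↔ BER ≈ 10^-9" rule of thumb. A minimal sketch of that building block (binary on-off keying only; a multilevel format such as AACC or 4-PAM needs a per-threshold sum, which the paper's BER model presumably provides):

```python
import math

def ber_from_q(q: float) -> float:
    """Gaussian-noise BER for a single binary decision: 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * math.erfc(q / math.sqrt(2.0))

# Q ~ 6 corresponds to BER ~ 1e-9, the sensitivity criterion used above
for q in (3.0, 6.0, 7.0):
    print(f"Q = {q}: BER ~ {ber_from_q(q):.2e}")
```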

  3. Investigating What Undergraduate Students Know About Science: Results from Complementary Strategies to Code Open-Ended Responses

    NASA Astrophysics Data System (ADS)

    Tijerino, K.; Buxner, S.; Impey, C.; CATS

    2013-04-01

    This paper presents new findings from an ongoing study of undergraduate student science literacy. Using data drawn from a 22-year project and over 11,000 student responses, we present how students' word usage in open-ended responses relates to what it means to study something scientifically. Analysis of students' responses shows that they readily use words commonly associated with science, such as hypothesis, study, method, test, and experiment; but do these responses use scientific words knowledgeably? As with many multifaceted disciplines, demonstration of comprehension varies. This paper presents three different ways that student responses have been coded to investigate their understanding of science: 1) differentiating the quality of a response with a coding scheme; 2) using word counts as an indicator of overall response strength; and 3) coding responses for the quality of the students' response. Building on previous research, comparison of science literacy and open-ended responses demonstrates that knowledge of science facts and vocabulary does not indicate comprehension of the concepts behind those facts and vocabulary. This study employs quantitative and qualitative methods to systematically determine the frequency and meaning of responses to standardized questions, and illustrates how students are able to demonstrate knowledge of vocabulary. However, this knowledge is not indicative of conceptual understanding and poses important questions about how we assess students' understanding of science.

  4. Investigation of independence in inter-animal tumor-type occurrences within the NTP rodent-bioassay database

    SciTech Connect

    Bogen, K.T.; Seilkop, S.

    1993-05-01

    Statistically significant elevation in tumor incidence at multiple histologically distinct sites is occasionally observed among rodent bioassays of chemically induced carcinogenesis. If such data are to be relied on (as they have been, e.g., by the US EPA) for quantitative cancer potency assessment, their proper analysis requires knowledge of the extent to which multiple tumor-type occurrences are independent or uncorrelated within individual bioassay animals. Although difficult to assess in a statistically rigorous fashion, a few significant associations among tumor-type occurrences in rodent bioassays have been reported. However, no comprehensive studies of animal-specific tumor-type occurrences at death or sacrifice have been conducted using the extensive set of available NTP rodent-bioassay data, on which most cancer-potency assessment for environmental chemicals is currently based. This report presents the results of such an analysis conducted on behalf of the National Research Council's Committee on Risk Assessment for Hazardous Air Pollutants. Tumor-type associations among individual animals were examined for ~2500 to 3000 control and ~200 to 600 treated animals using pathology data from 62 B6C3F1 mouse studies and 61 F/344N rat studies obtained from a readily available subset of the NTP carcinogenesis bioassay database. No evidence was found for any large correlation in either the onset probability or the prevalence-at-death or sacrifice of any tumor-type pair investigated in control and treated rats and mice, although a few of the small correlations present were statistically significant. Tumor-type occurrences were in most cases nearly independent, and departures from independence, where they did occur, were small. This finding is qualified in that tumor-type onset correlations were measured only indirectly, given the limited nature of the data analyzed.
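The pairwise independence screen described above can be illustrated on a single 2×2 tumor-occurrence table using the phi coefficient; for a 2×2 table the chi-square statistic is simply χ² = N·φ². The counts below are invented for illustration, not NTP data:

```python
import numpy as np

def phi_and_chi2(table):
    """Phi coefficient and chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]]: rows = tumor type 1 absent/present,
    columns = tumor type 2 absent/present."""
    (a, b), (c, d) = table
    n = a + b + c + d
    phi = (a * d - b * c) / np.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return phi, n * phi**2   # chi2 = N * phi^2 (1 degree of freedom)

# Hypothetical counts for one tumor-type pair in 200 animals
phi, chi2 = phi_and_chi2([[80, 20], [70, 30]])
print(f"phi = {phi:.3f}, chi2 = {chi2:.2f}")
```

Here φ ≈ 0.115 and χ² ≈ 2.67, below the 3.84 critical value (p = 0.05, 1 d.f.): a small, non-significant association, which is the pattern the abstract reports for most tumor-type pairs.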

  5. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  6. Investigating differences in DOAS retrieval codes using MAD-CAT campaign data

    NASA Astrophysics Data System (ADS)

    Peters, Enno; Pinardi, Gaia; Seyler, André; Richter, Andreas; Wittrock, Folkard; Bösch, Tim; Van Roozendael, Michel; Hendrick, François; Drosoglou, Theano; Bais, Alkiviadis F.; Kanaya, Yugo; Zhao, Xiaoyi; Strong, Kimberly; Lampel, Johannes; Volkamer, Rainer; Koenig, Theodore; Ortega, Ivan; Puentedura, Olga; Navarro-Comas, Mónica; Gómez, Laura; Yela González, Margarita; Piters, Ankie; Remmers, Julia; Wang, Yang; Wagner, Thomas; Wang, Shanshan; Saiz-Lopez, Alfonso; García-Nieto, David; Cuevas, Carlos A.; Benavent, Nuria; Querel, Richard; Johnston, Paul; Postylyakov, Oleg; Borovski, Alexander; Elokhov, Alexander; Bruchkouski, Ilya; Liu, Haoran; Liu, Cheng; Hong, Qianqian; Rivera, Claudia; Grutter, Michel; Stremme, Wolfgang; Fahim Khokhar, M.; Khayyam, Junaid; Burrows, John P.

    2017-03-01

    The differential optical absorption spectroscopy (DOAS) method is a well-known remote sensing technique that is nowadays widely used for measurements of atmospheric trace gases, creating the need for harmonization and characterization efforts. In this study, an intercomparison exercise of DOAS retrieval codes from 17 international groups is presented, focusing on NO2 slant columns. The study is based on data collected by one instrument during the Multi-Axis DOAS Comparison campaign for Aerosols and Trace gases (MAD-CAT) in Mainz, Germany, in summer 2013. As data from the same instrument are used by all groups, the results are free of biases due to instrumental differences, which is in contrast to previous intercomparison exercises. While in general an excellent correlation of NO2 slant columns between groups of > 99.98 % (noon reference fits) and > 99.2 % (sequential reference fits) for all elevation angles is found, differences between individual retrievals are as large as 8 % for NO2 slant columns and 100 % for rms residuals at small elevation angles above the horizon. Comprehensive sensitivity studies revealed that absolute slant column differences result predominantly from the choice of the reference spectrum, while relative differences originate from the numerical approach for solving the DOAS equation as well as the treatment of the slit function. Furthermore, differences in the implementation of the intensity offset correction were found to produce disagreements for measurements close to sunrise (8-10 % for NO2, 80 % for rms residual). The largest effect of ≈ 8 % difference in NO2 was found to arise from the reference treatment, in particular for fits using a sequential reference. In terms of rms fit residual, the reference treatment has only a minor impact. In contrast, the wavelength calibration as well as the intensity offset correction were found to have the largest impact (up to 80 %) on rms residual while having only a minor impact on retrieved NO2.
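The DOAS equation whose numerical treatment differs between groups is, at its core, a linear least-squares fit of the differential optical depth against trace-gas absorption cross sections plus a low-order closure polynomial. A toy retrieval with synthetic Gaussian "cross sections" (invented stand-ins, not real NO2/O4 data) shows that structure:

```python
import numpy as np

rng = np.random.default_rng(1)
wl = np.linspace(425.0, 450.0, 200)            # wavelength grid [nm]

# Two synthetic "absorption cross sections" with narrow Gaussian bands
def band(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

sigma1 = band(432.0, 0.8) + 0.6 * band(441.0, 0.6)   # stand-in for NO2
sigma2 = band(436.5, 1.2)                            # stand-in for O4

true_sc = np.array([2.0, 0.7])                 # "slant columns" (arb. units)
poly_true = 0.05 - 0.002 * (wl - wl.mean())    # broadband continuum

# Measured optical depth: tau = ln(I0/I) = sum_i SC_i * sigma_i + polynomial
tau = true_sc[0] * sigma1 + true_sc[1] * sigma2 + poly_true
tau += rng.normal(scale=1e-3, size=wl.size)    # measurement noise

# DOAS fit: linear least squares over cross sections + 2nd-order polynomial
x = wl - wl.mean()
A = np.column_stack([sigma1, sigma2, np.ones_like(x), x, x**2])
coeffs, *_ = np.linalg.lstsq(A, tau, rcond=None)
print(f"retrieved slant columns: {coeffs[0]:.3f}, {coeffs[1]:.3f}")
```

The fit recovers the true slant columns; the intercomparison differences discussed above arise precisely from choices this sketch glosses over, such as the reference spectrum (here folded into tau), the slit function, the wavelength calibration, and the intensity offset correction.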

  7. Investigation into the flow field around a maneuvering submarine using a Reynolds-averaged Navier-Stokes code

    NASA Astrophysics Data System (ADS)

    Rhee, Bong

    The accurate and efficient prediction of hydrodynamic forces and moments on a maneuvering submarine has been achieved by investigating the flow physics involving the interaction of the vortical flow shed from the sail and the cross-flow boundary layer of the hull. In this investigation, a Reynolds-Averaged Navier-Stokes (RANS) computer code is used to simulate the most important physical effects related to maneuvering. It is applied to a generic axisymmetric body with the relatively simple case of the flow around an unappended hull at an angle of attack. After the code is validated for this simple case, it is validated for the case of a submarine with various appendages attached to the hull moving at an angle of drift. All six components of predicted forces and moments for various drift angles are compared with experimental data. Calculated pressure coefficients along the azimuthal angle are compared with experimental data and discussed to show the effect of the sail and the stern appendages. To understand the main flow features for a submarine in a straight flight, the RANS code is applied to simulate SUBOFF axisymmetric body at zero angle of attack in a straight-line basin. Pressure coefficient, skin friction coefficient, mean velocity components and the Reynolds shear stresses are compared with experimental data and discussed. The physical aspects of the interaction between the vortical flow shed by the sail and the cross-flow boundary layer on the hull are explained in greater detail. The understanding of this interaction is very important to characterize accurately the hydrodynamic behavior of a maneuvering submarine.

  8. Coexistence of two different pseudohypoparathyroidism subtypes (Ia and Ib) in the same kindred with independent Gsα coding mutations and GNAS imprinting defects

    PubMed Central

    Lecumberri, B; Fernández-Rebollo, E; Sentchordi, L; Saavedra, P; Bernal-Chico, A; Pallardo, L F; Jiménez Bustos, J M; Castaño, L; de Santiago, M; Hiort, O; Pérez de Nanclares, G; Bastepe, M

    2011-01-01

    Background Pseudohypoparathyroidism (PHP) defines a rare group of disorders whose common feature is resistance to the parathyroid hormone. Patients with PHP-Ia display additional hormone resistance, Albright hereditary osteodystrophy (AHO) and reduced Gsα activity in easily accessible cells. This form of PHP is associated with heterozygous inactivating mutations in Gsα-coding exons of GNAS, an imprinted gene locus on chromosome 20q13.3. Patients with PHP-Ib typically have isolated parathyroid hormone resistance, lack AHO features and demonstrate normal erythrocyte Gsα activity. Instead of coding Gsα mutations, patients with PHP-Ib display imprinting defects of GNAS, caused, at least in some cases, by genetic mutations within or nearby this gene. Patients Two unrelated PHP families, each of which includes at least one patient with a Gsα coding mutation and another with GNAS loss of imprinting, are reported here. Results One of the patients with GNAS imprinting defects has paternal uniparental isodisomy of chromosome 20q, explaining the observed imprinting abnormalities. The identified Gsα coding mutations include a tetranucleotide deletion in exon 7, which is frequently found in PHP-Ia, and a novel single nucleotide change at the acceptor splice junction of intron 11. Conclusions These molecular data reveal an interesting mixture, in the same family, of both genetic and epigenetic mutations of the same gene. PMID:19858129

  9. Performance investigation of the pulse and Campbelling modes of a fission chamber using a Poisson pulse train simulation code

    NASA Astrophysics Data System (ADS)

    Elter, Zs.; Jammes, C.; Pázsit, I.; Pál, L.; Filliatre, P.

    2015-02-01

    The detectors of the neutron flux monitoring system of the foreseen French GEN-IV sodium-cooled fast reactor (SFR) will be high-temperature fission chambers placed in the reactor vessel in the vicinity of the core. The operation of a fission chamber over a wide range of neutron flux will be feasible provided that the overlap of the applicability of its pulse and Campbelling operational modes is ensured. This paper addresses the question of the linearity of these two modes and also presents our recent efforts to develop a specific code for the simulation of fission chamber pulse trains. The developed simulation code is described and its overall verification is shown. An extensive quantitative investigation was performed to explore the applicability limits of these two standard modes. It was found that for short pulses the overlap between the pulse and Campbelling modes can be guaranteed if the standard deviation of the background noise is not higher than 5% of the pulse amplitude. It was also shown that the Campbelling mode is sensitive to parasitic noise, while the performance of the pulse mode is affected by the stochastic amplitude distributions.
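The Campbelling mode relies on Campbell's theorem: for a Poisson pulse train of rate r and pulse shape f(t), the signal mean is r·∫f dt and its variance is r·∫f² dt, so the variance gives the event rate even when individual pulses pile up. A generic pulse-train sketch (exponential pulse shape and all parameters are arbitrary choices; this is not the authors' simulation code):

```python
import numpy as np

rng = np.random.default_rng(0)

rate, tau, dt, n = 50.0, 0.05, 1e-3, 200_000   # events/s, decay time, bin, bins

# Pulse shape f(t) = exp(-t/tau), sampled over five time constants
t = np.arange(0.0, 5.0 * tau, dt)
f = np.exp(-t / tau)

# Poisson pulse train: Poisson event counts per bin, convolved with the pulse
counts = rng.poisson(rate * dt, size=n)
signal = np.convolve(counts, f)[:n]
signal = signal[len(f):]                       # drop the start-up transient

# Campbell's theorem: mean = r * integral(f), variance = r * integral(f^2)
mean_pred = rate * f.sum() * dt
var_pred = rate * (f**2).sum() * dt
print(f"mean: sim {signal.mean():.3f} vs Campbell {mean_pred:.3f}")
print(f"var : sim {signal.var():.3f} vs Campbell {var_pred:.3f}")
```

The simulated mean and variance match the Campbell predictions to within sampling error; adding a background-noise term to `signal` is the natural way to reproduce the paper's finding that the Campbelling mode is the noise-sensitive one.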

  10. Investigation of Nuclear Data Libraries with TRIPOLI-4 Monte Carlo Code for Sodium-cooled Fast Reactors

    NASA Astrophysics Data System (ADS)

    Lee, Y.-K.; Brun, E.

    2014-04-01

    The sodium-cooled fast neutron reactor ASTRID is currently under design and development in France. The traditional ECCO/ERANOS fast reactor code system used for ASTRID core design calculations relies on a multi-group JEFF-3.1.1 data library. To gauge the use of the ENDF/B-VII.0 and JEFF-3.1.1 nuclear data libraries in fast reactor applications, two recent OECD/NEA computational benchmarks specified by Argonne National Laboratory were calculated. Using the continuous-energy TRIPOLI-4 Monte Carlo transport code, both the ABR-1000 MWth MOX core and the metallic (U-Pu) core were investigated. Under two different fast neutron spectra and two data libraries, ENDF/B-VII.0 and JEFF-3.1.1, reactivity impact studies were performed. Using the JEFF-3.1.1 library under the BOEC (beginning of equilibrium cycle) condition, large reactivity effects of 808 ± 17 pcm and 1208 ± 17 pcm were observed for the ABR-1000 MOX core and the metallic core, respectively. To analyze the causes of these reactivity differences, several TRIPOLI-4 runs using the mixed data libraries feature were performed to identify the nuclides and the nuclear data accounting for the major part of the observed reactivity discrepancies.
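Reactivity differences quoted in pcm follow from two k-effective values via Δρ = (1/k₁ − 1/k₂)·10⁵, with the Monte Carlo 1σ uncertainties combined in quadrature. A small helper, with illustrative k-eff values deliberately chosen to land near the quoted 808 ± 17 pcm (they are not numbers from the benchmark):

```python
import math

def delta_rho_pcm(k1, s1, k2, s2):
    """Reactivity difference between two k-eff values, in pcm, with the
    1-sigma Monte Carlo uncertainties propagated in quadrature."""
    drho = (1.0 / k1 - 1.0 / k2) * 1e5
    sigma = math.hypot(s1 / k1**2, s2 / k2**2) * 1e5
    return drho, sigma

# Illustrative stand-ins for the two-library comparison of the same core
drho, sigma = delta_rho_pcm(1.00000, 0.00012, 1.00815, 0.00012)
print(f"reactivity effect: {drho:.0f} +/- {sigma:.0f} pcm")
```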

  11. Field dependence-independence from a working memory perspective: A dual-task investigation of the Hidden Figures Test.

    PubMed

    Miyake, Akira; Witzki, Alexander H.; Emerson, Michael J.

    2001-07-01

    Field dependence-independence (FDI) is a construct intensively investigated within cognitive style research, but its cognitive underpinnings are not clearly specified. We propose that performance on FDI tasks primarily reflects the operations of the visuospatial and executive components of working memory. We tested this hypothesis in a dual-task experiment with a commonly used measure of FDI, the Hidden Figures Test. The results showed that performance on this test was impaired by concurrent performance of secondary tasks that primarily tap the visuospatial component (spatial tapping) and the executive component (2-back and random number generation), but was almost unaffected by other secondary tasks (simple tapping and articulatory suppression). Moreover, an analysis of secondary task performance ruled out the possibility of strategic trade-offs and revealed an intriguing dissociation for two different sets of "randomness" indices for the random number generation task. These results support the hypothesised mapping between FDI and working memory components and suggest that the dual-task paradigm can provide a useful way to bring underspecified constructs like FDI into closer alignment with theoretical ideas developed within cognitive psychology.

  13. WHPA Code available

    NASA Astrophysics Data System (ADS)

    The Wellhead Protection Area code is now available for distribution by the International Ground Water Modeling Center in Indianapolis, Ind. The WHPA code is a modular, semianalytical, groundwater flow model developed for the U.S. Environmental Protection Agency, Office of Ground Water Protection, designed to assist state and local technical staff with the task of Wellhead Protection Area (WHPA) delineation. A complete news item appeared in Eos, May 1, 1990, p. 690. The model consists of four independent, semianalytical modules that may be used to identify the areal extent of groundwater contribution to one or multiple pumping wells. One module is a general particle tracking program that may be used as a post-processor for two-dimensional, numerical models of groundwater flow. One module incorporates a Monte Carlo approach to investigate the effects of uncertain input parameters on capture zones. Multiple pumping and injection wells may be present, and barrier or stream boundary conditions may be investigated.
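For the simplest configuration handled by semianalytical capture-zone models — a single extraction well in uniform ambient flow — the delineation reduces to closed-form geometry, and pathlines can be traced by particle tracking as in the WHPA modules. A dependency-light sketch with illustrative parameters (this is not the WHPA code itself):

```python
import numpy as np

# One extraction well at the origin in uniform ambient flow along +x.
Q = 500.0    # pumping rate [m^3/day]
b = 10.0     # aquifer thickness [m]
q0 = 0.1     # ambient Darcy flux [m/day]

# Analytic capture-zone geometry for this configuration
x_stag = Q / (2.0 * np.pi * b * q0)   # downstream stagnation point [m]
y_half = Q / (2.0 * b * q0)           # asymptotic capture half-width [m]
print(f"stagnation point: {x_stag:.1f} m, half-width: {y_half:.1f} m")

def flux(p):
    """Darcy flux: uniform flow plus radial flow toward the well."""
    x, y = p
    r2 = x * x + y * y
    c = Q / (2.0 * np.pi * b)
    return np.array([q0 - c * x / r2, -c * y / r2])

# Forward particle tracking (Euler, fixed step length) from an upstream start
p = np.array([-300.0, 50.0])          # inside the capture zone (|y| < y_half)
captured = False
for _ in range(10_000):
    v = flux(p)
    p = p + v / np.linalg.norm(v) * 0.5   # advance 0.5 m along the pathline
    if np.hypot(*p) < 5.0:                # reached the well bore region
        captured = True
        break
print(f"particle captured: {captured}")
```

Running the tracker backward from points around the well, or resampling Q, b and q0 from distributions, gives respectively the pathline-based and Monte Carlo delineations that the WHPA modules automate.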

  14. WHPA Code available

    NASA Astrophysics Data System (ADS)

    The Wellhead Protection Area (WHPA) code is now available for distribution by the International Ground Water Modeling Center in Indianapolis, Ind. The WHPA code is a modular, semi-analytical, groundwater flow model developed for the U.S. Environmental Protection Agency, Office of Ground Water Protection. It is designed to assist state and local technical staff with the task of WHPA delineation. The model consists of four independent, semi-analytical modules that may be used to identify the areal extent of groundwater contribution to one or multiple pumping wells. One module is a general particle tracking program that may be used as a post-processor for two-dimensional, numerical models of groundwater flow. One module incorporates a Monte Carlo approach to investigate the effects of uncertain input parameters on capture zones. Multiple pumping and injection wells may be present, and barrier or stream boundary conditions may be investigated.

  15. A randomised, independent groups study investigating the sympathetic nervous system responses to two manual therapy treatments in patients with LBP.

    PubMed

    Perry, Jo; Green, Ann; Singh, Sally; Watson, Paul

    2015-12-01

    Manual therapy (MT) and exercise therapy techniques are commonly utilised, guideline-recommended treatment strategies in the management of non-specific low back pain (LBP). Preliminary evidence on asymptomatic participants indicates that two manual therapy techniques, the repeated lumbar extension in lying exercise (EIL) and segmental rotational grade V manipulation (manipulation), have significant effects on the sympathetic nervous system (SNS) detectable with skin conductance (SC) responses. However, it is not known whether these responses occur in patients with LBP. A randomised, independent-groups design was utilised to investigate the immediate SC responses in 50 patients with LBP of less than 12 weeks' duration. Patients received either the manipulation technique (n = 25) or the EIL exercise (n = 25), and SC activity was recorded, in a single treatment session, pre-, peri- and post-treatment. Both treatments resulted in a sympatho-excitatory response during the intervention period, with the manipulation technique producing a 255% increase (p < 0.005) and the EIL technique a 94% increase (p = 0.019); both treatments had responses that were sustained into the final rest period (p < 0.005). Between-group comparisons indicate that the manipulation technique had a significantly greater magnitude of effect (p < 0.001). The results support the sympatho-excitatory responses seen in normative studies but challenge the assumption that normative and patient populations are analogous with respect to the magnitude of effect observed, and suggest that SC responses may be a feasible proxy method of detecting dorsal horn sensitisation and neuro-plastic adaptations occurring in the presence of LBP.

  16. Experimental investigation of a 10-percent-thick helicopter rotor airfoil section designed with a viscous transonic analysis code

    NASA Technical Reports Server (NTRS)

    Noonan, K. W.

    1981-01-01

    An investigation was conducted in the Langley 6- by 28-Inch Transonic Tunnel to determine the two-dimensional aerodynamic characteristics of a 10-percent-thick helicopter rotor airfoil at Mach numbers from 0.33 to 0.87 and respective Reynolds numbers from 4.9 × 10^6 to 9.8 × 10^6. This airfoil, designated the RC-10(N)-1, was also investigated at Reynolds numbers from 3.0 × 10^6 to 7.3 × 10^6 at respective Mach numbers of 0.33 to 0.83 for comparison with the SC 1095 (with tab) airfoil. The RC-10(N)-1 airfoil was designed by the use of a viscous transonic analysis code. The results of the investigation indicate that the RC-10(N)-1 airfoil met all the design goals. At a Reynolds number of about 9.4 × 10^6, the drag divergence Mach number at zero normal-force coefficient was 0.815, with a corresponding pitching-moment coefficient of zero. The drag divergence Mach number at a normal-force coefficient of 0.9 and a Reynolds number of about 8.0 × 10^6 was 0.61. The drag divergence Mach number of this new airfoil was higher than that of the SC 1095 airfoil at normal-force coefficients above 0.3. Measurements in the same wind tunnel at comparable Reynolds numbers indicated that the maximum normal-force coefficient of the RC-10(N)-1 airfoil was higher than that of the NACA 0012 airfoil for Mach numbers above about 0.35 and was about the same as that of the SC 1095 airfoil for Mach numbers up to 0.5.

  17. Code-Switching in Japanese Language Classrooms: An Exploratory Investigation of Native vs. Non-Native Speaker Teacher Practice

    ERIC Educational Resources Information Center

    Hobbs, Valerie; Matsuo, Ayumi; Payne, Mark

    2010-01-01

    Research on language classroom code-switching ranges from describing both teachers' and learners' first language and target language use to making connections between code-switching and student learning. However, few studies compare differences in practice between native and non-native speaker teachers and even fewer consider culture of learning…

  18. Investigation of host-cell influence on polyomavirus biological activity and the major virus-coded structural protein VP1

    SciTech Connect

    Ludlow, J.W.

    1987-01-01

    The host cell influence on polyomavirus biological activity and modification of the major virus-coded structural protein VP1 was investigated. Polyoma virions resulting from the permissive infections of primary mouse kidney cells, as compared to virus progeny from primary mouse embryo cells, exhibited a ten-fold greater ability to agglutinate guinea pig erythrocytes, a three-fold lower ability to become internalized into monopinocytotic vesicles, and a two-fold lower ability to initiate a productive infection based on positive nuclear immunofluorescence when using mouse embryo host cells. Mouse kidney cell progeny were found to bind to host cells less specifically than mouse embryo cell progeny. Two-dimensional gel electrophoresis of VP1, following in vivo labeling with ³²P, revealed that species D and E of the mouse kidney cell progeny were phosphorylated to the same degree, while mouse embryo cell progeny VP1 species E and F were phosphorylated equally. In vivo labeling of polyomavirus with ³⁵S-sulfate, followed by gel electrophoretic analysis of the structural proteins and two-dimensional chromatographic analysis of the VP1 amino acids, revealed that species E and F are modified by sulfation of tyrosine. These data suggest that host cells play a role in modulating biological activity of the virus by affecting the degree and site-specific modification of the major capsid protein VP1. Modification of this protein may influence the recognition of virus attachment proteins for specific cellular receptors as well as other biological functions.

  19. Speech coding

    SciTech Connect

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service-provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the
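
    As an illustration of the waveform-coding family the abstract mentions, the sketch below implements mu-law companding, the logarithmic characteristic used in North American G.711 telephony. This is not from the record above; the 8-bit uniform quantizer is a simplification of the real G.711 piecewise-linear segment encoding.

```python
import math

MU = 255  # mu-law parameter used in North American G.711

def mu_law_encode(x: float) -> float:
    """Compress a sample in [-1, 1] with the mu-law characteristic."""
    return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

def mu_law_decode(y: float) -> float:
    """Invert the compression (expansion)."""
    return math.copysign((math.pow(1.0 + MU, abs(y)) - 1.0) / MU, y)

def quantize(y: float, bits: int = 8) -> float:
    """Uniformly quantize the compressed value (a simplification of G.711's
    segmented quantizer)."""
    levels = 2 ** (bits - 1)
    return round(y * levels) / levels
```

    Because the compression expands small amplitudes before uniform quantization, quiet speech samples are reproduced with much finer resolution than a plain 8-bit linear quantizer would give.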

  20. Investigation of fast ion behavior using orbit following Monte-Carlo code in magnetic perturbed field in KSTAR

    NASA Astrophysics Data System (ADS)

    Shinohara, Kouji; Suzuki, Yasuhiro; Kim, Junghee; Kim, Jun Young; Jeon, Young Mu; Bierwage, Andreas; Rhee, Tongnyeol

    2016-11-01

    The fast ion dynamics and the associated heat load on the plasma facing components in the KSTAR tokamak were investigated with the orbit following Monte-Carlo (OFMC) code in several magnetic field configurations and realistic wall geometry. In particular, attention was paid to the effect of resonant magnetic perturbation (RMP) fields. Both the vacuum field approximation as well as the self-consistent field that includes the response of a stationary plasma were considered. In both cases, the magnetic perturbation (MP) is dominated by the toroidal mode number n = 1, but otherwise its structure is strongly affected by the plasma response. The loss of fast ions increased significantly when the MP field was applied. Most loss particles hit the poloidal limiter structure around the outer mid-plane on the low field side, but the distribution of heat loads across the three limiters varied with the form of the MP. Short-timescale loss of supposedly well-confined co-passing fast ions was also observed. These losses started within a few poloidal transits after the fast ion was born deep inside the plasma on the high-field side of the magnetic axis. In the configuration studied, these losses are facilitated by the combination of two factors: (i) the large magnetic drift of fast ions across a wide range of magnetic surfaces due to a low plasma current, and (ii) resonant interactions between the fast ions and magnetic islands that were induced inside the plasma by the external RMP field. These effects are expected to play an important role in present-day tokamaks.

  1. Organisational factors affecting the quality of hospital clinical coding.

    PubMed

    Santos, Suong; Murphy, Gregory; Baxter, Kathryn; Robinson, Kerin M

    2008-01-01

    The influence of organisational factors on the quality of hospital coding using the International Statistical Classification of Diseases and Health Related Problems, 10th Revision, Australian Modification (ICD-10-AM) was investigated using a mixed quantitative-qualitative approach. The organisational variables studied were: hospital specialty; geographical locality; structural characteristics of the coding unit; education, training and resource supports for Clinical Coders; and quality control mechanisms. Baseline data on the hospitals' coding quality, measured by the Performance Indicators for Coding Quality tool, were used as an independent index measure. No differences were found in error rates between rural and metropolitan hospitals, or general and specialist hospitals. Clinical Coder allocation to "general" rather than "specialist" unit coding resulted in fewer errors. Coding Managers reported that coding quality can be improved by: Coders engaging in a variety of role behaviours; improved Coder career opportunities; higher staffing levels; reduced throughput; fewer time constraints on coding outputs and associated work; and increased Coder interactions with medical staff.

  2. CYP2B6 non-coding variation associated with smoking cessation is also associated with differences in allelic expression, splicing, and nicotine metabolism independent of common amino-acid changes.

    PubMed

    Bloom, A Joseph; Martinez, Maribel; Chen, Li-Shiun; Bierut, Laura J; Murphy, Sharon E; Goate, Alison

    2013-01-01

    The Cytochrome P450 2B6 (CYP2B6) enzyme makes a small contribution to hepatic nicotine metabolism relative to CYP2A6, but CYP2B6 is the primary enzyme responsible for metabolism of the smoking cessation drug bupropion. Using CYP2A6 genotype as a covariate, we find that a non-coding polymorphism in CYP2B6 previously associated with smoking cessation (rs8109525) is also significantly associated with nicotine metabolism. The association is independent of the well-studied non-synonymous variants rs3211371, rs3745274, and rs2279343 (CYP2B6*5 and *6). Expression studies demonstrate that rs8109525 is also associated with differences in CYP2B6 mRNA expression in liver biopsy samples. Splicing assays demonstrate that specific splice forms of CYP2B6 are associated with haplotypes defined by variants including rs3745274 and rs8109525. These results indicate differences in mRNA expression and splicing as potential molecular mechanisms by which non-coding variation in CYP2B6 may affect enzymatic activity leading to differences in metabolism and smoking cessation.

  3. Inter-Sentential Patterns of Code-Switching: A Gender-Based Investigation of Male and Female EFL Teachers

    ERIC Educational Resources Information Center

    Gulzar, Malik Ajmal; Farooq, Muhammad Umar; Umer, Muhammad

    2013-01-01

    This article has sought to contribute to discussions concerning the value of inter-sentential patterns of code-switching (henceforth ISPCS) particularly in the context of EFL classrooms. Through a detailed analysis of recorded data produced in that context, distinctive features in the discourse were discerned which were associated with males' and…

  4. Polar Codes

    DTIC Science & Technology

    2014-12-01

    This report compares polar codes against other forward error correction methods: a turbo code, a low-density parity-check (LDPC) code, a Reed–Solomon code, and three convolutional codes. Many civilian systems use LDPC FEC codes, and the Navy is planning to use LDPC for some future systems.
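
    For readers unfamiliar with the polar code construction being compared here, the core operation is the recursive Arikan polar transform over GF(2). Below is a minimal, stdlib-only sketch (not from the report); frozen-bit selection and successive-cancellation decoding, which complete a real polar code, are omitted.

```python
def polar_encode(u):
    """Apply the Arikan polar transform x = u * F^(tensor n) over GF(2).

    `u` is a list of 0/1 bits whose length is a power of two.
    """
    n = len(u)
    if n == 1:
        return u[:]
    half = n // 2
    # Butterfly combine step: the upper half carries u_i XOR u_{i+half}.
    upper = [u[i] ^ u[i + half] for i in range(half)]
    lower = u[half:]
    return polar_encode(upper) + polar_encode(lower)
```

    In an actual polar code, the unreliable positions of `u` are frozen to zero and only the reliable positions carry information bits; the transform itself is the same.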

  5. Supporting the Cybercrime Investigation Process: Effective Discrimination of Source Code Authors Based on Byte-Level Information

    NASA Astrophysics Data System (ADS)

    Frantzeskou, Georgia; Stamatatos, Efstathios; Gritzalis, Stefanos

    Source code authorship analysis is the particular field that attempts to identify the author of a computer program by treating each program as a linguistically analyzable entity. This is usually based on other undisputed program samples from the same author. There are several cases where the application of such a method could be of major benefit, such as tracing the source of code left in the system after a cyber attack, authorship disputes, proof of authorship in court, etc. In this paper, we present our approach, which is based on byte-level n-gram profiles and is an extension of a method that has been successfully applied to natural language text authorship attribution. We propose a simplified profile and a new similarity measure which is less complicated than the algorithm followed in text authorship attribution and seems more suitable for source code identification since it is better able to deal with very small training sets. Experiments were performed on two different data sets, one with programs written in C++ and the second with programs written in Java. Unlike the traditional language-dependent metrics used by previous studies, our approach can be applied to any programming language with no additional cost. The presented accuracy rates are much better than the best reported results for the same data sets.
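
    A minimal sketch of the kind of approach the abstract describes: build a simplified profile of the most frequent byte-level n-grams per author and score candidates by set intersection. The n-gram length, profile size `top`, and function names are illustrative assumptions, not the authors' exact parameters.

```python
from collections import Counter

def profile(source: bytes, n: int = 3, top: int = 500) -> set:
    """Simplified profile: the `top` most frequent byte-level n-grams."""
    grams = Counter(source[i:i + n] for i in range(len(source) - n + 1))
    return {g for g, _ in grams.most_common(top)}

def spi(p1: set, p2: set) -> int:
    """Simplified profile intersection: a larger value means more similar."""
    return len(p1 & p2)

def attribute(unknown: bytes, candidates: dict) -> str:
    """Assign `unknown` to the candidate author with the highest score."""
    unk = profile(unknown)
    return max(candidates, key=lambda author: spi(unk, profile(candidates[author])))
```

    Because the profiles are built from raw bytes rather than tokens, the same pipeline works unchanged for C++, Java, or any other language, which matches the language-independence claim in the abstract.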

  6. Code Disentanglement: Initial Plan

    SciTech Connect

    Wohlbier, John Greaton; Kelley, Timothy M.; Rockefeller, Gabriel M.; Calef, Matthew Thomas

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
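
    The levelization property described above (the `uses` relationships form a directed acyclic graph, and each package sits one level above the packages it uses) can be checked mechanically. A minimal sketch, not part of the EAP plan itself; the `uses` mapping and function names are hypothetical.

```python
def levelize(uses: dict) -> dict:
    """Assign each package a level: leaf packages get level 0, and every
    other package gets 1 + the maximum level among the packages it uses.
    Raises ValueError on a dependency cycle, since a levelized package set
    must form a directed acyclic graph."""
    levels = {}
    visiting = set()  # packages on the current recursion path

    def level(pkg):
        if pkg in levels:
            return levels[pkg]
        if pkg in visiting:
            raise ValueError(f"dependency cycle through {pkg!r}")
        visiting.add(pkg)
        deps = uses.get(pkg, [])
        levels[pkg] = 1 + max((level(d) for d in deps), default=-1)
        visiting.discard(pkg)
        return levels[pkg]

    for pkg in uses:
        level(pkg)
    return levels
```

    Running such a check in continuous integration keeps the package graph acyclic as the disentanglement proceeds, so that a low-level package can never quietly grow a dependency on a higher-level one.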

  7. Independent Technical Investigation of the Puna Geothermal Venture Unplanned Steam Release, June 12 and 13, 1991, Puna, Hawaii

    SciTech Connect

    Thomas, Richard; Whiting, Dick; Moore, James; Milner, Duey

    1991-07-01

    On June 24, 1991, a third-party investigation team consisting of Richard P. Thomas, Duey E. Milner, James L. Moore, and Dick Whiting began an investigation into the blowout of well KS-8, which occurred at the Puna Geothermal Venture (PGV) site on June 12, 1991, and caused the unabated release of steam for a period of 31 hours before PGV succeeded in closing in the well. The scope of the investigation was to: (a) determine the cause(s) of the incident; (b) evaluate the adequacy of PGV's drilling and blowout prevention equipment and procedures; and (c) make recommendations for any appropriate changes in equipment and/or procedures. This report finds that the blowout occurred because of inadequacies in PGV's drilling plan and procedures and not as a result of unusual or unmanageable subsurface geologic or hydrologic conditions. While the geothermal resource in the area being drilled is relatively hot, the temperatures are not excessive for modern technology and methods to control. Fluid pressures encountered are also manageable if proper procedures are followed and the appropriate equipment is utilized. A previous blowout of short duration occurred on February 21, 1991, at the KS-7 injection well being drilled by PGV at a depth of approximately 1600'. This unexpected incident alerted PGV to the possibility of encountering a high temperature, fractured zone at a relatively shallow depth. The experience at KS-7 prompted PGV to refine its hydrological model; however, the drilling plan utilized for KS-8 was not changed.
Not only did PGV fail to modify its drilling program following the KS-7 blowout, but they also failed to heed numerous ''red flags'' (warning signals) in the five days preceding the KS-8 blowout, which included a continuous 1-inch flow of drilling mud out of the wellbore, gains in mud volume while pulling stands, and gas entries while circulating mud bottoms-up, in addition to lost circulation that had occurred earlier below the shoe of the 13-3/8-inch casing.

  8. Clinical coding. Code breakers.

    PubMed

    Mathieson, Steve

    2005-02-24

    --The advent of payment by results has seen the role of the clinical coder pushed to the fore in England. --Examinations for a clinical coding qualification began in 1999. In 2004, approximately 200 people took the qualification. --Trusts are attracting people to the role by offering training from scratch or through modern apprenticeships.

  9. Tokamak Systems Code

    SciTech Connect

    Reid, R.L.; Barrett, R.J.; Brown, T.G.; Gorker, G.E.; Hooper, R.J.; Kalsi, S.S.; Metzler, D.H.; Peng, Y.K.M.; Roth, K.E.; Spampinato, P.T.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  10. A universal primer-independent next-generation sequencing approach for investigations of norovirus outbreaks and novel variants.

    PubMed

    Fonager, Jannik; Stegger, Marc; Rasmussen, Lasse Dam; Poulsen, Mille Weismann; Rønn, Jesper; Andersen, Paal Skytt; Fischer, Thea Kølsen

    2017-04-11

    Norovirus (NoV) is the most common cause of non-bacterial gastroenteritis and is a major agent associated with outbreaks of gastroenteritis. Conventional molecular genotyping analysis of NoV, used for the identification of transmission routes, relies on standard typing methods (STM) by Sanger-sequencing of only a limited part of the NoV genome, which could lead to wrong conclusions. Here, we combined a NoV capture method with next generation sequencing (NGS), which increased the proportion of norovirus reads ~40-fold compared to NGS without prior capture. Of 15 NoV samples from 6 single-genotype outbreaks, near full-genome coverage (>90%) was obtained from 9 samples. Fourteen polymerase (RdRp) and 15 capsid (cap) genotypes were identified compared to 12 and 13 for the STM, respectively. Analysis of 9 samples from two mixed-genotype outbreaks identified 6 RdRp and 6 cap genotypes (two at >90% NoV genome coverage) compared to 4 and 2 for the STM, respectively. Furthermore, complete or partial sequences from the P2 hypervariable region were obtained from 7 of 8 outbreaks and a new NoV recombinant was identified. This approach could therefore strengthen outbreak investigations and could be applied to other important viruses in stool samples such as hepatitis A and enterovirus.

  11. The disposition of impact ejecta resulting from the AIDA-DART mission to binary asteroid 65803 Didymos: an independent investigation

    NASA Astrophysics Data System (ADS)

    Richardson, James E.; O'Brien, David P.

    2016-10-01

    If all goes as planned, in the year 2020 a joint ESA and NASA mission will be launched that will rendezvous with the near-Earth binary asteroid system 65803 Didymos in the fall of 2022. The European component, the Asteroid Impact & Deflection Assessment (AIDA) spacecraft, will arrive first and characterize the system, which consists of a ~800 m diameter primary and a ~160 m diameter secondary, orbiting a common center of mass at a semi-major axis distance of ~1200 m with an orbital period of 11.9 hr. Following system characterization, the AIDA spacecraft will withdraw to a safe distance while the NASA component, the 300 kg Double Asteroid Redirection Test (DART) spacecraft, collides with the trailing edge of the secondary body (with respect to the binary's retrograde mutual orbit). Meanwhile, the AIDA spacecraft will conduct observations of this impact and its aftermath, specifically looking for changes made to the primary, the secondary, and their mutual orbit as a result of the DART collision. Of particular interest is the ballistic flight and final disposition of the ejecta produced by the impact cratering process, not just from the standpoint of scientific study, but also from the standpoint of AIDA spacecraft safety. In this study, we investigate a series of hypothetical DART impacts utilizing a semi-empirical, numerical impact ejecta plume model originally developed for the Deep Impact mission and designed specifically with impacts on small bodies in mind. The resulting excavated mass is discretized into 7200 individual tracer particles, each representing a unique combination of speed, mass, and ejected direction. The trajectory of each tracer is computed numerically under the gravitational influence of both primary and secondary, along with the effects of solar radiation pressure. Each tracer is followed until it either impacts a body or escapes the system, whereupon tracking is continued in the heliocentric frame using an N-body integrator. Various impact

  12. Investigation of island formation due to RMPs in DIII-D plasmas with the SIESTA resistive MHD equilibrium code

    SciTech Connect

    Hirshman, S. P.; Shafer, M. W.; Seal, S. K.; Canik, J. M.

    2016-03-03

    The SIESTA magnetohydrodynamic (MHD) equilibrium code has been used to compute a sequence of ideally stable equilibria resulting from numerical variation of the helical resonant magnetic perturbation (RMP) applied to an axisymmetric DIII-D plasma equilibrium. Increasing the perturbation strength at the dominant m=2, n=-1 resonant surface leads to lower MHD energies and increases in the equilibrium island widths at the m=2 (and sidebands) surfaces, in agreement with theoretical expectations. Island overlap at large perturbation strengths leads to stochastic magnetic fields which correlate well with the experimentally inferred field structure. The magnitude and spatial phase (around the dominant rational surfaces) of the resonant (shielding) component of the parallel current are shown to change qualitatively with the magnetic island topology.

  13. Investigation of island formation due to RMPs in DIII-D plasmas with the SIESTA resistive MHD equilibrium code

    DOE PAGES

    Hirshman, S. P.; Shafer, M. W.; Seal, S. K.; ...

    2016-03-03

    The SIESTA magnetohydrodynamic (MHD) equilibrium code has been used to compute a sequence of ideally stable equilibria resulting from numerical variation of the helical resonant magnetic perturbation (RMP) applied to an axisymmetric DIII-D plasma equilibrium. Increasing the perturbation strength at the dominant m=2, n=-1 resonant surface leads to lower MHD energies and increases in the equilibrium island widths at the m=2 (and sidebands) surfaces, in agreement with theoretical expectations. Island overlap at large perturbation strengths leads to stochastic magnetic fields which correlate well with the experimentally inferred field structure. The magnitude and spatial phase (around the dominant rational surfaces) of the resonant (shielding) component of the parallel current are shown to change qualitatively with the magnetic island topology.

  14. Design and performance investigation of LDPC-coded upstream transmission systems in IM/DD OFDM-PONs

    NASA Astrophysics Data System (ADS)

    Gong, Xiaoxue; Guo, Lei; Wu, Jingjing; Ning, Zhaolong

    2016-12-01

    In Intensity-Modulation Direct-Detection (IM/DD) Orthogonal Frequency Division Multiplexing Passive Optical Networks (OFDM-PONs), aside from Subcarrier-to-Subcarrier Intermixing Interferences (SSII) induced by square-law detection, the use of the same laser frequency for data sent from Optical Network Units (ONUs) results in ONU-to-ONU Beating Interferences (OOBI) at the receiver. To mitigate those interferences, we design a Low-Density Parity Check (LDPC)-coded and spectrum-efficient upstream transmission system. A theoretical channel model is also derived, in order to analyze the detrimental factors influencing system performance. Simulation results demonstrate that the receiver sensitivity is improved by 3.4 dB and 2.5 dB under QPSK and 8QAM, respectively, after 100 km Standard Single-Mode Fiber (SSMF) transmission. Furthermore, the spectrum efficiency can be improved by about 50%.

  15. The effects of left or right hemispheric epilepsy on language networks investigated with semantic decision fMRI task and independent component analysis.

    PubMed

    Karunanayaka, Prasanna; Kim, Kwang Ki; Holland, Scott K; Szaflarski, Jerzy P

    2011-04-01

    Chronic and progressive brain injury, as seen in epilepsy, may alter brain networks that underlie cognitive functions. To evaluate the effect of epilepsy on language functions we investigated the neuroanatomical basis of semantic processing in patients with left (LHE) or right (RHE) hemispheric onset epilepsy using a semantic decision fMRI paradigm and group independent component analysis (ICA); we then compared the results of our investigations with language networks in healthy subjects examined with the same language task (Kim K, Karunanayaka P, Privitera M, Holland S, Szaflarski J. Semantic association investigated with fMRI and independent component analysis. In press). Group ICA is a data-driven technique capable of revealing the functional organization of the human brain based on fMRI data. In addition to providing functional connectivity information, ICA can also provide information about the temporal dynamics of underlying networks subserving specific cognitive functions. In this study, we implemented two complementary analyses to investigate group differences in underlying network dynamics based on associated independent component (IC) time courses (an a priori defined criterion or an a posteriori identified maximum likelihood descriptor). We detected several differences between healthy controls and patients with epilepsy not previously observed with standard fMRI analysis methods. Our analyses confirmed the presence of different effects of LHE or RHE on the behavior of the language network. In particular, a major difference was noted in the nodes subserving verbal encoding and retrieval in the bilateral medial temporal regions. These effects were dependent on the side of the epilepsy onset; that is, effects were different with left or right hemispheric epilepsy. These findings may explain the differences in verbal and nonverbal memory abilities between patients with left and those with right hemispheric epilepsy. Further, although the effects on other nodes of

  16. Investigating Physical Activity in the Etiology of Pancreatic Cancer: The Age at Which This is Measured is Important and is Independent of Body Mass Index

    PubMed Central

    Noor, Nurulamin M.; Banim, Paul J.R.; Luben, Robert N.; Khaw, Kay-Tee; Hart, Andrew R.

    2015-01-01

    Objectives There are plausible biological mechanisms for how increased physical activity (PA) may prevent pancreatic cancer, although findings from epidemiological studies are inconsistent. We investigated whether the risk is dependent on the age at which PA is measured, and whether it is independent of body mass index (BMI). Methods 23 639 participants, aged 40-74 years, were recruited into the EPIC-Norfolk cohort study between 1993 and 1997 and completed validated questionnaires on PA. The cohort was monitored for pancreatic cancer development and hazard ratios (HRs) were estimated, adjusted for covariates. Results Within 17 years, 88 participants developed pancreatic cancer (55% female). There was no association between PA and risk in the cohort (HR trend=1.06, 95% CI=0.86-1.29). However, in participants younger than 60 years, higher PA was associated with decreased risk (highest vs. lowest category HR=0.27, 95% CI=0.07-0.99). Higher PA was not inversely associated when older than 60 years (HR trend=1.23, 95% CI=0.96-1.57). Including BMI in all models produced similar estimates. Conclusions The reasons why PA in younger, but not older, people may prevent pancreatic cancer need to be investigated. PA may operate through mechanisms independent of BMI. If this association is causal, one in six cases might be prevented by encouraging more PA. PMID:26390426

  17. Investigation of island formation due to RMPs in DIII-D plasmas with the SIESTA resistive MHD equilibrium code

    SciTech Connect

    Hirshman, S. P.; Shafer, M. W.; Seal, S. K.; Canik, J. M.

    2016-03-03

    The SIESTA magnetohydrodynamic (MHD) equilibrium code has been used to compute a sequence of ideally stable equilibria resulting from numerical variation of the helical resonant magnetic perturbation (RMP) applied to an axisymmetric DIII-D plasma equilibrium. Increasing the perturbation strength at the dominant $m=2$, $n=-1$ resonant surface leads to lower MHD energies and increases in the equilibrium island widths at the $m=2$ (and sidebands) surfaces, in agreement with theoretical expectations. Island overlap at large perturbation strengths leads to stochastic magnetic fields which correlate well with the experimentally inferred field structure. The magnitude and spatial phase (around the dominant rational surfaces) of the resonant (shielding) component of the parallel current are shown to change qualitatively with the magnetic island topology.

  18. Are Independent Probes Truly Independent?

    ERIC Educational Resources Information Center

    Camp, Gino; Pecher, Diane; Schmidt, Henk G.; Zeelenberg, Rene

    2009-01-01

    The independent cue technique has been developed to test traditional interference theories against inhibition theories of forgetting. In the present study, the authors tested the critical criterion for the independence of independent cues: Studied cues not presented during test (and unrelated to test cues) should not contribute to the retrieval…

  19. Investigations to determine whether Section XI of the ASME (American Society of Mechanical Engineers) Boiler and Pressure Vessel Code should include PLEX (plant life extension) baseline inspection guidance

    SciTech Connect

    Bustard, L.D.

    1988-01-01

    A plant life extension (PLEX) issue repeatedly mentioned is whether special PLEX supplemental inspection requirements should be added to Section XI of the ASME Boiler and Pressure Vessel Code. To help the ASME answer this question, the DOE Technology Management Center performed an industry survey to assess whether there was a technical consensus regarding the desirability and scope of a supplemental PLEX baseline inspection. This survey demonstrated the lack of an initial industry consensus. In response to the survey results, ASME has formed a task group to investigate various PLEX supplemental inspection strategies and to assess their value and liabilities. The results of the survey and initial task group activities are reviewed.

  20. A gene-environment investigation on personality traits in two independent clinical sets of adult patients with personality disorder and attention deficit/hyperactive disorder.

    PubMed

    Jacob, Christian P; Nguyen, Thuy Trang; Dempfle, Astrid; Heine, Monika; Windemuth-Kieselbach, Christine; Baumann, Katarina; Jacob, Florian; Prechtl, Julian; Wittlich, Maike; Herrmann, Martin J; Gross-Lesch, Silke; Lesch, Klaus-Peter; Reif, Andreas

    2010-06-01

    While an interactive effect of genes with adverse life events is increasingly appreciated in current concepts of depression etiology, no data are presently available on interactions between genetic and environmental (G x E) factors with respect to personality and related disorders. The present study therefore aimed to detect main effects as well as interactions of serotonergic candidate genes (coding for the serotonin transporter, 5-HTT; the serotonin autoreceptor, HTR1A; and the enzyme which synthesizes serotonin in the brain, TPH2) with the burden of life events (#LE) in two independent samples consisting of 183 patients suffering from personality disorders and 123 patients suffering from adult attention deficit/hyperactivity disorder (aADHD). Simple analyses ignoring possible G x E interactions revealed no evidence for associations of either #LE or of the considered polymorphisms in 5-HTT and TPH2. Only the G allele of HTR1A rs6295 seemed to increase the risk of emotional-dramatic cluster B personality disorders (p = 0.019, in the personality disorder sample) and to decrease the risk of anxious-fearful cluster C personality disorders (p = 0.016, in the aADHD sample). We extended the initial simple model by taking a G x E interaction term into account, since this approach may better fit the data indicating that the effect of a gene is modified by stressful life events or, vice versa, that stressful life events only have an effect in the presence of a susceptibility genotype. By doing so, we observed nominal evidence for G x E effects as well as main effects of 5-HTT-LPR and the TPH2 SNP rs4570625 on the occurrence of personality disorders. Further replication studies, however, are necessary to validate the apparent complexity of G x E interactions in disorders of human personality.

  1. Investigating mitochondrial metabolism in contracting HL-1 cardiomyocytes following hypoxia and pharmacological HIF activation identifies HIF-dependent and independent mechanisms of regulation.

    PubMed

    Ambrose, Lucy J A; Abd-Jamil, Amira H; Gomes, Renata S M; Carter, Emma E; Carr, Carolyn A; Clarke, Kieran; Heather, Lisa C

    2014-11-01

    Hypoxia is a consequence of cardiac disease and downregulates mitochondrial metabolism, yet the molecular mechanisms through which this occurs in the heart are incompletely characterized. Therefore, we aimed to use a contracting HL-1 cardiomyocyte model to investigate the effects of hypoxia on mitochondrial metabolism. Cells were exposed to hypoxia (2% O2) for 6, 12, 24, and 48 hours to characterize the metabolic response. Cells were subsequently treated with the hypoxia inducible factor (HIF)-activating compound, dimethyloxalylglycine (DMOG), to determine whether hypoxia-induced mitochondrial changes were HIF dependent or independent, and to assess the suitability of this cultured cardiac cell line for cardiovascular pharmacological studies. Hypoxic cells had increased glycolysis after 24 hours, with glucose transporter 1 and lactate levels increased 5-fold and 15-fold, respectively. After 24 hours of hypoxia, mitochondrial networks were more fragmented but there was no change in citrate synthase activity, indicating that mitochondrial content was unchanged. Cellular oxygen consumption was 30% lower, accompanied by decreases in the enzymatic activities of electron transport chain (ETC) complexes I and IV, and aconitase by 81%, 96%, and 72%, relative to controls. Pharmacological HIF activation with DMOG decreased cellular oxygen consumption by 43%, coincident with decreases in the activities of aconitase and complex I by 26% and 30%, indicating that these adaptations were HIF mediated. In contrast, the hypoxia-mediated decrease in complex IV activity was not replicated by DMOG treatment, suggesting HIF-independent regulation of this complex. In conclusion, 24 hours of hypoxia increased anaerobic glycolysis and decreased mitochondrial respiration, which was associated with changes in ETC and tricarboxylic acid cycle enzyme activities in contracting HL-1 cells. Pharmacological HIF activation in this cardiac cell line allowed both HIF-dependent and independent

  2. Multigenerational Independent Colony for Extraterrestrial Habitation, Autonomy, and Behavior Health (MICEHAB): An Investigation of a Long Duration, Partial Gravity, Autonomous Rodent Colony

    NASA Technical Reports Server (NTRS)

    Rodgers, Erica M.; Simon, Matthew A.; Antol, Jeffrey; Chai, Patrick R.; Jones, Christopher A.; Klovstad, Jordan J.; Neilan, James H.; Stillwagen, Frederic H.; Williams, Phillip A.; Bednara, Michael; Guendel, Alex; Hernandez, Joel; Lewis, Weston; Lim, Jeremy; Wilson, Logan; Wusk, Grace

    2015-01-01

    The path from Earth to Mars requires exploration missions to be increasingly Earth-independent as the foundation is laid for a sustained human presence in the following decades. NASA's pioneering of Mars will expand the boundaries of human exploration, as a sustainable presence on the surface requires humans to successfully reproduce in a partial gravity environment independent of Earth intervention. Before significant investment is made in capabilities leading to such pioneering efforts, the challenges of multigenerational mammalian reproduction in a partial gravity environment need to be investigated. The Multigenerational Independent Colony for Extraterrestrial Habitation, Autonomy, and Behavior Health (MICEHAB) is designed to study these challenges. It is a conceptual, long-duration, autonomous habitat designed to house rodents in a partial gravity environment, with the goal of understanding the effects of partial gravity on mammalian reproduction over multiple generations and of learning how to design such a facility to operate autonomously while keeping the rodents healthy enough to produce multiple generations. All systems are designed to feed forward directly to full-scale human missions to Mars. This paper presents the baseline design concept formulated after considering challenges in the mission and vehicle architectures such as: vehicle automation, automated crew health management/medical care, unique automated waste disposal and hygiene, handling of deceased crew members, reliable long-duration crew support systems, and radiation protection. This concept was selected from an architectural trade space considering the balance between mission science return and robotic and autonomy capabilities. The baseline design is described in detail, including: transportation and facility operation constraints, artificial gravity system design, habitat design, and a full-scale mock-up demonstration of autonomous rodent care facilities. The proposed concept has

  3. Extension of the supercritical carbon dioxide Brayton cycle to low reactor power operation: investigations using the coupled ANL Plant Dynamics Code-SAS4A/SASSYS-1 liquid metal reactor code system.

    SciTech Connect

    Moisseytsev, A.; Sienicki, J. J.

    2012-05-10

    Significant progress has been made on the development of a control strategy for the supercritical carbon dioxide (S-CO{sub 2}) Brayton cycle enabling removal of power from an autonomous load following Sodium-Cooled Fast Reactor (SFR) down to decay heat levels such that the S-CO{sub 2} cycle can be used to cool the reactor until decay heat can be removed by the normal shutdown heat removal system or a passive decay heat removal system such as Direct Reactor Auxiliary Cooling System (DRACS) loops with DRACS in-vessel heat exchangers. This capability of the new control strategy eliminates the need for use of a separate shutdown heat removal system which might also use supercritical CO{sub 2}. It has been found that this capability can be achieved by introducing a new control mechanism involving shaft speed control for the common shaft joining the turbine and two compressors following reduction of the load demand from the electrical grid to zero. Following disconnection of the generator from the electrical grid, heat is removed from the intermediate sodium circuit through the sodium-to-CO{sub 2} heat exchanger, the turbine solely drives the two compressors, and heat is rejected from the cycle through the CO{sub 2}-to-water cooler. To investigate the effectiveness of shaft speed control, calculations are carried out using the coupled Plant Dynamics Code-SAS4A/SASSYS-1 code for a linear load reduction transient for a 1000 MWt metallic-fueled SFR with autonomous load following. No deliberate motion of control rods or adjustment of sodium pump speeds is assumed to take place. It is assumed that the S-CO{sub 2} turbomachinery shaft speed linearly decreases from 100 to 20% nominal following reduction of grid load to zero. The reactor power is calculated to autonomously decrease down to 3% nominal providing a lengthy window in time for the switchover to the normal shutdown heat removal system or for a passive decay heat removal system to become effective. However, the

  4. An investigation for population maintenance mechanism in a miniature garden: genetic connectivity or independence of small islet populations of the Ryukyu five-lined skink.

    PubMed

    Kurita, Kazuki; Hikida, Tsutomu; Toda, Mamoru

    2014-01-01

    The Ryukyu five-lined skink (Plestiodon marginatus) is an island lizard that is found even on tiny islets with less than half a hectare of habitat area. We hypothesized that the island populations are maintained either by frequent gene flow among the islands or independently of each other. To test these hypotheses, we investigated the genetic structure of 21 populations from 11 land-bridge islands, which were connected during the latest glacial age, and 4 isolated islands. Analyses using mitochondrial cytochrome b gene sequences (n = 67) and 10 microsatellite loci (n = 235) revealed moderate to high levels of genetic differentiation, the existence of many private alleles/haplotypes on most islands, little contemporary migration, a positive correlation between genetic variability and island area, and a negative correlation between relatedness and island area. This evidence suggests a strong effect of independent genetic drift as opposed to gene flow, favoring the isolation hypothesis even for tiny islet populations. An isolation-by-distance effect was demonstrated, and it became more prominent when the 4 isolated islands were excluded, suggesting that the pattern is a remnant of the land-bridge age. In a few island populations, however, the possibility of occasional overwater dispersal was partially supported and therefore could not be ruled out.

  5. Numerical investigations on pressurized AL-composite vessel response to hypervelocity impacts: Comparison between experimental works and a numerical code

    NASA Astrophysics Data System (ADS)

    Mespoulet, Jérôme; Plassard, Fabien; Hereil, Pierre-Louis

    2015-09-01

    The response of pressurized composite-Al vessels to hypervelocity impacts of aluminum spheres has been numerically investigated to evaluate the influence of initial pressure on the vulnerability of these vessels. The investigated tanks are carbon-fiber overwrapped, prestressed Al vessels. The explored internal air pressure ranges from 1 bar to 300 bar, and impact velocities are around 4400 m/s. Data obtained from experiments (X-ray radiographs, particle velocity measurements, and post-mortem vessels) have been compared with numerical results from LS-DYNA ALE-Lagrange-SPH full coupling models. The simulations underestimate the debris cloud evolution and shock wave propagation in pressurized air, but the main modes of damage/rupture of the vessels given by the simulations are consistent with the post-mortem vessels recovered from the experiments. The first results of this numerical work are promising, and further simulation investigations with additional experimental data will be carried out to increase the reliability of the simulation model. The final aim of this combined work is to numerically explore a wide range of impact conditions (impact angle, projectile weight, impact velocity, initial pressure) that cannot be explored experimentally. These results will provide rules of thumb for the definition of an analytical vulnerability model for a given pressurized vessel.

  6. System and method for investigating sub-surface features of a rock formation with acoustic sources generating coded signals

    SciTech Connect

    Vu, Cung Khac; Nihei, Kurt; Johnson, Paul A; Guyer, Robert; Ten Cate, James A; Le Bas, Pierre-Yves; Larmat, Carene S

    2014-12-30

    A system and a method for investigating rock formations include generating, by a first acoustic source, a first acoustic signal comprising a first plurality of pulses, each pulse including a first modulated signal at a central frequency; and generating, by a second acoustic source, a second acoustic signal comprising a second plurality of pulses. A receiver arranged within the borehole receives a detected signal, including a signal generated by a non-linear mixing process from the first and second acoustic signals in a non-linear mixing zone within the intersection volume. The method also includes processing the received signal to extract the signal generated by the non-linear mixing process over noise or over signals generated by a linear interaction process, or both.
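
    A quadratic non-linearity acting on two tones creates energy at the sum and difference of their frequencies, which is the basic effect that lets the non-linearly mixed signal be separated from linearly propagated energy. The following stdlib-only sketch (toy sinusoids and a plain DFT, invented for illustration, not the patented acquisition scheme) demonstrates this:

```python
import cmath
import math

def dft_mag(signal, k):
    """Magnitude of DFT bin k of a real-valued signal."""
    n = len(signal)
    return abs(sum(s * cmath.exp(-2j * math.pi * k * i / n)
                   for i, s in enumerate(signal)))

n, f1, f2 = 64, 10, 3
s1 = [math.sin(2 * math.pi * f1 * i / n) for i in range(n)]
s2 = [math.sin(2 * math.pi * f2 * i / n) for i in range(n)]
mixed = [a * b for a, b in zip(s1, s2)]  # quadratic (product) non-linearity

# The mixed signal carries energy at f1 - f2 and f1 + f2, not at f1 itself.
print(dft_mag(mixed, f1 - f2) > 10 * dft_mag(mixed, f1))  # True
print(dft_mag(mixed, f1 + f2) > 10 * dft_mag(mixed, f1))  # True
```

    Because the product of two sinusoids expands to cosines at the difference and sum frequencies, the spectral peaks at f1 - f2 and f1 + f2 identify energy that could only have come from non-linear interaction.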

  7. Salam's independence

    NASA Astrophysics Data System (ADS)

    Fraser, Gordon

    2009-01-01

    In his kind review of my biography of the Nobel laureate Abdus Salam (December 2008 pp45-46), John W Moffat wrongly claims that Salam had "independently thought of the idea of parity violation in weak interactions".

  8. Investigating the role of rare coding variability in Mendelian dementia genes (APP, PSEN1, PSEN2, GRN, MAPT, and PRNP) in late-onset Alzheimer's disease

    PubMed Central

    Sassi, Celeste; Guerreiro, Rita; Gibbs, Raphael; Ding, Jinhui; Lupton, Michelle K.; Troakes, Claire; Al-Sarraj, Safa; Niblock, Michael; Gallo, Jean-Marc; Adnan, Jihad; Killick, Richard; Brown, Kristelle S.; Medway, Christopher; Lord, Jenny; Turton, James; Bras, Jose; Morgan, Kevin; Powell, John F.; Singleton, Andrew; Hardy, John

    2014-01-01

    The overlapping clinical and neuropathologic features between late-onset apparently sporadic Alzheimer's disease (LOAD), familial Alzheimer's disease (FAD), and other neurodegenerative dementias (frontotemporal dementia, corticobasal degeneration, progressive supranuclear palsy, and Creutzfeldt-Jakob disease) raise the question of whether shared genetic risk factors may explain the similar phenotypes among these disparate disorders. To investigate this hypothesis, we analyzed rare coding variability in 6 Mendelian dementia genes (APP, PSEN1, PSEN2, GRN, MAPT, and PRNP) in 141 LOAD patients and 179 elderly controls from the UK, all neuropathologically confirmed. In our cohort, 14 LOAD cases (10%) and 11 controls (6%) carry at least 1 rare variant in the genes studied. We report a novel variant in PSEN1 (p.I168T) and a rare variant in PSEN2 (p.A237V), both absent in controls and both likely pathogenic. Our findings support previous studies suggesting that (1) rare coding variability in PSEN1 and PSEN2 may influence susceptibility for LOAD and (2) GRN, MAPT, and PRNP are not major contributors to LOAD. Thus, genetic screening is pivotal for the clinical differential diagnosis of these neurodegenerative dementias. PMID:25104557

  9. Ethical coding.

    PubMed

    Resnik, Barry I

    2009-01-01

    It is ethical, legal, and proper for a dermatologist to maximize income through proper coding of patient encounters and procedures. The overzealous physician can misinterpret reimbursement requirements or receive bad advice from other physicians and cross the line from aggressive coding to coding fraud. Several of the more common problem areas are discussed.

  10. Investigation of Homologous Crossing over and Sister Chromatid Exchange in the Wheat Nor-B2 Locus Coding for Rrna and Gli-B2 Locus Coding for Gliadins

    PubMed Central

    Dvořák, J.; Appels, R.

    1986-01-01

    Recombination was investigated within the Nor-B2 locus of wheat chromosome 6B, which contains several thousand copies of the 18S-5.8S-26S rRNA (rDNA) repeat unit. Additionally, recombination was assessed for several chromosome regions: in arm 6Bq between the centromere and the B2 locus (awn suppressor), and in arm 6Bp between the centromere and Nor-B2, between Nor-B2 and a distal C-band, and between Nor-B2 and Gli-B2 coding for gliadins. The experimental design permitted the distinction between crossing over between homologous chromosomes and exchange between sister chromatids. No homologous crossing over within the Nor-B2 locus was found in a sample of 446 chromosomes, but one exchange with the attributes of unequal sister chromatid exchange was identified. The molecular characteristics of this presumed sister chromatid exchange indicate that the spacer variants present in the Nor-B2 locus are clustered. No homologous recombination was detected within the distal Gli-B2 locus containing repeated genes coding for gliadin seed-storage proteins. Both arms of chromosome 6B showed low crossing-over frequency in the proximal regions. The distance from the centromere to Nor-B2 was only 0.3 to 2.2 cM, although it accounts for about two-thirds of the metaphase chromosome arm, which shows a great distortion of the metaphase map of the arm. The level of homologous recombination within the Nor-B2 locus is lower than in the chromosome region immediately distal to it; whether it is comparable to that in the chromosome region proximal to it could not be determined. Recombination frequencies of different pairs of chromosome 6B in all but one interval paralleled the frequencies of their metaphase I pairing: lower pairing at metaphase I was paralleled by lower crossing-over frequency. This relationship indicates that reduced metaphase I pairing between 6B chromosomes from different populations is due to impaired crossing over and not to precocious chiasma terminalization. PMID

  11. Independence of Internal Auditors.

    ERIC Educational Resources Information Center

    Montondon, Lucille; Meixner, Wilda F.

    1993-01-01

    A survey of 288 college and university auditors investigated patterns in their appointment, reporting, and supervisory practices as indicators of independence and objectivity. Results indicate a weakness in the positioning of internal auditing within institutions, possibly compromising auditor independence. Because the auditing function is…

  12. Synthesizing Certified Code

    NASA Technical Reports Server (NTRS)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrating software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach that simultaneously generates, from a high-level specification, both the code and all annotations required to certify it. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  13. More box codes

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1992-01-01

    A new investigation shows that, starting from the BCH (21,15;3) code represented as a 7 x 3 matrix and adding a row and a column of even parity, one obtains an 8 x 4 matrix (32,15;8) code. An additional dimension is obtained by specifying odd parity on the rows and even parity on the columns, i.e., by adjoining to the 8 x 4 matrix a matrix that is zero except for the fourth column, which is all ones. Furthermore, any seven rows and three columns form the BCH (21,15;3) code. This box code has the same weight structure as the quadratic residue and BCH codes of the same dimensions. Whether there exists an algebraic isomorphism to either code is as yet unknown.
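
    The row-and-column parity extension described above can be sketched concretely. The snippet below illustrates only that construction step, with an arbitrary 7 x 3 bit pattern standing in for a BCH (21,15;3) codeword:

```python
def extend_with_even_parity(matrix):
    """Append an even-parity bit to each row, then an even-parity row,
    so every row and every column of the result has even parity."""
    extended = [row + [sum(row) % 2] for row in matrix]
    parity_row = [sum(col) % 2 for col in zip(*extended)]
    return extended + [parity_row]

# A 7 x 3 bit pattern standing in for a BCH (21,15;3) codeword.
codeword = [
    [1, 0, 1],
    [0, 1, 1],
    [1, 1, 0],
    [0, 0, 0],
    [1, 0, 1],
    [0, 1, 0],
    [1, 1, 1],
]

ext = extend_with_even_parity(codeword)           # 8 x 4 matrix
assert all(sum(row) % 2 == 0 for row in ext)      # even row parity
assert all(sum(col) % 2 == 0 for col in zip(*ext))  # even column parity
```

    Note that the corner parity bit is consistent for both its row and its column: since every extended row has even sum, the column parities themselves sum to zero mod 2.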

  14. Independent Living.

    ERIC Educational Resources Information Center

    Nathanson, Jeanne H., Ed.

    1994-01-01

    This issue of "OSERS" addresses the subject of independent living of individuals with disabilities. The issue includes a message from Judith E. Heumann, the Assistant Secretary of the Office of Special Education and Rehabilitative Services (OSERS), and 10 papers. Papers have the following titles and authors: "Changes in the…

  15. Sharing code.

    PubMed

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing.

  16. An extended version of the SERPENT-2 code to investigate fuel burn-up and core material evolution of the Molten Salt Fast Reactor

    NASA Astrophysics Data System (ADS)

    Aufiero, M.; Cammi, A.; Fiorina, C.; Leppänen, J.; Luzzi, L.; Ricotti, M. E.

    2013-10-01

    In this work, the Monte Carlo burn-up code SERPENT-2 has been extended and employed to study the material isotopic evolution of the Molten Salt Fast Reactor (MSFR). This promising GEN-IV nuclear reactor concept features peculiar characteristics such as the on-line fuel reprocessing, which prevents the use of commonly available burn-up codes. Besides, the presence of circulating nuclear fuel and radioactive streams from the core to the reprocessing plant requires a precise knowledge of the fuel isotopic composition during the plant operation. The developed extension of SERPENT-2 directly takes into account the effects of on-line fuel reprocessing on burn-up calculations and features a reactivity control algorithm. It is here assessed against a dedicated version of the deterministic ERANOS-based EQL3D procedure (PSI-Switzerland) and adopted to analyze the MSFR fuel salt isotopic evolution. Particular attention is devoted to study the effects of reprocessing time constants and efficiencies on the conversion ratio and the molar concentration of elements relevant for solubility issues (e.g., trivalent actinides and lanthanides). Quantities of interest for fuel handling and safety issues are investigated, including decay heat and activities of hazardous isotopes (neutron and high energy gamma emitters) in the core and in the reprocessing stream. The radiotoxicity generation is also analyzed for the MSFR nominal conditions. The production of helium and the depletion in tungsten content due to nuclear reactions are calculated for the nickel-based alloy selected as reactor structural material of the MSFR. These preliminary evaluations can be helpful in studying the radiation damage of both the primary salt container and the axial reflectors.

  17. Understanding independence

    NASA Astrophysics Data System (ADS)

    Annan, James; Hargreaves, Julia

    2016-04-01

    In order to perform any Bayesian processing of a model ensemble, we need a prior over the ensemble members. In the case of multimodel ensembles such as CMIP, the historical approach of "model democracy" (i.e. equal weight for all models in the sample) is no longer credible (if it ever was) due to model duplication and inbreeding. The question of "model independence" is central to the question of prior weights. However, although this question has been repeatedly raised, it has not yet been satisfactorily addressed. Here I will discuss the issue of independence and present a theoretical foundation for understanding and analysing the ensemble in this context. I will also present some simple examples showing how these ideas may be applied and developed.

  18. Phonological coding during reading

    PubMed Central

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound-based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or late (post-lexical)) are discussed. Next, the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eyetracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Orden, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  19. Investigation of plant control strategies for the supercritical CO{sub 2} Brayton cycle for a sodium-cooled fast reactor using the plant dynamics code.

    SciTech Connect

    Moisseytsev, A.; Sienicki, J.

    2011-04-12

    The development of a control strategy for the supercritical CO{sub 2} (S-CO{sub 2}) Brayton cycle has been extended to the investigation of alternate control strategies for a Sodium-Cooled Fast Reactor (SFR) nuclear power plant incorporating a S-CO{sub 2} Brayton cycle power converter. The SFR assumed is the 400 MWe (1000 MWt) ABR-1000 preconceptual design incorporating metallic fuel. Three alternative idealized schemes for controlling the reactor side of the plant in combination with the existing automatic control strategy for the S-CO{sub 2} Brayton cycle are explored using the ANL Plant Dynamics Code and the SAS4A/SASSYS-1 Liquid Metal Reactor (LMR) Analysis Code System, coupled together using the iterative coupling formulation previously developed and implemented into the Plant Dynamics Code. The first option assumes that the reactor side can be ideally controlled through movement of control rods and changing the speeds of both the primary and intermediate coolant system sodium pumps such that the intermediate sodium flow rate and inlet temperature to the sodium-to-CO{sub 2} heat exchanger (RHX) remain unvarying, while the intermediate sodium outlet temperature changes as the load demand from the electric grid changes and the S-CO{sub 2} cycle conditions adjust according to the S-CO{sub 2} cycle control strategy. For this option, the reactor plant follows an assumed change in load demand from 100 to 0% nominal at a 5% reduction per minute in a suitable fashion. The second option allows the reactor core power and primary and intermediate coolant system sodium pump flow rates to change autonomously in response to the strong reactivity feedbacks of the metallic-fueled core and assumed constant pump torques representing unchanging output from the pump electric motors. The plant behavior under the assumed load demand reduction is surprisingly close to that calculated for the first option. The only negative result observed is a slight increase in the intermediate

  20. Accuracy of clinical coding for procedures in oral and maxillofacial surgery.

    PubMed

    Khurram, S A; Warner, C; Henry, A M; Kumar, A; Mohammed-Ali, R I

    2016-10-01

    Clinical coding has important financial implications, and discrepancies in the assigned codes can directly affect the funding of a department and hospital. Over the last few years, numerous oversights have been noticed in the coding of oral and maxillofacial (OMF) procedures. To establish the accuracy and completeness of coding, we retrospectively analysed the records of patients during two time periods: March to May 2009 (324 patients), and January to March 2014 (200 patients). Two investigators independently collected and analysed the data to ensure accuracy and remove bias. A large proportion of operations were not assigned all the relevant codes, and only 32% - 33% were correct in both cycles. To our knowledge, this is the first reported audit of clinical coding in OMFS, and it highlights serious shortcomings that have substantial financial implications. Better input by the surgical team and improved communication between the surgical and coding departments will improve accuracy.

  1. An investigative study of multispectral data compression for remotely-sensed images using vector quantization and difference-mapped shift-coding

    NASA Technical Reports Server (NTRS)

    Jaggi, S.

    1993-01-01

    A study is conducted to investigate the effects and advantages of data compression techniques on multispectral imagery data acquired by NASA's airborne scanners at the Stennis Space Center. The first technique used was vector quantization. The vector is defined in the multispectral imagery context as an array of pixels from the same location in each channel. The error obtained in substituting the reconstructed images for the original set is compared for different compression ratios. Also, the eigenvalues of the covariance matrix obtained from the reconstructed data set are compared with the eigenvalues of the original set. The effects of varying the size of the vector codebook on the quality of the compression and on subsequent classification are also presented. The output data from the vector quantization algorithm were further compressed by a lossless technique called difference-mapped shift-extended Huffman coding. The overall compression for 7 channels of data acquired by the Calibrated Airborne Multispectral Scanner (CAMS) was 195:1 (0.41 bpp) with an RMS error of 15.8 pixels, and 18:1 (0.447 bpp) with an RMS error of 3.6 pixels. The algorithms were implemented in software and interfaced, with the help of dedicated image processing boards, to an 80386 PC-compatible computer. Modules were developed for the task of image compression and image analysis. Also, supporting software to perform image processing for visual display and interpretation of the compressed/classified images was developed.
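
    The vector-quantization step described above treats the pixel values at one location across all channels as a single vector and replaces it with the index of its nearest codebook entry. A minimal sketch with toy values (the codebook and pixels here are invented for illustration, not drawn from the CAMS data):

```python
def nearest_codeword(vector, codebook):
    """Index of the codebook vector closest in squared Euclidean distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(codebook)), key=lambda i: dist2(vector, codebook[i]))

def vq_encode(pixels, codebook):
    """Map each multichannel pixel to a codebook index (lossy compression)."""
    return [nearest_codeword(p, codebook) for p in pixels]

def vq_decode(indices, codebook):
    """Reconstruct pixels by codebook lookup."""
    return [codebook[i] for i in indices]

codebook = [(0, 0, 0), (100, 110, 90), (200, 210, 190)]  # 3-channel toy codebook
pixels = [(98, 105, 95), (5, 0, 2), (190, 220, 185)]

indices = vq_encode(pixels, codebook)
reconstructed = vq_decode(indices, codebook)
print(indices)  # [1, 0, 2]
```

    Each pixel is now stored as a small index rather than one value per channel; the index stream is what a lossless entropy coder (such as the Huffman variant described above) would then compress further.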

  2. Sharing code

    PubMed Central

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing. PMID:25165519

  3. Suboptimum decoding of block codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao

    1991-01-01

    This paper investigates a class of decomposable codes and their distance and structural properties. It is shown that this class includes several classes of well-known and efficient codes as subclasses. Several methods for constructing decomposable codes or decomposing codes are presented. A two-stage soft-decision decoding scheme for decomposable codes, their translates, or unions of translates is devised. This two-stage soft-decision decoding is suboptimum and provides an excellent trade-off between error performance and decoding complexity for codes of moderate and long block lengths.
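
    As background, soft-decision decoding in general selects the codeword that correlates best with the received real-valued samples instead of hard-thresholding them first. A minimal single-stage sketch (a toy two-word code, invented for illustration; not the paper's two-stage scheme for decomposable codes):

```python
def soft_decode(received, codewords):
    """Return the codeword maximizing correlation with the received soft
    values, where bit b is transmitted as (-1)**b (0 -> +1, 1 -> -1)."""
    def correlation(cw):
        return sum(r * (1 - 2 * b) for r, b in zip(received, cw))
    return max(codewords, key=correlation)

# Toy length-4 code with two codewords.
codewords = [(0, 0, 0, 0), (1, 1, 1, 1)]
# Noisy received samples for transmitted (1, 1, 1, 1) -> (-1, -1, -1, -1):
received = [-0.9, -1.2, 0.3, -0.7]
print(soft_decode(received, codewords))  # (1, 1, 1, 1)
```

    Note that hard-thresholding the third sample would have produced a bit error; keeping the soft values lets the strongly negative samples outvote it.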

  4. An Investigation into Reliability of Knee Extension Muscle Strength Measurements, and into the Relationship between Muscle Strength and Means of Independent Mobility in the Ward: Examinations of Patients Who Underwent Femoral Neck Fracture Surgery

    PubMed Central

    Katoh, Munenori; Kaneko, Yoshihiro

    2014-01-01

    [Purpose] The purpose of the present study was to investigate the reliability of isometric knee extension muscle strength measurement of patients who underwent femoral neck fracture surgery, as well as the relationship between independent mobility in the ward and knee muscle strength. [Subjects] The subjects were 75 patients who underwent femoral neck fracture surgery. [Methods] We used a hand-held dynamometer and a belt to measure isometric knee extension muscle strength three times, and used intraclass correlation coefficients (ICCs) to investigate the reliability of the measurements. We used a receiver operating characteristic curve to investigate the cutoff values for independent walking with walking sticks and non-independent mobility. [Results] ICCs (1, 1) were 0.9 or higher. The cutoff value for independent walking with walking sticks was 0.289 kgf/kg on the non-fractured side, 0.193 kgf/kg on the fractured side, and the average of both limbs was 0.238 kgf/kg. [Conclusion] We consider that the test-retest reliability of isometric knee extension muscle strength measurement of patients who have undergone femoral neck fracture surgery is high. We also consider that isometric knee extension muscle strength is useful for investigating means of independent mobility in the ward. PMID:24567667
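
    A receiver-operating-characteristic cutoff of the kind reported above is commonly chosen by maximizing Youden's J (sensitivity + specificity - 1) over candidate thresholds. The sketch below uses toy strengths and labels invented for illustration, not the study's measurements:

```python
def youden_cutoff(strengths, independent):
    """Return the candidate cutoff maximizing sensitivity + specificity - 1,
    treating strength >= cutoff as predicting independent walking."""
    best_cut, best_j = None, -1.0
    for cut in sorted(set(strengths)):
        tp = sum(s >= cut and ind for s, ind in zip(strengths, independent))
        fn = sum(s < cut and ind for s, ind in zip(strengths, independent))
        tn = sum(s < cut and not ind for s, ind in zip(strengths, independent))
        fp = sum(s >= cut and not ind for s, ind in zip(strengths, independent))
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut

# Toy knee-extension strengths (kgf/kg) and independence labels.
strengths = [0.15, 0.18, 0.21, 0.24, 0.29, 0.31, 0.35]
independent = [False, False, False, True, True, True, True]
print(youden_cutoff(strengths, independent))  # 0.24
```

    With perfectly separated toy data the cutoff lands exactly between the two groups; on real measurements the maximum of J trades sensitivity against specificity.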

  5. 'Independence' Panorama

    NASA Technical Reports Server (NTRS)

    2005-01-01

    [figure removed for brevity, see original site] Click on the image for 'Independence' Panorama (QTVR)

    This is the Spirit 'Independence' panorama, acquired on martian days, or sols, 536 to 543 (July 6 to 13, 2005), from a position in the 'Columbia Hills' near the summit of 'Husband Hill.' The summit of 'Husband Hill' is the peak near the right side of this panorama and is about 100 meters (328 feet) away from the rover and about 30 meters (98 feet) higher in elevation. The rocky outcrops downhill and on the left side of this mosaic include 'Larry's Lookout' and 'Cumberland Ridge,' which Spirit explored in April, May, and June of 2005.

    The panorama spans 360 degrees and consists of 108 individual images, each acquired with five filters of the rover's panoramic camera. The approximate true color of the mosaic was generated using the camera's 750-, 530-, and 480-nanometer filters. During the 8 martian days, or sols, that it took to acquire this image, the lighting varied considerably, partly because of imaging at different times of sol, and partly because of small sol-to-sol variations in the dustiness of the atmosphere. These slight changes produced some image seams and rock shadows. These seams have been eliminated from the sky portion of the mosaic to better simulate the vista a person standing on Mars would see. However, it is often not possible or practical to smooth out such seams for regions of rock, soil, rover tracks or solar panels. Such is the nature of acquiring and assembling large panoramas from the rovers.

  6. Developing Research Skills: Independent Research Projects on Animals and Plants for Building the Research Skills of Report Writing, Mind Mapping, and Investigating through Inquiries. Revised Edition.

    ERIC Educational Resources Information Center

    Banks, Janet Caudill

    This book presents a collection of motivating, independent activities that involve animals and plants for use in developing the research skills of students in grades 2-6. Projects included in the book cover various levels of difficulty and are designed to promote higher-level thinking skills. Research components included in the activities in the…

  7. Investigation of the fluid-structure interaction of a high head Francis turbine using OpenFOAM and Code_Aster

    NASA Astrophysics Data System (ADS)

    Eichhorn, M.; Doujak, E.; Waldner, L.

    2016-11-01

    The increasing energy consumption and highly stressed power grids influence the operating conditions of turbines and pump turbines in the present situation. To provide or use energy as quickly as possible, hydraulic turbines are operated more frequently, and over longer periods of time, in lower part load at off-design conditions. This leads to more turbulent behavior and to higher demands on the strength of stressed components (e.g. runner, guide or stay vanes). Modern computational capabilities allow a precise numerical prediction of the occurring flow conditions and of the strains they induce in hydraulic machines. This paper focuses on the calculation of the unsteady pressure field of a high head Francis turbine with a specific speed of nq ≈ 24 min-1 and its impact on the structure at different operating conditions. In the first step, unsteady numerical flow simulations are performed with the open-source CFD software OpenFOAM. To capture the occurring dynamic flow phenomena, the entire machine, consisting of the spiral casing, the stay vanes, the wicket gate, the runner and the draft tube, is taken into account. Additionally, a reduced model without the spiral casing and with a simplified inlet boundary is used. To evaluate the accuracy of the CFD simulations, operating parameters such as head and torque are compared with the results of site measurements carried out on the corresponding prototype machine. In the second part, the obtained pressure fields are used for a fluid-structure analysis with the open-source Finite Element software Code_Aster, to predict the static loads on the runner.

  8. Contemporary accuracy of death certificates for coding prostate cancer as a cause of death: Is reliance on death certification good enough? A comparison with blinded review by an independent cause of death evaluation committee

    PubMed Central

    Turner, Emma L; Metcalfe, Chris; Donovan, Jenny L; Noble, Sian; Sterne, Jonathan A C; Lane, J Athene; I Walsh, Eleanor; Hill, Elizabeth M; Down, Liz; Ben-Shlomo, Yoav; Oliver, Steven E; Evans, Simon; Brindle, Peter; Williams, Naomi J; Hughes, Laura J; Davies, Charlotte F; Ng, Siaw Yein; Neal, David E; Hamdy, Freddie C; Albertsen, Peter; Reid, Colette M; Oxley, Jon; McFarlane, John; Robinson, Mary C; Adolfsson, Jan; Zietman, Anthony; Baum, Michael; Koupparis, Anthony; Martin, Richard M

    2016-01-01

    Background: Accurate cause of death assignment is crucial for prostate cancer epidemiology and trials reporting prostate cancer-specific mortality outcomes. Methods: We compared death certificate information with independent cause of death evaluation by an expert committee within a prostate cancer trial (2002–2015). Results: Of 1236 deaths assessed, expert committee evaluation attributed 523 (42%) to prostate cancer, agreeing with death certificate cause of death in 1134 cases (92%, 95% CI: 90%, 93%). The sensitivity of death certificates in identifying prostate cancer deaths as classified by the committee was 91% (95% CI: 89%, 94%); specificity was 92% (95% CI: 90%, 94%). Sensitivity and specificity were lower where death occurred within 1 year of diagnosis, and where there was another primary cancer diagnosis. Conclusions: UK death certificates accurately identify cause of death in men with prostate cancer, supporting their use in routine statistics. Possible differential misattribution by trial arm supports independent evaluation in randomised trials. PMID:27253172
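
    The reported sensitivity, specificity, and agreement figures can be reconstructed from a 2 × 2 table. In the sketch below the split of the 1134 concordant cases into true positives and true negatives is inferred to be consistent with the published summary figures, and the confidence interval uses the normal approximation (one standard choice); treat it as an illustration, not the paper's exact tabulation.

```python
import math

# Illustrative 2x2 comparison: death-certificate cause of death versus an
# expert committee. Counts are reconstructed to match the reported summary
# (1236 deaths, 523 committee-classified prostate cancer deaths, 1134 in
# agreement), not taken verbatim from the paper.

def prop_ci(k, n, z=1.96):
    """Proportion k/n with a normal-approximation 95% confidence interval."""
    p = k / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

tp, fn = 475, 48   # committee: prostate cancer death; certificate agrees / disagrees
tn, fp = 659, 54   # committee: other cause; certificate agrees / disagrees

sens, s_lo, s_hi = prop_ci(tp, tp + fn)
spec, p_lo, p_hi = prop_ci(tn, tn + fp)
agree, a_lo, a_hi = prop_ci(tp + tn, tp + fn + tn + fp)
print(f"sensitivity {sens:.2f} ({s_lo:.2f}-{s_hi:.2f})")
print(f"specificity {spec:.2f} ({p_lo:.2f}-{p_hi:.2f})")
print(f"agreement   {agree:.2f} ({a_lo:.2f}-{a_hi:.2f})")
```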

  9. How do we code the letters of a word when we have to write it? Investigating double letter representation in French.

    PubMed

    Kandel, Sonia; Peereman, Ronald; Ghimenton, Anna

    2014-05-01

    How do we code the letters of a word when we have to write it? We examined whether the orthographic representations that the writing system activates have a specific coding for letters when these are doubled in a word. French participants wrote words on a digitizer. The word pairs shared the initial letters and differed on the presence of a double letter (e.g., LISSER/LISTER). The results on latencies, letter and inter-letter interval durations revealed that L and I are slower to write when followed by a doublet (SS) than when not (ST). Doublet processing constitutes a supplementary cognitive load that delays word production. This suggests that word representations code letter identity and quantity separately. The data also revealed that the central processes that are involved in spelling representation cascade into the peripheral processes that regulate movement execution.

  10. QR Codes

    ERIC Educational Resources Information Center

    Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien

    2013-01-01

    This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…

  11. Functional Investigation of a Non-coding Variant Associated with Adolescent Idiopathic Scoliosis in Zebrafish: Elevated Expression of the Ladybird Homeobox Gene Causes Body Axis Deformation.

    PubMed

    Guo, Long; Yamashita, Hiroshi; Kou, Ikuyo; Takimoto, Aki; Meguro-Horike, Makiko; Horike, Shin-ichi; Sakuma, Tetsushi; Miura, Shigenori; Adachi, Taiji; Yamamoto, Takashi; Ikegawa, Shiro; Hiraki, Yuji; Shukunami, Chisa

    2016-01-01

    Previously, we identified an adolescent idiopathic scoliosis susceptibility locus near human ladybird homeobox 1 (LBX1) and FLJ41350 by a genome-wide association study. Here, we characterized the associated non-coding variant and investigated the function of these genes. A chromosome conformation capture assay revealed that the genome region with the most significantly associated single nucleotide polymorphism (rs11190870) physically interacted with the promoter region of LBX1-FLJ41350. The promoter in the direction of LBX1, combined with a 590-bp region including rs11190870, had higher transcriptional activity with the risk allele than that with the non-risk allele in HEK 293T cells. The ubiquitous overexpression of human LBX1 or either of the zebrafish lbx genes (lbx1a, lbx1b, and lbx2), but not FLJ41350, in zebrafish embryos caused body curvature followed by death prior to vertebral column formation. Such body axis deformation was not observed in transcription activator-like effector nucleases mediated knockout zebrafish of lbx1b or lbx2. Mosaic expression of lbx1b driven by the GATA2 minimal promoter and the lbx1b enhancer in zebrafish significantly alleviated the embryonic lethal phenotype to allow observation of the later onset of the spinal curvature with or without vertebral malformation. Deformation of the embryonic body axis by lbx1b overexpression was associated with defects in convergent extension, which is a component of the main axis-elongation machinery in gastrulating embryos. In embryos overexpressing lbx1b, wnt5b, a ligand of the non-canonical Wnt/planar cell polarity (PCP) pathway, was significantly downregulated. Injection of mRNA for wnt5b or RhoA, a key downstream effector of Wnt/PCP signaling, rescued the defective convergent extension phenotype and attenuated the lbx1b-induced curvature of the body axis. Thus, our study presents a novel pathological feature of LBX1 and its zebrafish homologs in body axis deformation at various stages of

  12. Functional Investigation of a Non-coding Variant Associated with Adolescent Idiopathic Scoliosis in Zebrafish: Elevated Expression of the Ladybird Homeobox Gene Causes Body Axis Deformation

    PubMed Central

    Guo, Long; Yamashita, Hiroshi; Kou, Ikuyo; Takimoto, Aki; Meguro-Horike, Makiko; Horike, Shin-ichi; Sakuma, Tetsushi; Miura, Shigenori; Adachi, Taiji; Yamamoto, Takashi; Ikegawa, Shiro; Hiraki, Yuji; Shukunami, Chisa

    2016-01-01

    Previously, we identified an adolescent idiopathic scoliosis susceptibility locus near human ladybird homeobox 1 (LBX1) and FLJ41350 by a genome-wide association study. Here, we characterized the associated non-coding variant and investigated the function of these genes. A chromosome conformation capture assay revealed that the genome region with the most significantly associated single nucleotide polymorphism (rs11190870) physically interacted with the promoter region of LBX1-FLJ41350. The promoter in the direction of LBX1, combined with a 590-bp region including rs11190870, had higher transcriptional activity with the risk allele than that with the non-risk allele in HEK 293T cells. The ubiquitous overexpression of human LBX1 or either of the zebrafish lbx genes (lbx1a, lbx1b, and lbx2), but not FLJ41350, in zebrafish embryos caused body curvature followed by death prior to vertebral column formation. Such body axis deformation was not observed in transcription activator-like effector nucleases mediated knockout zebrafish of lbx1b or lbx2. Mosaic expression of lbx1b driven by the GATA2 minimal promoter and the lbx1b enhancer in zebrafish significantly alleviated the embryonic lethal phenotype to allow observation of the later onset of the spinal curvature with or without vertebral malformation. Deformation of the embryonic body axis by lbx1b overexpression was associated with defects in convergent extension, which is a component of the main axis-elongation machinery in gastrulating embryos. In embryos overexpressing lbx1b, wnt5b, a ligand of the non-canonical Wnt/planar cell polarity (PCP) pathway, was significantly downregulated. Injection of mRNA for wnt5b or RhoA, a key downstream effector of Wnt/PCP signaling, rescued the defective convergent extension phenotype and attenuated the lbx1b-induced curvature of the body axis. Thus, our study presents a novel pathological feature of LBX1 and its zebrafish homologs in body axis deformation at various stages of

  13. Coordinated design of coding and modulation systems

    NASA Technical Reports Server (NTRS)

    Massey, J. L.; Ancheta, T.; Johannesson, R.; Lauer, G.; Lee, L.

    1976-01-01

    The joint optimization of the coding and modulation systems employed in telemetry systems was investigated. Emphasis was placed on formulating inner and outer coding standards used by the Goddard Spaceflight Center. Convolutional codes were found that are nearly optimum for use with Viterbi decoding in the inner coding of concatenated coding systems. A convolutional code, the unit-memory code, was discovered and is ideal for inner system usage because of its byte-oriented structure. Simulations of sequential decoding on the deep-space channel were carried out to compare directly various convolutional codes that are proposed for use in deep-space systems.

  14. 32 CFR 635.19 - Offense codes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 4 2010-07-01 2010-07-01 true Offense codes. 635.19 Section 635.19 National... INVESTIGATIONS LAW ENFORCEMENT REPORTING Offense Reporting § 635.19 Offense codes. (a) The offense code describes, as nearly as possible, the complaint or offense by using an alphanumeric code. Appendix C of AR...

  15. Material-dependent and material-independent selection processes in the frontal and parietal lobes: an event-related fMRI investigation of response competition

    NASA Technical Reports Server (NTRS)

    Hazeltine, Eliot; Bunge, Silvia A.; Scanlon, Michael D.; Gabrieli, John D E.

    2003-01-01

    The present study used the flanker task [Percept. Psychophys. 16 (1974) 143] to identify neural structures that support response selection processes, and to determine which of these structures respond differently depending on the type of stimulus material associated with the response. Participants performed two versions of the flanker task while undergoing event-related functional magnetic resonance imaging (fMRI). Both versions of the task required participants to respond to a central stimulus regardless of the responses associated with simultaneously presented flanking stimuli, but one used colored circle stimuli and the other used letter stimuli. Competition-related activation was identified by comparing Incongruent trials, in which the flanker stimuli indicated a different response than the central stimulus, to Neutral stimuli, in which the flanker stimuli indicated no response. A region within the right inferior frontal gyrus exhibited significantly more competition-related activation for the color stimuli, whereas regions within the middle frontal gyri of both hemispheres exhibited more competition-related activation for the letter stimuli. The border of the right middle frontal and inferior frontal gyri and the anterior cingulate cortex (ACC) were significantly activated by competition for both types of stimulus materials. Posterior foci demonstrated a similar pattern: left inferior parietal cortex showed greater competition-related activation for the letters, whereas right parietal cortex was significantly activated by competition for both materials. These findings indicate that the resolution of response competition invokes both material-dependent and material-independent processes.

  16. Is ADHD a Risk Factor Independent of Conduct Disorder for Illicit Substance Use? A Meta-Analysis and Meta-Regression Investigation

    ERIC Educational Resources Information Center

    Serra-Pinheiro, Maria Antonia; Coutinho, Evandro S. F.; Souza, Isabella S.; Pinna, Camilla; Fortes, Didia; Araujo, Catia; Szobot, Claudia M.; Rohde, Luis A.; Mattos, Paulo

    2013-01-01

    Objective: To investigate meta-analytically if the association between ADHD and illicit substance use (ISU) is maintained when controlling for conduct disorder/oppositional-defiant disorder (CD/ODD). Method: A systematic literature review was conducted through Medline from 1980 to 2008. Data extracted and selections made by one author were…

  17. A Dynamic Population Model to Investigate Effects of Climate and Climate-Independent Factors on the Lifecycle of Amblyomma americanum (Acari: Ixodidae).

    PubMed

    Ludwig, Antoinette; Ginsberg, Howard S; Hickling, Graham J; Ogden, Nicholas H

    2016-01-01

    The lone star tick, Amblyomma americanum, is a disease vector of significance for human and animal health throughout much of the eastern United States. To model the potential effects of climate change on this tick, a better understanding is needed of the relative roles of temperature-dependent and temperature-independent (day-length-dependent behavioral or morphogenetic diapause) processes acting on the tick lifecycle. In this study, we explored the roles of these processes by simulating seasonal activity patterns using models with site-specific temperature and day-length-dependent processes. We first modeled the transitions from engorged larvae to feeding nymphs, engorged nymphs to feeding adults, and engorged adult females to feeding larvae. The simulated seasonal patterns were compared against field observations at three locations in the United States. Simulations suggested that 1) during the larva-to-nymph transition, some larvae undergo no diapause while others undergo morphogenetic diapause of engorged larvae; 2) molted adults undergo behavioral diapause during the transition from nymph-to-adult; and 3) there is no diapause during the adult-to-larva transition. A model constructed to simulate the full lifecycle of A. americanum successfully predicted observed tick activity at the three U.S. study locations. Some differences between observed and simulated seasonality patterns were observed, however, identifying the need for research to refine some model parameters. In simulations run using temperature data for Montreal, deterministic die-out of A. americanum populations did not occur, suggesting the possibility that current climate in parts of southern Canada is suitable for survival and reproduction of this tick.

  18. A dynamic population model to investigate effects of climate and climate-independent factors on the lifecycle of the tick Amblyomma americanum (Acari: Ixodidae)

    USGS Publications Warehouse

    Ludwig, Antoinette; Ginsberg, Howard; Hickling, Graham J.; Ogden, Nicholas H.

    2016-01-01

    The lone star tick, Amblyomma americanum, is a disease vector of significance for human and animal health throughout much of the eastern United States. To model the potential effects of climate change on this tick, a better understanding is needed of the relative roles of temperature-dependent and temperature-independent (day-length-dependent behavioral or morphogenetic diapause) processes acting on the tick lifecycle. In this study, we explored the roles of these processes by simulating seasonal activity patterns using models with site-specific temperature and day-length-dependent processes. We first modeled the transitions from engorged larvae to feeding nymphs, engorged nymphs to feeding adults, and engorged adult females to feeding larvae. The simulated seasonal patterns were compared against field observations at three locations in the United States. Simulations suggested that 1) during the larva-to-nymph transition, some larvae undergo no diapause while others undergo morphogenetic diapause of engorged larvae; 2) molted adults undergo behavioral diapause during the transition from nymph-to-adult; and 3) there is no diapause during the adult-to-larva transition. A model constructed to simulate the full lifecycle of A. americanum successfully predicted observed tick activity at the three U.S. study locations. Some differences between observed and simulated seasonality patterns were observed, however, identifying the need for research to refine some model parameters. In simulations run using temperature data for Montreal, deterministic die-out of A. americanum populations did not occur, suggesting the possibility that current climate in parts of southern Canada is suitable for survival and reproduction of this tick.

  19. Seals Code Development Workshop

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C. (Compiler); Liang, Anita D. (Compiler)

    1996-01-01

    The 1995 Seals Workshop industrial code (INDSEAL) release includes ICYL, GCYLT, IFACE, GFACE, SPIRALG, SPIRALI, DYSEAL, and KTK. The scientific code (SCISEAL) release includes conjugate heat transfer and a multidomain capability with rotordynamics. Several seals and bearings codes (e.g., HYDROFLEX, HYDROTRAN, HYDROB3D, FLOWCON1, FLOWCON2) are presented and their results compared. Current computational and experimental emphasis includes multiple connected cavity flows, with the goals of reducing parasitic losses and gas ingestion. Labyrinth seals continue to play a significant role in sealing, with face, honeycomb, and new sealing concepts under investigation for advanced engine concepts in view of strict environmental constraints. The clean-sheet approach to engine design is advocated, with program directions and anticipated percentage SFC reductions cited. Future activities center on engine applications with coupled seal/power/secondary flow streams.

  20. Implementation issues in source coding

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Chen, Yun-Chung; Hadenfeldt, A. C.

    1989-01-01

    An edge preserving image coding scheme which can be operated in both a lossy and a lossless manner was developed. The technique is an extension of the lossless encoding algorithm developed for the Mars observer spectral data. It can also be viewed as a modification of the DPCM algorithm. A packet video simulator was also developed from an existing modified packet network simulator. The coding scheme for this system is a modification of the mixture block coding (MBC) scheme described in the last report. Coding algorithms for packet video were also investigated.

  1. On multilevel block modulation codes

    NASA Technical Reports Server (NTRS)

    Kasami, Tadao; Takata, Toyoo; Fujiwara, Toru; Lin, Shu

    1991-01-01

    The multilevel (ML) technique for combining block coding and modulation is investigated. A general formulation is presented for ML modulation codes in terms of component codes with appropriate distance measures. A specific method for constructing ML block modulation codes (MLBMCs) with interdependency among component codes is proposed. Given an MLBMC C with no interdependency among the binary component codes, the proposed method gives an MLBMC C-prime that has the same rate as C, a minimum squared Euclidean distance not less than that of C, a trellis diagram with the same number of states as that of C, and a smaller number of nearest-neighbor codewords than that of C. Finally, a technique is presented for analyzing the error performance of MLBMCs for an additive white Gaussian noise channel based on soft-decision maximum-likelihood decoding.
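
    The minimum squared Euclidean distance that drives this analysis can be computed directly for a toy code. The sketch below is not from the paper: it maps a small binary block code to BPSK symbols, for which the squared Euclidean distance between two codewords is four times their Hamming distance.

```python
from itertools import combinations, product

# Toy version of the distance measure behind block modulation codes: map
# bits to BPSK symbols (0 -> +1, 1 -> -1) and take the minimum squared
# Euclidean distance over all codeword pairs; for BPSK this is 4 * d_min.

def bpsk(word):
    return [1 - 2 * b for b in word]

def min_sq_euclidean(code):
    return min(
        sum((x - y) ** 2 for x, y in zip(bpsk(u), bpsk(v)))
        for u, v in combinations(code, 2)
    )

# [3,2] single-parity-check code: all even-weight binary words of length 3.
spc = [w for w in product((0, 1), repeat=3) if sum(w) % 2 == 0]
print(min_sq_euclidean(spc))  # 4 * d_min = 4 * 2 = 8
```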

  2. Experimental investigation of neutronic characteristics of the IR-8 reactor to confirm the results of calculations by MCU-PTR code

    SciTech Connect

    Surkov, A. V.; Kochkin, V. N.; Pesnya, Yu. E.; Nasonov, V. A.; Vihrov, V. I.; Erak, D. Yu.

    2015-12-15

    A comparison of measured and calculated neutronic characteristics (fast neutron flux and fission rate of ²³⁵U) in the core and reflector of the IR-8 reactor is presented. The irradiation devices equipped with neutron activation detectors were prepared. The determination of fast neutron flux was performed using the ⁵⁴Fe (n, p) and ⁵⁸Ni (n, p) reactions. The ²³⁵U fission rate was measured using uranium dioxide with 10% enrichment in ²³⁵U. The determination of specific activities of detectors was carried out by measuring the intensity of characteristic gamma peaks using the ORTEC gamma spectrometer. Neutron fields in the core and reflector of the IR-8 reactor were calculated using the MCU-PTR code.

  3. Cell-free synthesis of functional human epidermal growth factor receptor: Investigation of ligand-independent dimerization in Sf21 microsomal membranes using non-canonical amino acids

    PubMed Central

    Quast, Robert B.; Ballion, Biljana; Stech, Marlitt; Sonnabend, Andrei; Varga, Balázs R.; Wüstenhagen, Doreen A.; Kele, Péter; Schiller, Stefan M.; Kubick, Stefan

    2016-01-01

    Cell-free protein synthesis systems represent versatile tools for the synthesis and modification of human membrane proteins. In particular, eukaryotic cell-free systems provide a promising platform for their structural and functional characterization. Here, we present the cell-free synthesis of functional human epidermal growth factor receptor and its vIII deletion mutant in a microsome-containing system derived from cultured Sf21 cells. We provide evidence for embedment of cell-free synthesized receptors into microsomal membranes and asparagine-linked glycosylation. Using the cricket paralysis virus internal ribosome entry site and a repetitive synthesis approach enrichment of receptors inside the microsomal fractions was facilitated thereby providing analytical amounts of functional protein. Receptor tyrosine kinase activation was demonstrated by monitoring receptor phosphorylation. Furthermore, an orthogonal cell-free translation system that provides the site-directed incorporation of p-azido-L-phenylalanine is characterized and applied to investigate receptor dimerization in the absence of a ligand by photo-affinity cross-linking. Finally, incorporated azides are used to generate stable covalently linked receptor dimers by strain-promoted cycloaddition using a novel linker system. PMID:27670253

  4. The Integrated TIGER Series Codes

    SciTech Connect

    Kensek, Ronald P.; Franke, Brian C.; Laub, Thomas W.

    2006-01-15

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user-friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  5. Medical coding in clinical trials.

    PubMed

    Babre, Deven

    2010-01-01

    Data generated in all clinical trials are recorded on the data collection instrument, the Case Report Form / Electronic Case Report Form, by investigators located at various sites in various countries. In multicentric clinical trials, since the investigators or medically qualified experts reporting the terms are at different sites/centers, recording the medical term(s) uniformly is a big challenge. Medical coders from the clinical data management team process these terms and perform medical coding. Medical coding is performed to categorize the reported medical terms appropriately so that they can be analyzed/reviewed. This article describes the process used for medical coding in clinical data management and briefly covers the two most commonly used medical dictionaries, MedDRA and WHO-DDE. It is expected to help medical coders understand the process of medical coding in clinical data management. A few common issues that medical coders face while performing medical coding are also highlighted.

  6. Nevada Nuclear Waste Storage Investigations Project: Unit evaluation at Yucca Mountain, Nevada Test Site: Near-field thermal and mechanical calculations using the SANDIA-ADINA code

    SciTech Connect

    Johnson, R.L.; Bauer, S.J.

    1987-05-01

    Presented in this report are the results of a comparative study of two candidate horizons, the welded, devitrified Topopah Spring Member of the Paintbrush Tuff, and the nonwelded, zeolitized Tuffaceous Beds of Calico Hills. The mechanical and thermomechanical response of these two horizons was assessed by conducting thermal and thermomechanical calculations using a two-dimensional room and pillar geometry of the vertical waste emplacement option, using average and limit properties for each. A modified version of the computer code ADINA (SANDIA-ADINA) containing a material model for rock masses with ubiquitous jointing was used in the calculations. Results of the calculations are presented as the units' capacity for storage of nuclear waste and the stability of the emplacement room and pillar due to excavation and long-term heating. A comparison is made with a similar underground opening geometry sited in Grouse Canyon Tuff, using properties obtained from G-Tunnel, a horizon of known excavation characteristics. Long-term stability of the excavated rooms was predicted for all units, as determined by evaluating regions of predicted joint slip resulting from excavation and subsequent thermal loading, evaluating regions of predicted rock matrix failure resulting from excavation and subsequent thermal loading, and evaluating safety factors against rock matrix failure. These results were derived by considering a wide range of material properties and in situ stresses. 21 refs., 21 figs., 5 tabs.

  7. Investigation on iterative multiuser detection physical layer network coding in two-way relay free-space optical links with turbulences and pointing errors.

    PubMed

    Abu-Almaalie, Zina; Ghassemlooy, Zabih; Bhatnagar, Manav R; Le-Minh, Hoa; Aslam, Nauman; Liaw, Shien-Kuei; Lee, It Ee

    2016-11-20

    Physical layer network coding (PNC) improves the throughput in wireless networks by enabling two nodes to exchange information using a minimum number of time slots. The PNC technique is proposed for two-way relay channel free space optical (TWR-FSO) communications with the aim of maximizing the utilization of network resources. The multipair TWR-FSO is considered in this paper, where a single antenna on each pair seeks to communicate via a common receiver aperture at the relay. Therefore, chip interleaving is adopted as a technique to separate the different transmitted signals at the relay node to perform PNC mapping. Accordingly, this scheme relies on the iterative multiuser technique for detection of users at the receiver. The bit error rate (BER) performance of the proposed system is examined under the combined influences of atmospheric loss, turbulence-induced channel fading, and pointing errors (PEs). By adopting the joint PNC mapping with interleaving and multiuser detection techniques, the BER results show that the proposed scheme can achieve a significant performance improvement against the degrading effects of turbulences and PEs. It is also demonstrated that a larger number of simultaneous users can be supported with this new scheme in establishing a communication link between multiple pairs of nodes in two time slots, thereby improving the channel capacity.
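
    The two-slot exchange that PNC enables can be reduced to its algebraic core: in slot 1 both nodes transmit and the relay recovers the XOR of their bits; in slot 2 it broadcasts that XOR, and each node strips its own bits to recover the other's message. The bit values below are arbitrary, and all channel impairments studied in the paper (turbulence, pointing errors, multiuser interference) are omitted.

```python
# PNC on a two-way relay channel, reduced to bit algebra (no channel model).

def xor_bits(a, b):
    return [x ^ y for x, y in zip(a, b)]

msg_a = [1, 0, 1, 1, 0]
msg_b = [0, 0, 1, 0, 1]

relay = xor_bits(msg_a, msg_b)   # slot 1: relay forms the PNC mapping A XOR B
at_a = xor_bits(relay, msg_a)    # slot 2: node A removes its own bits -> B's message
at_b = xor_bits(relay, msg_b)    # slot 2: node B removes its own bits -> A's message

assert at_a == msg_b and at_b == msg_a
print(relay)  # [1, 0, 0, 1, 1]
```

    Two time slots thus suffice for a full bidirectional exchange, versus four with plain store-and-forward relaying, which is the throughput gain the abstract refers to.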

  8. Performance of convolutionally coded unbalanced QPSK systems

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Yuen, J. H.

    1980-01-01

    An evaluation is presented of the performance of three representative convolutionally coded unbalanced quadriphase-shift-keying (UQPSK) systems in the presence of noisy carrier reference and crosstalk. The use of a coded UQPSK system for transmitting two telemetry data streams with different rates and different powers has been proposed for the Venus Orbiting Imaging Radar mission. Analytical expressions for bit error rates in the presence of a noisy carrier phase reference are derived for three representative cases: (1) the I and Q channels are coded independently; (2) the I channel is coded and the Q channel is uncoded; and (3) the I and Q channels are coded with a common rate-1/2 code. For rate-1/2 convolutional codes, QPSK modulation can be used to reduce the bandwidth requirement.
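
    In case (1), each channel carries its own convolutionally encoded stream. As a hedged illustration of the kind of code involved, the sketch below implements a generic rate-1/2, constraint-length-3 convolutional encoder with generators 7 and 5 (octal), a common textbook choice and not necessarily the code proposed for the mission:

```python
def conv_encode(bits, g1=0b111, g2=0b101):
    """Rate-1/2 convolutional encoder, constraint length 3,
    generators (7, 5) octal: two parity bits out per input bit."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b111      # shift the new bit in
        out.append(bin(state & g1).count("1") % 2)  # parity under g1
        out.append(bin(state & g2).count("1") % 2)  # parity under g2
    return out
```

    The impulse response of this encoder is 11 10 11, the standard (7, 5) result; the output stream is twice the input length, which is the bandwidth doubling that QPSK modulation then absorbs.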

  9. Role of long non-coding RNA HULC in cell proliferation, apoptosis and tumor metastasis of gastric cancer: a clinical and in vitro investigation.

    PubMed

    Zhao, Yan; Guo, Qinhao; Chen, Jiejing; Hu, Jun; Wang, Shuwei; Sun, Yueming

    2014-01-01

    Long non-coding RNAs (lncRNAs) are emerging as key molecules in human cancer. Highly upregulated in liver cancer (HULC), an lncRNA, has recently been revealed to be involved in hepatocellular carcinoma development and progression. It remains unclear, however, whether HULC plays an oncogenic role in human gastric cancer (GC). In the present study, we demonstrated that HULC was significantly overexpressed in GC cell lines and GC tissues compared with normal controls, and this overexpression was correlated with lymph node metastasis, distant metastasis and advanced tumor node metastasis (TNM) stages. In addition, a receiver operating characteristic (ROC) curve was constructed to evaluate the diagnostic value; the area under the ROC curve for HULC was 0.769. To uncover its functional importance, gain- and loss-of-function studies were performed to evaluate the effect of HULC on cell proliferation, apoptosis and invasion in vitro. Overexpression of HULC promoted proliferation and invasion and inhibited apoptosis in SGC7901 cells, while knockdown of HULC in SGC7901 cells had the opposite effect. Mechanistically, we discovered that overexpression of HULC could induce autophagy in SGC7901 cells; more importantly, inhibition of autophagy increased apoptosis in HULC-overexpressing cells. We also determined that silencing of HULC effectively reversed the epithelial-to-mesenchymal transition (EMT) phenotype. In summary, our results suggest that HULC may play an important role in the growth and tumorigenesis of human GC, providing a new biomarker in GC and perhaps a potential target for GC prevention, diagnosis and therapeutic treatment.

  10. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    PubMed

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, multiple-coding and single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control.
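
    The two hypotheses amount to different mappings from sample duration to response rule, and can be stated as a minimal sketch; the key names and dictionary form are illustrative, not from the study:

```python
# Toy statement of the two coding hypotheses from the procedure above.

def multiple_coding(duration):
    """One response rule per trained sample duration; untrained
    durations fall under no rule at all (returns None)."""
    rules = {2: "short_key", 6: "long_key", 18: "long_key"}
    return rules.get(duration)

def single_code_default(duration):
    """One rule for the 2-s sample, a 'default' rule for anything else,
    including durations never presented in training."""
    return "short_key" if duration == 2 else "long_key"
```

    The generalization test at 3.5 s separates the hypotheses precisely because the two functions disagree there: single-code/default assigns the default ("long") key to any non-2-s duration, while multiple-coding has no trained rule for 3.5 s.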

  11. Independent Peer Reviews

    SciTech Connect

    2012-03-16

    Independent Assessments: DOE's Systems Integrator convenes independent technical reviews to gauge progress toward meeting specific technical targets and to provide technical information necessary for key decisions.

  12. Covariance Matrix Evaluations for Independent Mass Fission Yields

    SciTech Connect

    Terranova, N.; Serot, O.; Archier, P.; De Saint Jean, C.

    2015-01-15

    Recent needs for more accurate fission product yields include covariance information to allow improved uncertainty estimation for the parameters used by design codes. The aim of this work is to investigate the possibility of generating more reliable and complete uncertainty information on independent mass fission yields. Mass yield covariances are estimated through a convolution between the multi-Gaussian empirical model based on Brosa's fission modes, which describes the pre-neutron mass yields, and the average prompt neutron multiplicity curve. The covariance generation task has been approached using the Bayesian generalized least-squares method through the CONRAD code. Preliminary results on the mass yield variance-covariance matrix will be presented and discussed on physical grounds for the {sup 235}U(n{sub th}, f) and {sup 239}Pu(n{sub th}, f) reactions.

  13. SU-E-T-521: Investigation of the Uncertainties Involved in Secondary Neutron/gamma Production in Geant4/MCNP6 Monte Carlo Codes for Proton Therapy Application

    SciTech Connect

    Mirzakhanian, L; Enger, S; Giusti, V

    2015-06-15

    Purpose: A major concern in proton therapy is the production of secondary neutrons causing secondary cancers, especially in young adults and children. The most widely used Monte Carlo codes in proton therapy are Geant4 and MCNP. However, the default versions of Geant4 and MCNP6 do not have suitable cross sections or physical models to properly handle secondary particle production in the proton energy ranges used for therapy. In this study, the default versions of Geant4 and MCNP6 were modified to better handle the production of secondaries by adding the TENDL-2012 cross-section library. Methods: In-water proton depth-dose was measured at the “The Svedberg Laboratory” in Uppsala (Sweden). The proton beam was mono-energetic with a mean energy of 178.25±0.2 MeV. The measurement set-up was simulated with Geant4 version 10.00 (default and modified versions) and MCNP6. Proton depth-dose, primary and secondary particle fluence, and neutron equivalent dose were calculated. In the case of Geant4, the secondary particle fluence was filtered by all the physics processes to identify the main process responsible for the difference between the default and modified versions. Results: The proton depth-dose curves and primary proton fluence show good agreement between both Geant4 versions and MCNP6. With respect to the modified version, default Geant4 underestimates the production of secondary neutrons while overestimating that of gammas. The “ProtonInElastic” process was identified as the process mainly responsible for the difference between the two versions. MCNP6 shows higher neutron production and lower gamma production than both Geant4 versions. Conclusion: Despite the good agreement on the proton depth-dose curve and primary proton fluence, there is a significant discrepancy in secondary neutron production between MCNP6 and both versions of Geant4. Further studies are thus in order to find the possible cause of this discrepancy or more accurate cross-sections/models to handle the nuclear interactions.

  14. Diagnostic Coding for Epilepsy.

    PubMed

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  15. Model Children's Code.

    ERIC Educational Resources Information Center

    New Mexico Univ., Albuquerque. American Indian Law Center.

    The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…

  16. Phylogeny of genetic codes and punctuation codes within genetic codes.

    PubMed

    Seligmann, Hervé

    2015-03-01

    Punctuation codons (starts, stops) delimit genes and reflect translation apparatus properties. Most codon reassignments involve punctuation. Here two complementary approaches classify natural genetic codes: (A) properties of amino acids assigned to codons (classical phylogeny), coding stops as X (A1, antitermination/suppressor tRNAs insert unknown residues) or as gaps (A2, no translation, classical stop); and (B) considering only punctuation status (start, stop and other codons coded as -1, 0 and 1 (B1); 0, -1 and 1 (B2, reflecting ribosomal translational dynamics); and 1, -1 and 0 (B3, starts/stops as opposites)). All methods separate most mitochondrial codes from most nuclear codes; Gracilibacteria consistently cluster with metazoan mitochondria; mitochondria co-hosted with chloroplasts cluster with nuclear codes. Method A1 clusters the euplotid nuclear code with metazoan mitochondria; A2 separates euplotids from mitochondria. The Firmicute bacteria Mycoplasma/Spiroplasma and protozoan (and lower metazoan) mitochondria share codon-amino acid assignments: A1 clusters them with mitochondria, while under A2 they cluster with the standard genetic code; constraints on amino acid ambiguity versus punctuation signaling produced the mitochondrial versus bacterial versions of this genetic code. Punctuation analysis B2 converges best with classical phylogenetic analyses, stressing the need for a unified theory of genetic code punctuation accounting for ribosomal constraints.
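
    The three punctuation-only codings B1-B3 can be sketched as follows. The value assignments follow the descriptions above; the codon table is a tiny illustrative fragment of the standard code written as DNA triplets, not a full genetic code:

```python
# Map codons to the punctuation-only numeric codings B1, B2, B3.

def punctuation_vector(codons, scheme="B1"):
    """Encode each codon by its punctuation status only."""
    starts, stops = {"ATG"}, {"TAA", "TAG", "TGA"}  # standard-code fragment
    values = {
        "B1": {"start": -1, "stop": 0, "other": 1},
        "B2": {"start": 0, "stop": -1, "other": 1},  # ribosomal dynamics
        "B3": {"start": 1, "stop": -1, "other": 0},  # starts/stops as opposites
    }[scheme]

    def status(codon):
        return "start" if codon in starts else "stop" if codon in stops else "other"

    return [values[status(c)] for c in codons]

vec = punctuation_vector(["ATG", "GGC", "TAA"], scheme="B2")
```

    Running the classifier over the codon tables of many genetic codes yields one such vector per code, and these vectors are what the phylogenetic clustering in approach (B) compares.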

  17. Coded Modulations for Mobile Satellite Communication Channels

    NASA Astrophysics Data System (ADS)

    Rhee, Dojun

    1995-01-01

    The mobile satellite (MSAT) channel is subject to multipath fading, shadowing, Doppler frequency shift, and adjacent channel interference (ACI). Therefore, transmitted signals face severe amplitude and phase distortions. This dissertation investigates various high-performance, low-decoding-complexity coded modulation schemes for reliable voice and data transmission over the shadowed mobile satellite channel and the Rayleigh fading channel. The dissertation consists of four parts. The first part presents a systematic technique for constructing MPSK trellis coded modulation (TCM) codes for voice transmission over the MSAT channel. The multilevel coding method is used for constructing TCM codes using convolutional codes with good free branch distances as the component codes, or using both convolutional and block codes as the component codes. Simulation results show that these codes achieve good coding gains over the uncoded reference system and outperform existing TCM codes with the same decoding complexity. In the second part, using the multilevel coding method, multilevel block coded modulation (BCM) codes are constructed for voice transmission over the MSAT channel. Even though BCM is generally less power efficient than TCM for AWGN channels, BCM has great potential to compete with TCM on the MSAT channel because of its shorter decoding depth and hence more effective interleaving. Binary Reed-Muller (RM) codes of length up to 32 are used as component codes. Simulation results show that these codes achieve good coding gains over the uncoded reference system and outperform TCM codes with the same decoding complexity. In the third part, a simple and systematic technique for constructing multilevel concatenated BCM schemes for data transmission over the shadowed MSAT channel and the Rayleigh fading channel is presented. These schemes are designed to achieve high performance or large coding gain with reduced decoding complexity. Construction is based on a

  18. Temporal Codes for Memories: Issues and Problems. Technical Report.

    ERIC Educational Resources Information Center

    Underwood, Benton J.

    Little is known about the accuracy of temporal codes for memories, and still less about the nature of the codes. This report addresses the central question of the mechanisms by which order information is attached to memories. The results of 16 experiments indicate the role of some independent variables on temporal coding in relatively short-term…

  19. Dual-code quantum computation model

    NASA Astrophysics Data System (ADS)

    Choi, Byung-Soo

    2015-08-01

    In this work, we propose the dual-code quantum computation model—a fault-tolerant quantum computation scheme which alternates between two different quantum error-correction codes. Since the chosen two codes have different sets of transversal gates, we can implement a universal set of gates transversally, thereby reducing the overall cost. We use code teleportation to convert between quantum states in different codes. The overall cost is decreased if code teleportation requires fewer resources than the fault-tolerant implementation of the non-transversal gate in a specific code. To analyze the cost reduction, we investigate two cases with different base codes, namely the Steane and Bacon-Shor codes. For the Steane code, neither the proposed dual-code model nor another variation of it achieves any cost reduction since the conventional approach is simple. For the Bacon-Shor code, the three proposed variations of the dual-code model reduce the overall cost. However, as the encoding level increases, the cost reduction decreases and becomes negative. Therefore, the proposed dual-code model is advantageous only when the encoding level is low and the cost of the non-transversal gate is relatively high.

  20. The Proteomic Code: a molecular recognition code for proteins

    PubMed Central

    Biro, Jan C

    2007-01-01

    Background The Proteomic Code is a set of rules by which information in genetic material is transferred into the physico-chemical properties of amino acids. It determines how individual amino acids interact with each other during folding and in specific protein-protein interactions. The Proteomic Code is part of the redundant Genetic Code. Review The 25-year-old history of this concept is reviewed from the first independent suggestions by Biro and Mekler, through the works of Blalock, Root-Bernstein, Siemion, Miller and others, followed by the discovery of a Common Periodic Table of Codons and Nucleic Acids in 2003 and culminating in the recent conceptualization of partial complementary coding of interacting amino acids as well as the theory of the nucleic acid-assisted protein folding. Methods and conclusions A novel cloning method for the design and production of specific, high-affinity-reacting proteins (SHARP) is presented. This method is based on the concept of proteomic codes and is suitable for large-scale, industrial production of specifically interacting peptides. PMID:17999762

  1. Investigating the Influence of the Added Mass Effect to Marine Hydrokinetic Horizontal-Axis Turbines Using a General Dynamic Wake Wind Turbine Code: Preprint

    SciTech Connect

    Maniaci, D. C.; Li, Y.

    2012-04-01

    This paper describes a recent study to investigate the applicability of a horizontal-axis wind turbine (HAWT) structural dynamics and unsteady aerodynamics analysis program (FAST and AeroDyn respectively) to modeling the forces on marine hydrokinetic (MHK) turbines. It summarizes the added mass model that has been added to AeroDyn. The added mass model only includes flow acceleration perpendicular to the rotor disc, and ignores added mass forces caused by blade deflection. A model of the National Renewable Energy Laboratory's (NREL) Unsteady Aerodynamics Experiment (UAE) Phase VI wind turbine was analyzed using FAST and AeroDyn with sea water conditions and the new added mass model. The results of this analysis exhibited a 3.6% change in thrust for a rapid pitch case and a slight change in amplitude and phase of thrust for a case with 30 degrees of yaw.

  2. Investigating the Influence of the Added Mass Effect to Marine Hydrokinetic Horizontal-Axis Turbines Using a General Dynamic Wake Wind Turbine Code

    SciTech Connect

    Maniaci, D. C.; Li, Y.

    2011-10-01

    This paper describes a recent study to investigate the applicability of a horizontal-axis wind turbine (HAWT) structural dynamics and unsteady aerodynamics analysis program (FAST and AeroDyn respectively) to modeling the forces on marine hydrokinetic (MHK) turbines. This paper summarizes the added mass model that has been added to AeroDyn. The added mass model only includes flow acceleration perpendicular to the rotor disc, and ignores added mass forces caused by blade deflection. A model of the National Renewable Energy Laboratory's (NREL) Unsteady Aerodynamics Experiment (UAE) Phase VI wind turbine was analyzed using FAST and AeroDyn with sea water conditions and the new added mass model. The results of this analysis exhibited a 3.6% change in thrust for a rapid pitch case and a slight change in amplitude and phase of thrust for a case with 30° of yaw.

  3. Non-White, No More: Effect Coding as an Alternative to Dummy Coding with Implications for Higher Education Researchers

    ERIC Educational Resources Information Center

    Mayhew, Matthew J.; Simonoff, Jeffrey S.

    2015-01-01

    The purpose of this article is to describe effect coding as an alternative quantitative practice for analyzing and interpreting categorical, race-based independent variables in higher education research. Unlike indicator (dummy) codes that imply that one group will be a reference group, effect codes use average responses as a means for…
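
    The contrast between the two schemes can be made concrete with a small sketch in plain Python (the group labels are illustrative): dummy coding measures each group against an omitted reference group, while effect coding measures each group against the unweighted grand mean:

```python
# Design-matrix columns for a categorical variable under two codings.

def dummy_code(labels, levels):
    """Indicator (dummy) coding: the last level is the omitted
    reference group, coded all zeros."""
    return [[1 if lab == lv else 0 for lv in levels[:-1]] for lab in labels]

def effect_code(labels, levels):
    """Effect coding: the reference level is coded -1 in every column,
    so each coefficient is a deviation from the unweighted grand mean
    rather than from a reference group."""
    ref = levels[-1]
    return [[(-1 if lab == ref else (1 if lab == lv else 0))
             for lv in levels[:-1]]
            for lab in labels]

levels = ["GroupA", "GroupB", "GroupC"]  # illustrative group labels
X_dummy = dummy_code(["GroupA", "GroupB", "GroupC"], levels)
X_effect = effect_code(["GroupA", "GroupB", "GroupC"], levels)
```

    The only rows that differ are those of the reference group ([0, 0] versus [-1, -1]), yet that difference changes the interpretation of every regression coefficient, which is the article's point.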

  4. Energy efficient rateless codes for high speed data transfer over free space optical channels

    NASA Astrophysics Data System (ADS)

    Prakash, Geetha; Kulkarni, Muralidhar; Acharya, U. S.

    2015-03-01

    Terrestrial Free Space Optical (FSO) links transmit information by using the atmosphere (free space) as a medium. In this paper, we have investigated the use of Luby Transform (LT) codes as a means to mitigate the effects of data corruption induced by imperfect channel which usually takes the form of lost or corrupted packets. LT codes, which are a class of Fountain codes, can be used independent of the channel rate and as many code words as required can be generated to recover all the message bits irrespective of the channel performance. Achieving error free high data rates with limited energy resources is possible with FSO systems if error correction codes with minimal overheads on the power can be used. We also employ a combination of Binary Phase Shift Keying (BPSK) with provision for modification of threshold and optimized LT codes with belief propagation for decoding. These techniques provide additional protection even under strong turbulence regimes. Automatic Repeat Request (ARQ) is another method of improving link reliability. Performance of ARQ is limited by the number of retransmissions and the corresponding time delay. We prove through theoretical computations and simulations that LT codes consume less energy per bit. We validate the feasibility of using energy efficient LT codes over ARQ for FSO links to be used in optical wireless sensor networks within the eye safety limits.
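
    A minimal LT-code sketch illustrates the rateless property described above: the encoder can emit as many coded symbols as needed, each the XOR of a random subset of source packets, and a belief-propagation (peeling) decoder recovers the message from whichever symbols arrive. The degree distribution below is a toy stand-in for the robust soliton distribution used in practice:

```python
import random

def lt_encode(source, n_symbols, seed=1):
    """Emit n_symbols LT symbols: (neighbor set, XOR of those packets)."""
    rng = random.Random(seed)
    k = len(source)
    symbols = []
    for _ in range(n_symbols):
        degree = min(k, rng.choice([1, 2, 2, 3, 4]))  # toy degree distribution
        neighbors = frozenset(rng.sample(range(k), degree))
        value = 0
        for i in neighbors:
            value ^= source[i]
        symbols.append((neighbors, value))
    return symbols

def lt_decode(symbols, k):
    """Peeling decoder: repeatedly resolve symbols with one unknown."""
    decoded = {}
    pending = [(set(nb), val) for nb, val in symbols]
    progress = True
    while progress and len(decoded) < k:
        progress = False
        for nb, val in pending:
            unknown = nb - decoded.keys()
            if len(unknown) == 1:
                reduced = val
                for i in nb & decoded.keys():
                    reduced ^= decoded[i]   # strip known packets
                (i,) = unknown
                decoded[i] = reduced
                progress = True
    return [decoded.get(i) for i in range(k)]

source = [0x5A, 0x3C, 0x7E, 0x11]            # toy one-byte "packets"
received = lt_encode(source, 12, seed=1)     # any subset may be lost
recovered = lt_decode(received, k=4)
```

    Because symbol generation does not depend on which earlier symbols were lost, the sender simply keeps emitting symbols until the receiver has decoded everything, which is why no feedback channel (as in ARQ) is required.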

  5. AEST: Adaptive Eigenvalue Stability Code

    NASA Astrophysics Data System (ADS)

    Zheng, L.-J.; Kotschenreuther, M.; Waelbroeck, F.; van Dam, J. W.; Berk, H.

    2002-11-01

    An adaptive eigenvalue linear stability code is developed. The aim is, on one hand, to include non-ideal MHD effects in the global MHD stability calculation for both low and high n modes and, on the other hand, to resolve the numerical difficulty involving the MHD singularity on the rational surfaces at marginal stability. Our code follows parts of the philosophy of DCON, abandoning relaxation methods based on radial finite-element expansion in favor of an efficient shooting procedure with adaptive gridding. The δW criterion is replaced by the shooting procedure and a subsequent matrix eigenvalue problem. Since the technique of expanding a general solution into a summation of the independent solutions is employed, the rank of the matrices involved is just a few hundred. This makes it easier to solve the eigenvalue problem with non-ideal MHD effects, such as FLR or even full kinetic effects, as well as the plasma rotation effect, taken into account. To include kinetic effects, the approach of solving for the distribution function as a local eigenvalue ω problem, as in the GS2 code, will be employed in the future. Comparison of the ideal MHD version of the code with DCON, PEST, and GATO will be discussed. The non-ideal MHD version of the code will be employed to study, as an application, transport barrier physics in tokamak discharges.

  6. Accumulate repeat accumulate codes

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative channel coding scheme called 'Accumulate Repeat Accumulate' (ARA) codes. This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, so belief propagation can be used for iterative decoding of ARA codes on a graph. The encoder structure for this class can be viewed as a precoded Repeat Accumulate (RA) code or as a precoded Irregular Repeat Accumulate (IRA) code, where an accumulator is simply chosen as the precoder. ARA codes thus have a simple and very fast encoder structure when represented as LDPC codes. Based on density evolution for LDPC codes, through some examples of ARA codes we show that for maximum variable node degree 5 a minimum bit SNR as low as 0.08 dB from channel capacity for rate 1/2 can be achieved as the block size goes to infinity. Thus, for a fixed low maximum variable node degree, its threshold outperforms not only the RA and IRA codes but also the best known LDPC codes with the same maximum node degree. Furthermore, by puncturing the accumulators any desired high-rate codes close to code rate 1 can be obtained, with thresholds that stay uniformly close to the channel capacity thresholds. Iterative decoding simulation results are provided. The ARA codes also have a projected graph or protograph representation that allows for high-speed decoder implementation.
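
    The encoder structure described above, an accumulator as precoder followed by repetition and a second accumulator, can be sketched as follows. The interleaver here is a toy even/odd permutation; real ARA codes define the interleaver and repetition profile through the protograph:

```python
# Sketch of an Accumulate-Repeat-Accumulate (ARA) encoder pipeline.

def accumulate(bits):
    """Running mod-2 sum, i.e. a 1/(1+D) accumulator."""
    out, acc = [], 0
    for b in bits:
        acc ^= b
        out.append(acc)
    return out

def ara_encode(bits, repeat=3):
    pre = accumulate(bits)                          # accumulator precoder
    rep = [b for b in pre for _ in range(repeat)]   # repetition stage
    interleaved = rep[::2] + rep[1::2]              # toy interleaver
    return accumulate(interleaved)                  # second accumulator
```

    Each stage is linear and streaming (one XOR per bit), which is the "simple and very fast encoder" property claimed in the abstract.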

  7. Concatenated Coding Using Trellis-Coded Modulation

    NASA Technical Reports Server (NTRS)

    Thompson, Michael W.

    1997-01-01

    In the late seventies and early eighties a technique known as Trellis Coded Modulation (TCM) was developed for providing spectrally efficient error correction coding. Instead of adding redundant information in the form of parity bits, redundancy is added at the modulation stage, thereby increasing bandwidth efficiency. A digital communications system can be designed to use bandwidth-efficient multilevel/phase modulation such as Amplitude Shift Keying (ASK), Phase Shift Keying (PSK), Differential Phase Shift Keying (DPSK) or Quadrature Amplitude Modulation (QAM). Performance gain can be achieved by increasing the number of signals over the corresponding uncoded system to compensate for the redundancy introduced by the code. A considerable amount of research and development has been devoted to developing good TCM codes for severely bandlimited applications. More recently, the use of TCM for satellite and deep space communications applications has received increased attention. This report describes the general approach of using a concatenated coding scheme that features TCM and RS coding. Results have indicated that substantial (6-10 dB) performance gains can be achieved with this approach with comparatively little bandwidth expansion. Since all of the bandwidth expansion is due to the RS code, we see that TCM-based concatenated coding results in roughly 10-50% bandwidth expansion, compared to 70-150% expansion for a similar concatenated scheme that uses a convolutional code. We stress that combined coding and modulation optimization is important for achieving performance gains while maintaining spectral efficiency.

  8. Coset Codes Viewed as Terminated Convolutional Codes

    NASA Technical Reports Server (NTRS)

    Fossorier, Marc P. C.; Lin, Shu

    1996-01-01

    In this paper, coset codes are considered as terminated convolutional codes. Based on this approach, three new general results are presented. First, it is shown that the iterative squaring construction can equivalently be defined from a convolutional code whose trellis terminates. This convolutional code determines a simple encoder for the coset code considered, and the state and branch labelings of the associated trellis diagram become straightforward. Also, from the generator matrix of the code in its convolutional code form, much information about the trade-off between the state connectivity and complexity at each section, and the parallel structure of the trellis, is directly available. Based on this generator matrix, it is shown that the parallel branches in the trellis diagram of the convolutional code represent the same coset code C(sub 1), of smaller dimension and shorter length. Utilizing this fact, a two-stage optimum trellis decoding method is devised. The first stage decodes C(sub 1), while the second stage decodes the associated convolutional code, using the branch metrics delivered by stage 1. Finally, a bidirectional decoding of each received block starting at both ends is presented. If about the same number of computations is required, this approach remains very attractive from a practical point of view as it roughly doubles the decoding speed. This fact is particularly interesting whenever the second half of the trellis is the mirror image of the first half, since the same decoder can be implemented for both parts.

  9. V(D)J recombination coding junction formation without DNA homology: processing of coding termini.

    PubMed Central

    Boubnov, N V; Wills, Z P; Weaver, D T

    1993-01-01

    Coding junction formation in V(D)J recombination generates diversity in the antigen recognition structures of immunoglobulin and T-cell receptor molecules by combining processes of deletion of terminal coding sequences and addition of nucleotides prior to joining. We have examined the role of coding end DNA composition in junction formation with plasmid substrates containing defined homopolymers flanking the recombination signal sequence elements. We found that coding junctions formed efficiently with or without terminal DNA homology. The extent of junctional deletion was conserved independent of coding ends with increased, partial, or no DNA homology. Interestingly, G/C homopolymer coding ends showed reduced deletion regardless of DNA homology. Therefore, DNA homology cannot be the primary determinant that stabilizes coding end structures for processing and joining. PMID:8413286

  10. Discussion on LDPC Codes and Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes, and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  11. Independent Study in Iowa.

    ERIC Educational Resources Information Center

    Idaho Univ., Moscow.

    This guide to independent study in Idaho begins with introductory information on the following aspects of independent study: the Independent Study in Idaho consortium, student eligibility, special needs, starting dates, registration, costs, textbooks and instructional materials, e-mail and faxing, refunds, choosing a course, time limits, speed…

  12. A new class of polyphase pulse compression codes

    NASA Astrophysics Data System (ADS)

    Deng, Hai; Lin, Maoyong

    The study presents a synthesis method for a new class of polyphase pulse compression codes, the NLFM code, and investigates its properties. The NLFM code, which is derived from sampling and quantization of a nonlinear FM waveform, features low range sidelobes and insensitivity to Doppler effects. Simulation results show that the major properties of the NLFM polyphase code are superior to those of the Frank code.
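
    The general construction, sampling the phase of an FM waveform and quantizing it to a finite set of phase states, can be sketched as below. The specific nonlinear FM law and quantizer of the paper are not given here; the quadratic phase law shown corresponds to linear FM and is a placeholder for which a nonlinear law would be substituted:

```python
import math

def polyphase_from_fm(phase_law, n, m_phases):
    """Sample a continuous phase law at n points and quantize each
    sample to one of m_phases equally spaced phase states."""
    step = 2 * math.pi / m_phases
    codes = []
    for i in range(n):
        phi = phase_law(i / n) % (2 * math.pi)   # wrap phase to [0, 2*pi)
        codes.append(round(phi / step) % m_phases)
    return codes

# Quadratic phase (linear FM chirp) as an illustrative placeholder.
chirp = polyphase_from_fm(lambda t: math.pi * 16 * t * t, n=16, m_phases=8)
```

    Each integer in the output indexes a transmitted phase of 2πk/M, so the code length and alphabet size are set independently of the underlying FM law.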

  13. Bar Codes for Libraries.

    ERIC Educational Resources Information Center

    Rahn, Erwin

    1984-01-01

    Discusses the evolution of standards for bar codes (series of printed lines and spaces that represent numbers, symbols, and/or letters of alphabet) and describes the two types most frequently adopted by libraries--Code-A-Bar and CODE 39. Format of the codes is illustrated. Six references and definitions of terminology are appended. (EJS)

  14. Manually operated coded switch

    DOEpatents

    Barnette, Jon H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made.

  15. CONTAIN independent peer review

    SciTech Connect

    Boyack, B.E.; Corradini, M.L.; Denning, R.S.; Khatib-Rahbar, M.; Loyalka, S.K.; Smith, P.N.

    1995-01-01

    The CONTAIN code was developed by Sandia National Laboratories under the sponsorship of the US Nuclear Regulatory Commission (NRC) to provide integrated analyses of containment phenomena. It is used to predict nuclear reactor containment loads, radiological source terms, and associated physical phenomena for a range of accident conditions encompassing both design-basis and severe accidents. The code's targeted applications include support for containment-related experimental programs, light water and advanced light water reactor plant analysis, and analytical support for resolution of specific technical issues such as direct containment heating. The NRC decided that a broad technical review of the code should be performed by technical experts to determine its overall technical adequacy. For this purpose, a six-member CONTAIN Peer Review Committee was organized and a peer review was conducted. While the review was in progress, the NRC issued a draft "Revised Severe Accident Code Strategy" that incorporated revised design objectives and targeted applications for the CONTAIN code. The Committee continued its effort to develop findings relative to the original NRC statement of design objectives and targeted applications; however, the revised design objectives and targeted applications were considered by the Committee in assigning priorities to its recommendations. The Committee determined that some improvements are warranted and provided recommendations in five code-related areas: (1) documentation, (2) user guidance, (3) modeling capability, (4) code assessment, and (5) technical assessment.

  16. An introduction to QR Codes: linking libraries and mobile patrons.

    PubMed

    Hoy, Matthew B

    2011-01-01

    QR codes, or "Quick Response" codes, are two-dimensional barcodes that can be scanned by mobile smartphone cameras. These codes can be used to provide fast access to URLs, telephone numbers, and short passages of text. With the rapid adoption of smartphones, librarians are able to use QR codes to promote services and help library users find materials quickly and independently. This article will explain what QR codes are, discuss how they can be used in the library, and describe issues surrounding their use. A list of resources for generating and scanning QR codes is also provided.

  17. Transionospheric Propagation Code (TIPC)

    SciTech Connect

    Roussel-Dupre, R.; Kelley, T.A.

    1990-10-01

    The Transionospheric Propagation Code is a computer program developed at Los Alamos National Lab to perform certain tasks related to the detection of VHF signals following propagation through the ionosphere. The code is written in Fortran 77, runs interactively and was designed to be as machine independent as possible. A menu format in which the user is prompted to supply appropriate parameters for a given task has been adopted for the input, while the output is primarily in the form of graphics. The user has the option of selecting from five basic tasks, namely transionospheric propagation, signal filtering, signal processing, DTOA study, and DTOA uncertainty study. For the first task, a specified signal is convolved against the impulse response function of the ionosphere to obtain the transionospheric signal. The user is given a choice of four analytic forms for the input pulse or of supplying a tabular form. The option of adding Gaussian-distributed white noise or spectral noise to the input signal is also provided. The deterministic ionosphere is characterized to first order in terms of a total electron content (TEC) along the propagation path. In addition, a scattering model parameterized in terms of a frequency coherence bandwidth is also available. In the second task, detection is simulated by convolving a given filter response against the transionospheric signal. The user is given a choice of a wideband filter or a narrowband Gaussian filter. It is also possible to input a filter response. The third task provides for quadrature detection, envelope detection, and three different techniques for time-tagging the arrival of the transionospheric signal at specified receivers. The latter algorithms can be used to determine a TEC and thus take out the effects of the ionosphere to first order. Task four allows the user to construct a table of delta-times-of-arrival (DTOAs) vs TECs for a specified pair of receivers.
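
    The DTOA-vs-TEC tables in task four rest on the standard first-order ionospheric group delay, dt = 40.3*TEC/(c*f^2). A minimal sketch of that relation (the TEC value below is an assumed, illustrative one, not a parameter of the code):

```python
# First-order ionospheric group delay: dt = 40.3 * TEC / (c * f^2),
# with TEC in electrons/m^2 and f in Hz.
C = 2.99792458e8  # speed of light, m/s

def group_delay(f_hz, tec):
    """Extra propagation delay (s) at frequency f_hz through a column of TEC."""
    return 40.3 * tec / (C * f_hz ** 2)

def dtoa(f1_hz, f2_hz, tec):
    """Delta-time-of-arrival between two frequencies for the same TEC."""
    return group_delay(f1_hz, tec) - group_delay(f2_hz, tec)

tec = 1.0e17            # assumed illustrative TEC, electrons/m^2
d = dtoa(100e6, 150e6, tec)  # VHF pair: the lower frequency arrives later
```

    Inverting this relation for a measured DTOA is what allows the TEC, and hence the first-order ionospheric effect, to be taken out.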

  18. QR Codes 101

    ERIC Educational Resources Information Center

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  19. ARA type protograph codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)

    2008-01-01

    An apparatus and method for encoding low-density parity check codes. Together with a repeater, an interleaver and an accumulator, the apparatus comprises a precoder, thus forming accumulate-repeat-accumulate (ARA codes). Protographs representing various types of ARA codes, including AR3A, AR4A and ARJA codes, are described. High performance is obtained when compared to the performance of current repeat-accumulate (RA) or irregular-repeat-accumulate (IRA) codes.

  20. Pulsed Inductive Thruster (PIT): Modeling and Validation Using the MACH2 Code

    NASA Technical Reports Server (NTRS)

    Schneider, Steven (Technical Monitor); Mikellides, Pavlos G.

    2003-01-01

    Numerical modeling of the Pulsed Inductive Thruster with the magnetohydrodynamics code MACH2 aims to provide bilateral validation of the thruster's measured performance and of the code's capability to capture the pertinent physical processes. Computed impulse values for helium and argon propellants demonstrate excellent correlation to the experimental data for a range of energy levels and propellant-mass values. The effects of the vacuum tank wall and the mass-injection scheme were investigated and shown to produce trivial changes in the overall performance. An idealized model for these energy levels and propellants deduces that the energy expended on the internal energy modes and plasma dissipation processes is independent of the propellant type, mass, and energy level.

  1. Cross-over component code construction for multi-level block modulation codes

    NASA Technical Reports Server (NTRS)

    Kasami, Tadao; Takata, Toyoo; Fujiwara, Toru; Lin, Shu

    1990-01-01

    This paper investigates the multilevel technique for combining block coding and modulation. Several specific methods for constructing multilevel block modulation codes with interdependency among component codes are presented. Given a multilevel block modulation code C with no interdependency among the binary component codes, the proposed methods give a multilevel block modulation code C-prime which has the same rate as C, a minimum squared Euclidean distance not less than that of C, a trellis diagram with the same number of states as that of C, and a smaller number of nearest neighbor codewords than that of C.

  2. Content Independence in Multimedia Databases.

    ERIC Educational Resources Information Center

    de Vries, Arjen P.

    2001-01-01

    Investigates the role of data management in multimedia digital libraries, and its implications for the design of database management systems. Introduces the notions of content abstraction and content independence. Proposes a blueprint of a new class of database technology, which supports the basic functionality for the management of both content…

  3. An Approach to Keeping Independent Colleges Independent.

    ERIC Educational Resources Information Center

    Northwest Area Foundation, St. Paul, Minn.

    As a result of the financial difficulties faced by independent colleges in the northwestern United States, the Northwest Area Foundation in 1972 surveyed the administrations of 80 private colleges to get a profile of the colleges, a list of their current problems, and some indication of how the problems might be approached. The three top problems…

  4. Coding Strategies and Implementations of Compressive Sensing

    NASA Astrophysics Data System (ADS)

    Tsai, Tsung-Han

    information from a noisy environment. Using engineering efforts to accomplish the same task usually requires multiple detectors, advanced computational algorithms, or artificial intelligence systems. Compressive acoustic sensing incorporates acoustic metamaterials in compressive sensing theory to emulate the abilities of sound localization and selective attention. This research investigates and optimizes the sensing capacity and the spatial sensitivity of the acoustic sensor. The well-modeled acoustic sensor allows localizing multiple speakers in both stationary and dynamic auditory scenes, and distinguishing mixed conversations from independent sources with a high audio recognition rate.

  5. Efficient entropy coding for scalable video coding

    NASA Astrophysics Data System (ADS)

    Choi, Woong Il; Yang, Jungyoup; Jeon, Byeungwoo

    2005-10-01

    The standardization for the scalable extension of H.264 has called for additional functionality based on the H.264 standard to support combined spatio-temporal and SNR scalability. For the entropy coding of the H.264 scalable extension, the Context-based Adaptive Binary Arithmetic Coding (CABAC) scheme has been considered so far. In this paper, we present a new context modeling scheme that uses inter-layer correlation between syntax elements. As a result, it improves the coding efficiency of entropy coding in the H.264 scalable extension. Simulation results of applying the proposed scheme to encoding the syntax element mb_type show a coding-efficiency improvement of up to 16% in terms of bit saving, owing to the estimation of a more adequate probability model.
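
    The benefit of conditioning the probability model on a correlated syntax element can be sketched with a count-based context model. This is an illustrative toy, not the H.264 CABAC state machine, and the base-layer-flag contexts below are hypothetical:

```python
class ContextModel:
    """Adaptive binary context model: each context keeps bin counts and
    estimates P(1 | context), so bins whose statistics differ (e.g. when
    conditioned on a base-layer syntax element) get separate, sharper
    probability models and therefore shorter arithmetic-coded lengths."""
    def __init__(self):
        self.counts = {}  # context -> [count_of_0, count_of_1]

    def p_one(self, ctx):
        c0, c1 = self.counts.get(ctx, [1, 1])  # Laplace-smoothed estimate
        return c1 / (c0 + c1)

    def update(self, ctx, bit):
        c = self.counts.setdefault(ctx, [1, 1])
        c[bit] += 1

model = ContextModel()
# Hypothetical training stream: under context 1 the bin is mostly one,
# under context 0 mostly zero.
for _ in range(90):
    model.update(1, 1)
    model.update(0, 0)
for _ in range(10):
    model.update(1, 0)
    model.update(0, 1)
```

    With a single pooled context the estimate would sit near 0.5; splitting by the correlated element drives each estimate toward its true conditional frequency, which is where the bit saving comes from.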

  6. Subsystem codes with spatially local generators

    SciTech Connect

    Bravyi, Sergey

    2011-01-15

    We study subsystem codes whose gauge group has local generators in two-dimensional (2D) geometry. It is shown that there exists a family of such codes defined on lattices of size L×L with the number of logical qubits k and the minimum distance d both proportional to L. The gauge group of these codes involves only two-qubit generators of type XX and ZZ coupling nearest-neighbor qubits (and some auxiliary one-qubit generators). Our proof is not constructive as it relies on a certain version of the Gilbert-Varshamov bound for classical codes. Along the way, we introduce and study properties of generalized Bacon-Shor codes that might be of independent interest. Secondly, we prove that any 2D subsystem [n,k,d] code with spatially local generators obeys upper bounds kd=O(n) and d²=O(n). The analogous upper bound proved recently for 2D stabilizer codes is kd²=O(n). Our results thus demonstrate that subsystem codes can be more powerful than stabilizer codes under the spatial locality constraint.

  7. Subsystem codes with spatially local generators

    NASA Astrophysics Data System (ADS)

    Bravyi, Sergey

    2011-01-01

    We study subsystem codes whose gauge group has local generators in two-dimensional (2D) geometry. It is shown that there exists a family of such codes defined on lattices of size L×L with the number of logical qubits k and the minimum distance d both proportional to L. The gauge group of these codes involves only two-qubit generators of type XX and ZZ coupling nearest-neighbor qubits (and some auxiliary one-qubit generators). Our proof is not constructive as it relies on a certain version of the Gilbert-Varshamov bound for classical codes. Along the way, we introduce and study properties of generalized Bacon-Shor codes that might be of independent interest. Secondly, we prove that any 2D subsystem [n,k,d] code with spatially local generators obeys upper bounds kd=O(n) and d²=O(n). The analogous upper bound proved recently for 2D stabilizer codes is kd²=O(n). Our results thus demonstrate that subsystem codes can be more powerful than stabilizer codes under the spatial locality constraint.
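
    The non-constructive step invokes a Gilbert-Varshamov-type counting argument. For classical binary codes, the bound guarantees a code of length n and minimum distance d with at least 2^n / V(n, d-1) codewords, where V(n, r) is the volume of a Hamming ball of radius r; a minimal sketch of that count:

```python
from math import comb

def gv_lower_bound(n, d):
    """Gilbert-Varshamov bound (binary case): a code of length n with
    minimum distance d and at least ceil(2**n / V(n, d-1)) codewords
    exists, because a maximal code's radius-(d-1) balls must cover
    all 2**n words."""
    ball = sum(comb(n, i) for i in range(d))  # V(n, d-1)
    size, rem = divmod(1 << n, ball)
    return size + (1 if rem else 0)

# Example: length 7, distance 3 -> at least 5 codewords are guaranteed
# (the Hamming [7,4] code, with 16, comfortably exceeds this).
bound = gv_lower_bound(7, 3)
```

    The bound only certifies existence by counting, which is why the subsystem-code construction above inherits its non-constructive character.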

  8. Certifying Auto-Generated Flight Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen

    2008-01-01

    Model-based design and automated code generation are being used increasingly at NASA. Many NASA projects now use MathWorks Simulink and Real-Time Workshop for at least some of their modeling and code development. However, there are substantial obstacles to more widespread adoption of code generators in safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. Moreover, the regeneration of code can require complete recertification, which offsets many of the advantages of using a generator. Indeed, manual review of autocode can be more challenging than for hand-written code. Since the direct V&V of code generators is too laborious and complicated due to their complex (and often proprietary) nature, we have developed a generator plug-in to support the certification of the auto-generated code. Specifically, the AutoCert tool supports certification by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews. The generated documentation also contains substantial tracing information, allowing users to trace between model, code, documentation, and V&V artifacts. This enables missions to obtain assurance about the safety and reliability of the code without excessive manual V&V effort and, as a consequence, eases the acceptance of code generators in safety-critical contexts. The generation of explicit certificates and textual reports is particularly well-suited to supporting independent V&V. The primary contribution of this approach is the combination of human-friendly documentation with formal analysis. The key technical idea is to exploit the idiomatic nature of auto-generated code in order to automatically infer logical annotations. The annotation inference algorithm…

  9. NERO- a post-maximum supernova radiation transport code

    NASA Astrophysics Data System (ADS)

    Maurer, I.; Jerkstrand, A.; Mazzali, P. A.; Taubenberger, S.; Hachinger, S.; Kromer, M.; Sim, S.; Hillebrandt, W.

    2011-12-01

    The interpretation of supernova (SN) spectra is essential for deriving SN ejecta properties such as density and composition, which in turn can tell us about their progenitors and the explosion mechanism. A very large number of atomic processes are important for spectrum formation. Several tools for calculating SN spectra exist, but they mainly focus on the very early or late epochs. The intermediate phase, which requires a non-local thermodynamic equilibrium (NLTE) treatment of radiation transport, has rarely been studied. In this paper, we present a new SN radiation transport code, NERO, which covers those epochs. All the atomic processes are treated in full NLTE, under a steady-state assumption. This is a valid approach between roughly 50 and 500 days after the explosion depending on SN type. This covers the post-maximum photospheric and the early and the intermediate nebular phase. As a test, we compare NERO to the radiation transport code of Jerkstrand, Fransson & Kozma and to the nebular code of Mazzali et al. All three codes have been developed independently and a comparison provides a valuable opportunity to investigate their reliability. Currently, NERO is one-dimensional and can be used for predicting spectra of synthetic explosion models or for deriving SN properties by spectral modelling. To demonstrate this, we study the spectra of the 'normal' Type Ia supernova (SN Ia) 2005cf between 50 and 350 days after the explosion and identify most of the common SN Ia line features at post-maximum epochs.

  10. Reusable State Machine Code Generator

    NASA Astrophysics Data System (ADS)

    Hoffstadt, A. A.; Reyes, C.; Sommer, H.; Andolfato, L.

    2010-12-01

    The State Machine model is frequently used to represent the behaviour of a system, allowing one to express and execute this behaviour in a deterministic way. A graphical representation such as a UML State Chart diagram tames the complexity of the system, thus facilitating changes to the model and communication between developers and domain experts. We present a reusable state machine code generator, developed by the Universidad Técnica Federico Santa María and the European Southern Observatory. The generator itself is based on the open source project architecture, and uses UML State Chart models as input. This allows for a modular design and a clean separation between generator and generated code. The generated state machine code has well-defined interfaces that are independent of the implementation artefacts such as the middle-ware. This allows using the generator in the substantially different observatory software of the Atacama Large Millimeter Array and the ESO Very Large Telescope. A project-specific mapping layer for event and transition notification connects the state machine code to its environment, which can be the Common Software of these projects, or any other project. This approach even allows tests for a generated state machine to be created automatically, using techniques from software testing such as path coverage.

  11. Coded continuous wave meteor radar

    NASA Astrophysics Data System (ADS)

    Vierinen, Juha; Chau, Jorge L.; Pfeffer, Nico; Clahsen, Matthias; Stober, Gunter

    2016-03-01

    The concept of a coded continuous wave specular meteor radar (SMR) is described. The radar uses a continuously transmitted pseudorandom phase-modulated waveform, which has several advantages compared to conventional pulsed SMRs. The coding avoids range and Doppler aliasing, which are in some cases problematic with pulsed radars. Continuous transmissions maximize pulse compression gain, allowing operation at lower peak power than a pulsed system. With continuous coding, the temporal and spectral resolution are not dependent on the transmit waveform and they can be fairly flexibly changed after performing a measurement. The low signal-to-noise ratio before pulse compression, combined with independent pseudorandom transmit waveforms, allows multiple geographically separated transmitters to be used in the same frequency band simultaneously without significantly interfering with each other. Because the same frequency band can be used by multiple transmitters, the same interferometric receiver antennas can be used to receive multiple transmitters at the same time. The principles of the signal processing are discussed, in addition to discussion of several practical ways to increase computation speed, and how to optimally detect meteor echoes. Measurements from a campaign performed with a coded continuous wave SMR are shown and compared with two standard pulsed SMR measurements. The type of meteor radar described in this paper would be suited for use in a large-scale multi-static network of meteor radar transmitters and receivers. Such a system would be useful for increasing the number of meteor detections to obtain improved meteor radar data products.
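
    The pulse-compression gain of a pseudorandom phase code, and the weak mutual interference between independent codes, both stem from the sharply peaked autocorrelation of a random binary phase sequence. A minimal sketch (the code length and lag range are arbitrary illustrative choices, not the radar's actual parameters):

```python
import random

random.seed(42)
N = 512
# Pseudorandom binary phase code (BPSK: phase 0 or pi -> +1 or -1).
code = [random.choice((-1, 1)) for _ in range(N)]

def xcorr(a, b, lag):
    """Unnormalized cross-correlation of equal-length sequences at a lag."""
    return sum(a[i] * b[i + lag] for i in range(len(a) - lag))

# Matched filtering (pulse compression): the zero-lag peak equals N,
# while off-peak sidelobes average only ~sqrt(N) in magnitude.
peak = xcorr(code, code, 0)
sidelobes = [abs(xcorr(code, code, k)) for k in range(1, 64)]
```

    An independent pseudorandom code cross-correlates at the sidelobe level as well, which is why geographically separated transmitters can share the same band.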

  12. Genetic code, hamming distance and stochastic matrices.

    PubMed

    He, Matthew X; Petoukhov, Sergei V; Ricci, Paolo E

    2004-09-01

    In this paper we use the Gray code representation of the genetic code C=00, U=10, G=11 and A=01 (C pairs with G, A pairs with U) to generate a sequence of genetic code-based matrices. In connection with these code-based matrices, we use the Hamming distance to generate a sequence of numerical matrices. We then further investigate the properties of the numerical matrices and show that they are doubly stochastic and symmetric. We determine the frequency distributions of the Hamming distances, building blocks of the matrices, decomposition and iterations of matrices. We present an explicit decomposition formula for the genetic code-based matrix in terms of permutation matrices, which provides a hypercube representation of the genetic code. It is also observed that there is a Hamiltonian cycle in a genetic code-based hypercube.
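
    Using the Gray-code assignment from the abstract, the Hamming-distance matrix for the 16 dinucleotides and its symmetric, constant-row-sum structure can be reproduced directly; a small sketch:

```python
from itertools import product

# Gray-code assignment from the abstract: C=00, U=10, G=11, A=01.
GRAY = {"C": "00", "U": "10", "G": "11", "A": "01"}

def hamming(x, y):
    """Number of positions at which two equal-length bit strings differ."""
    return sum(a != b for a, b in zip(x, y))

# The 16 dinucleotides map bijectively onto the 16 four-bit strings.
dinucs = ["".join(p) for p in product("CUGA", repeat=2)]
bits = [GRAY[d[0]] + GRAY[d[1]] for d in dinucs]

# M[i][j] = Hamming distance between the codes of dinucleotides i and j.
M = [[hamming(bi, bj) for bj in bits] for bi in bits]

row_sums = [sum(row) for row in M]
symmetric = all(M[i][j] == M[j][i] for i in range(16) for j in range(16))
```

    Every row (and, by symmetry, every column) sums to the same constant, 32, so dividing each entry by 32 yields a doubly stochastic symmetric matrix, matching the abstract's observation.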

  13. The investigation of tethered satellite system dynamics

    NASA Technical Reports Server (NTRS)

    Lorenzini, E. C.

    1986-01-01

    The analysis of the rotational dynamics of the satellite was focused on the rotational amplitude increase of the satellite, with respect to the tether, during retrieval. The dependence of the rotational amplitude upon the tether tension variation to the power 1/4 was thoroughly investigated. The damping of rotational oscillations achievable by reel control was also quantified, while an alternative solution that makes use of a lever arm attached with a universal joint to the satellite was proposed. Comparison simulations of retrieval maneuvers between the Smithsonian Astrophysical Observatory and the Martin Marietta (MMA) computer codes were also carried out. The agreement between the two completely independent codes was extremely close, demonstrating the reliability of the models. The slack tether dynamics during reel jams was analytically investigated in order to identify the limits of applicability of the SLACK3 computer code to this particular case. Test runs with SLACK3 were also carried out.

  14. Groundwater flow code verification "benchmarking" activity (COVE-2A): Analysis of participants' work

    SciTech Connect

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project.

  15. Honesty and Honor Codes.

    ERIC Educational Resources Information Center

    McCabe, Donald; Trevino, Linda Klebe

    2002-01-01

    Explores the rise in student cheating and evidence that students cheat less often at schools with an honor code. Discusses effective use of such codes and creation of a peer culture that condemns dishonesty. (EV)

  16. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-02-20

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  17. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-01-01

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  18. QR Code Mania!

    ERIC Educational Resources Information Center

    Shumack, Kellie A.; Reilly, Erin; Chamberlain, Nik

    2013-01-01

    space, has error-correction capacity, and can be read from any direction. These codes are used in manufacturing, shipping, and marketing, as well as in education. QR codes can be created to produce…

  19. DIANE multiparticle transport code

    NASA Astrophysics Data System (ADS)

    Caillaud, M.; Lemaire, S.; Ménard, S.; Rathouit, P.; Ribes, J. C.; Riz, D.

    2014-06-01

    DIANE is the general Monte Carlo code developed at CEA-DAM. DIANE is a 3D multiparticle multigroup code. DIANE includes automated biasing techniques and is optimized for massive parallel calculations.

  20. American Independence. Fifth Grade.

    ERIC Educational Resources Information Center

    Crosby, Annette

    This fifth grade teaching unit covers early conflicts between the American colonies and Britain, battles of the American Revolutionary War, and the Declaration of Independence. Knowledge goals address the pre-revolutionary acts enforced by the British, the concepts of conflict and independence, and the major events and significant people from the…

  1. EMF wire code research

    SciTech Connect

    Jones, T.

    1993-11-01

    This paper examines the results of previous wire code research to determine the relationship between wire codes, electromagnetic fields, and childhood cancer. The paper suggests that, in the original Savitz study, biases toward producing a false positive association between high wire codes and childhood cancer were created by the selection procedure.

  2. Universal Noiseless Coding Subroutines

    NASA Technical Reports Server (NTRS)

    Schlutsmeyer, A. P.; Rice, R. F.

    1986-01-01

    Software package consists of FORTRAN subroutines that perform universal noiseless coding and decoding of integer and binary data strings. The purpose of this type of coding is to achieve data compression in the sense that the coded data represents the original data perfectly (noiselessly) while taking fewer bits to do so. The routines are universal because they apply to virtually any "real-world" data source.
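
    The Rice technique that underlies these subroutines splits each non-negative integer into a unary-coded quotient and a k-bit remainder; small values get short codewords. The routines themselves are FORTRAN, so the Python below is a round-trip sketch of the coding idea, not the package's API:

```python
def rice_encode(n, k):
    """Rice code of non-negative n with parameter k >= 1:
    quotient n >> k in unary (q ones, then a zero), then k remainder bits."""
    q = n >> k
    r = n & ((1 << k) - 1)
    return "1" * q + "0" + format(r, "b").zfill(k)

def rice_decode(bits, k):
    """Inverse of rice_encode for a single codeword."""
    q = 0
    while bits[q] == "1":
        q += 1
    r = int(bits[q + 1:q + 1 + k], 2)
    return (q << k) | r

# With k = 3, 19 = 2*8 + 3 encodes as "11" + "0" + "011".
codeword = rice_encode(19, 3)
```

    A universal coder adaptively picks the k that minimizes the total code length for each block of samples, which is what makes the scheme effective across very different data sources.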

  3. Maximal dinucleotide comma-free codes.

    PubMed

    Fimmel, Elena; Strüngmann, Lutz

    2016-01-21

    The problem of retrieval and maintenance of the correct reading frame plays a significant role in RNA transcription. Circular codes, and especially comma-free codes, can help to understand the underlying mechanisms of error-detection in this process. In recent years much attention has been paid to the investigation of trinucleotide circular codes (see, for instance, Fimmel et al., 2014; Fimmel and Strüngmann, 2015a; Michel and Pirillo, 2012; Michel et al., 2012, 2008), while dinucleotide codes had been touched on only marginally, even though dinucleotides are associated to important biological functions. Recently, all maximal dinucleotide circular codes were classified (Fimmel et al., 2015; Michel and Pirillo, 2013). The present paper studies maximal dinucleotide comma-free codes and their close connection to maximal dinucleotide circular codes. We give a construction principle for such codes and provide a graphical representation that allows them to be visualized geometrically. Moreover, we compare the results for dinucleotide codes with the corresponding situation for trinucleotide maximal self-complementary C(3)-codes. Finally, the results obtained are discussed with respect to Crick's hypothesis about frame-shift-detecting codes without commas.
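
    For dinucleotides, the comma-free property reduces to a single overlap check: in the concatenation abcd of codewords ab and cd, the only frame-shifted dinucleotide straddling the junction is bc. A small sketch with illustrative codes (these example codes are chosen for clarity and are not taken from the paper's classification):

```python
def is_comma_free(code):
    """A dinucleotide code X is comma-free iff no codeword appears
    frame-shifted in the concatenation of any two codewords: for
    u = ab and v = cd in X, the straddling dinucleotide is bc."""
    return all(u[1] + v[0] not in code for u in code for v in code)

# Illustrative examples over the RNA alphabet {A, C, G, U}:
good = {"AC", "AG", "AU"}  # every overlap (CA, GA, UA) falls outside the code
bad = {"AC", "CA"}         # AC + AC = ACAC contains the shifted codeword CA
```

    A misaligned reading frame over a comma-free code can never produce a valid codeword, which is the error-detection property at stake in Crick's hypothesis.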

  4. Interval coding. II. Dendrite-dependent mechanisms.

    PubMed

    Doiron, Brent; Oswald, Anne-Marie M; Maler, Leonard

    2007-04-01

    The rich temporal structure of neural spike trains provides multiple dimensions to code dynamic stimuli. Popular examples are spike trains from sensory cells where bursts and isolated spikes can serve distinct coding roles. In contrast to analyses of neural coding, the cellular mechanics of burst mechanisms are typically elucidated from the neural response to static input. Bridging the mechanics of bursting with coding of dynamic stimuli is an important step in establishing theories of neural coding. Electrosensory lateral line lobe (ELL) pyramidal neurons respond to static inputs with a complex dendrite-dependent burst mechanism. Here we show that in response to dynamic broadband stimuli, these bursts lack some of the electrophysiological characteristics observed in response to static inputs. A simple leaky integrate-and-fire (LIF)-style model with a dendrite-dependent depolarizing afterpotential (DAP) is sufficient to match both the output statistics and coding performance of experimental spike trains. We use this model to investigate a simplification of interval coding where the burst interspike interval (ISI) codes for the scale of a canonical upstroke rather than a multidimensional stimulus feature. Using this stimulus reduction, we compute a quantization of the burst ISIs and the upstroke scale to show that the mutual information rate of the interval code is maximized at a moderate DAP amplitude. The combination of a reduced description of ELL pyramidal cell bursting and a simplification of the interval code increases the generality of ELL burst codes to other sensory modalities.
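
    The burst mechanism can be sketched with a toy LIF model in which each spike schedules a delayed depolarizing kick (the DAP), which shortens the next interspike interval. All parameter values below are illustrative, not fitted to ELL pyramidal cells:

```python
def lif_dap(current, dt=0.1, tau=10.0, v_th=1.0, dap_amp=0.5, dap_delay=10):
    """Minimal leaky integrate-and-fire cell with a depolarizing
    afterpotential: each spike schedules a depolarizing kick dap_delay
    steps later, which can drive the short-interval second spike of a
    burst. Returns the spike times (in steps)."""
    v, spikes = 0.0, []
    kick = [0.0] * (len(current) + dap_delay + 1)
    for t, i_in in enumerate(current):
        v += dt * (-v + i_in) / tau + kick[t]
        if v >= v_th:
            spikes.append(t)
            v = 0.0                         # reset after the spike
            kick[t + dap_delay] += dap_amp  # the DAP arrives after a delay
    return spikes

# Constant drive: with the DAP the cell fires markedly faster, because
# each afterpotential pre-charges the next spike.
with_dap = lif_dap([1.5] * 2000)
no_dap = lif_dap([1.5] * 2000, dap_amp=0.0)
```

    In the interval-coding picture, it is the size of such a depolarizing event, set by the stimulus upstroke, that the burst ISI reports.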

  5. Mapping Local Codes to Read Codes.

    PubMed

    Bonney, Wilfred; Galloway, James; Hall, Christopher; Ghattas, Mikhail; Tramma, Leandro; Nind, Thomas; Donnelly, Louise; Jefferson, Emily; Doney, Alexander

    2017-01-01

    Background & Objectives: Legacy laboratory test codes make it difficult to use clinical datasets for meaningful translational research, where populations are followed for disease risk and outcomes over many years. The Health Informatics Centre (HIC) at the University of Dundee hosts continuous biochemistry data from the clinical laboratories in Tayside and Fife dating back as far as 1987. However, the HIC-managed biochemistry dataset is coupled with incoherent sample types and unstandardised legacy local test codes, which complicates its use for population health outcomes research. The objective of this study was to map the legacy local test codes to the Scottish 5-byte Version 2 Read Codes using biochemistry data extracted from the repository of the Scottish Care Information (SCI) Store.

  6. Reviewing the Challenges and Opportunities Presented by Code Switching and Mixing in Bangla

    ERIC Educational Resources Information Center

    Hasan, Md. Kamrul; Akhand, Mohd. Moniruzzaman

    2015-01-01

    This paper investigates the issues related to code-switching/code-mixing in an ESL context. Some preliminary data on Bangla-English code-switching/code-mixing has been analyzed in order to determine which structural pattern of code-switching/code-mixing is predominant in different social strata. This study also explores the relationship of…

  7. Reviewing the Challenges and Opportunities Presented by Code Switching and Mixing in Bangla

    ERIC Educational Resources Information Center

    Hasan, Md. Kamrul; Akhand, Mohd. Moniruzzaman

    2014-01-01

    This paper investigates the issues related to code-switching/code-mixing in an ESL context. Some preliminary data on Bangla-English code-switching/code-mixing has been analyzed in order to determine which structural pattern of code-switching/code-mixing is predominant in different social strata. This study also explores the relationship of…

  8. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  9. VizieR Online Data Catalog: Doppler tomography code doptomog (Kotze+, 2016)

    NASA Astrophysics Data System (ADS)

    Kotze, E. J.; Potter, S. B.; McBride, V. A.

    2016-10-01

    doptomog is based on the fast maximum entropy Doppler tomography code presented by Spruit (1998, astro-ph/9806141). The doptomog code can construct tomograms independently in the standard and the inside-out velocity coordinate frames. (2 data files).

  10. XSOR codes users manual

    SciTech Connect

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named "XSOR". The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore phenomena and their uncertainties which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  11. DLLExternalCode

    SciTech Connect

    Greg Flach, Frank Smith

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as the top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
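The linking pattern described above (inputs in, input file written, external code run, outputs read back) can be sketched as follows. The file names, formats, and the stand-in "application" that doubles each value are hypothetical placeholders, not DLLExternalCode's actual conventions:

```python
import os
import subprocess
import tempfile

def run_external_code(inputs, exe=None):
    """Sketch of the wrapper pattern: write an input file, run the external
    application, and read its output file back for the caller."""
    with tempfile.TemporaryDirectory() as workdir:
        in_path = os.path.join(workdir, "model.in")
        out_path = os.path.join(workdir, "model.out")
        # 1. Create the input file from the list of inputs.
        with open(in_path, "w") as f:
            f.writelines(f"{v}\n" for v in inputs)
        # 2. Run the external code. When no executable is given, a stand-in
        #    that doubles each value is used, purely for illustration.
        if exe is None:
            with open(in_path) as f:
                results = [2.0 * float(line) for line in f]
            with open(out_path, "w") as f:
                f.writelines(f"{v}\n" for v in results)
        else:
            subprocess.run([exe, in_path, out_path], check=True)
        # 3. Read the outputs back for return to the caller.
        with open(out_path) as f:
            return [float(line) for line in f]

print(run_external_code([1.0, 2.5]))  # -> [2.0, 5.0]
```

The real DLL additionally reads an instructions file that tells it how to format the input and parse the output; the fixed formats above stand in for that step.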

  12. Defeating the coding monsters.

    PubMed

    Colt, Ross

    2007-02-01

    Accuracy in coding is rapidly becoming a required skill for military health care providers. Clinic staffing, equipment purchase decisions, and even reimbursement will soon be based on the coding data that we provide. Learning the myriad of complicated rules required to code accurately can seem overwhelming. However, the majority of clinic visits in a typical outpatient clinic generally fall into two major evaluation and management codes, 99213 and 99214. If health care providers can learn the rules required to code a 99214 visit, then this will provide a 90% solution that enables them to accurately code the majority of their clinic visits. This article demonstrates a step-by-step method to code a 99214 visit, by viewing each of the three requirements as a monster to be defeated.
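The "three requirements" framing can be illustrated schematically. A 99214 established-patient visit is commonly described as needing two of three components (history, exam, medical decision making) at sufficient levels under the 1995/1997 E/M guidelines; the level names and thresholds below are simplified assumptions for illustration only, not billing guidance:

```python
# Hypothetical, simplified sketch: a 99214 established-patient visit is
# commonly described as requiring 2 of these 3 components at these levels.
LEVELS = {"history": "detailed", "exam": "detailed", "mdm": "moderate"}

# Ordered level scales, lowest to highest (simplified assumption).
RANKS = {
    "history": ["problem focused", "expanded problem focused", "detailed", "comprehensive"],
    "exam": ["problem focused", "expanded problem focused", "detailed", "comprehensive"],
    "mdm": ["straightforward", "low", "moderate", "high"],
}

def qualifies_99214(documented):
    """documented maps component -> documented level; 2 of 3 must meet the bar."""
    met = sum(
        RANKS[c].index(documented[c]) >= RANKS[c].index(LEVELS[c])
        for c in LEVELS
    )
    return met >= 2

# History and MDM meet the bar, so the visit qualifies despite a lighter exam.
print(qualifies_99214({"history": "detailed",
                       "exam": "expanded problem focused",
                       "mdm": "moderate"}))  # -> True
```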

  13. Progress in The Semantic Analysis of Scientific Code

    NASA Technical Reports Server (NTRS)

    Stewart, Mark

    2000-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.
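One concrete form the semantic declarations above can take is a dimensional annotation on primitive variables, which a parser can then use to check formulae for consistency. The annotation scheme below is a hypothetical illustration of the idea, not the paper's actual notation:

```python
from collections import Counter

# Semantic declarations: dimensions as exponent maps over base units (m, s).
DECLS = {
    "distance": {"m": 1},
    "time": {"s": 1},
    "velocity": {"m": 1, "s": -1},
}

def dim_mul(a, b):
    """Multiply two dimensions by adding exponents, dropping zeros."""
    c = Counter(a)
    c.update(b)
    return {k: v for k, v in c.items() if v}

def dim_div(a, b):
    """Divide two dimensions by subtracting exponents."""
    return dim_mul(a, {k: -v for k, v in b.items()})

# Check the assignment velocity = distance / time dimensionally.
rhs = dim_div(DECLS["distance"], DECLS["time"])
print(rhs == DECLS["velocity"])  # -> True: dimensionally consistent
```

A semantic parser built on this idea could flag an assignment such as `velocity = distance * time` as a likely program error, since the dimensions of the two sides disagree.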

  14. A draft model aggregated code of ethics for bioethicists.

    PubMed

    Baker, Robert

    2005-01-01

    Bioethicists function in an environment in which their peers--healthcare executives, lawyers, nurses, physicians--assert the integrity of their fields through codes of professional ethics. Is it time for bioethics to assert its integrity by developing a code of ethics? Answering in the affirmative, this paper lays out a case by reviewing the historical nature and function of professional codes of ethics. Arguing that professional codes are aggregative enterprises growing in response to a field's historical experiences, it asserts that bioethics now needs to assert its integrity and independence and has already developed a body of formal statements that could be aggregated to create a comprehensive code of ethics for bioethics. A Draft Model Aggregated Code of Ethics for Bioethicists is offered in the hope that analysis and criticism of this draft code will promote further discussion of the nature and content of a code of ethics for bioethicists.

  15. Source Code Plagiarism--A Student Perspective

    ERIC Educational Resources Information Center

    Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.

    2011-01-01

    This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…

  16. Applications of Coding in Network Communications

    ERIC Educational Resources Information Center

    Chang, Christopher SungWook

    2012-01-01

    This thesis uses the tool of network coding to investigate fast peer-to-peer file distribution, anonymous communication, robust network construction under uncertainty, and prioritized transmission. In a peer-to-peer file distribution system, we use a linear optimization approach to show that the network coding framework significantly simplifies…

  17. Media independent interface

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The work done on the Media Independent Interface (MII) Interface Control Document (ICD) program is described, and recommendations based on it are made. Explanations and rationale for the content of the ICD itself are presented.

  18. Astronomy education and the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Nemiroff, Robert J.

    2016-01-01

    The Astrophysics Source Code Library (ASCL) is an online registry of source codes used in refereed astrophysics research. It currently lists nearly 1,200 codes and covers all aspects of computational astrophysics. How can this resource be of use to educators and to the graduate students they mentor? The ASCL serves as a discovery tool for codes that can be used for one's own research. Graduate students can also investigate existing codes to see how common astronomical problems are approached numerically in practice, and use these codes as benchmarks for their own solutions to these problems. Further, they can deepen their knowledge of software practices and techniques through examination of others' codes.

  19. On the role of code comparisons in verification and validation.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2003-08-01

    This report presents a perspective on the role of code comparison activities in verification and validation. We formally define the act of code comparison as the Code Comparison Principle (CCP) and investigate its application in both verification and validation. One of our primary conclusions is that the use of code comparisons for validation is improper and dangerous. We also conclude that while code comparisons may be argued to provide a beneficial component in code verification activities, there are higher quality code verification tasks that should take precedence. Finally, we provide a process for application of the CCP that we believe is minimal for achieving benefit in verification processes.

  20. Steroid-independent male sexual behavior in B6D2F2 male mice.

    PubMed

    McInnis, Christine M; Venu, Samitha; Park, Jin Ho

    2016-09-01

    It is well established that male sexual behavior (MSB) is regulated by gonadal steroids; however, individual differences in MSB, independent of gonadal steroids, are prevalent across a wide range of species, and further investigation is necessary to advance our understanding of steroid-independent MSB. Studies utilizing B6D2F1 hybrid male mice in which a significant proportion retain MSB after long-term orchidectomy, identified as steroid-independent-maters (SI-maters), have begun to unravel the genetic underpinnings of steroid-independent MSB. A recent study demonstrated that steroid-independent MSB is a heritable behavioral phenotype that is mainly passed down from B6D2F1 hybrid SI-maters when crossed with C57BL6J female mice. To begin to uncover whether the strain of the dam plays a role in the inheritance of steroid-independent MSB, B6D2F1 hybrid females were crossed with B6D2F1 hybrid males. While the present study confirms the finding that steroid-independent MSB is a heritable behavioral phenotype and that SI-mater sires are more likely to pass down some components of MSB than SI-non-maters to their offspring, it also reveals that the B6D2F2 male offspring that were identified as SI-maters that displayed the full repertoire of steroid-independent MSB had the same probability of being sired from either a B6D2F1 SI-mater or SI-non-mater. These results, in conjunction with previous findings, indicate that the specific chromosomal loci pattern that codes for steroid-independent MSB in the B6D2F2 male offspring may result regardless of whether the father was a SI-mater or SI-non-mater, and that the maternal strain may be an important factor in the inheritance of steroid-independent MSB.

  1. Description of ground motion data processing codes: Volume 3

    SciTech Connect

    Sanders, M.L.

    1988-02-01

    Data processing codes developed to process ground motion at the Nevada Test Site for the Weapons Test Seismic Investigations Project are used today as part of the program to process ground motion records for the Nevada Nuclear Waste Storage Investigations Project. The work contained in this report documents and lists the codes and verifies the "PSRV" code. 39 figs.

  2. On the parallelization of molecular dynamics codes

    NASA Astrophysics Data System (ADS)

    Trabado, G. P.; Plata, O.; Zapata, E. L.

    2002-08-01

    Molecular dynamics (MD) codes present a high degree of spatial data locality and a significant amount of independent computations. However, most of the parallelization strategies are usually based on the manual transformation of sequential programs either by completely rewriting the code with message passing routines or using specific libraries intended for writing new MD programs. In this paper we propose a new library-based approach (DDLY) which supports parallelization of existing short-range MD sequential codes. The novelty of this approach is that it can directly handle the distribution of common data structures used in MD codes to represent data (arrays, Verlet lists, link cells), using domain decomposition. Thus, the insertion of run-time support for distribution and communication in a MD program does not imply significant changes to its structure. The method is simple, efficient and portable. It may be also used to extend existing parallel programming languages, such as HPF.
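The link-cell structure mentioned above, one of the data structures the library distributes, can be illustrated in miniature. This 2D binning sketch assumes a square periodic box and is an illustration of the general technique, not code drawn from DDLY itself:

```python
# Link cells in short-range MD: bin particles into cells of side >= cutoff,
# so force loops only need to visit a cell and its immediate neighbors.
def build_cells(positions, box, cutoff):
    """Map cell index (i, j) -> list of particle indices (2D, periodic box)."""
    n = max(1, int(box / cutoff))   # cells per side
    size = box / n                  # actual cell edge length
    cells = {}
    for idx, (x, y) in enumerate(positions):
        key = (int(x / size) % n, int(y / size) % n)  # modulo wraps the box
        cells.setdefault(key, []).append(idx)
    return cells, n

positions = [(0.5, 0.5), (0.6, 0.4), (9.5, 9.5)]
cells, n = build_cells(positions, box=10.0, cutoff=2.5)
print(cells)  # particles 0 and 1 share a cell; particle 2 is elsewhere
```

Domain decomposition then assigns blocks of cells to processors, so interprocessor communication is limited to the one-cell-thick boundary layers.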

  3. In Vivo Imaging Reveals Composite Coding for Diagonal Motion in the Drosophila Visual System

    PubMed Central

    Zhou, Wei; Chang, Jin

    2016-01-01

    Understanding information coding is important for resolving the functions of visual neural circuits. The motion vision system is a classic model for studying information coding as it contains a concise and complete information-processing circuit. In Drosophila, the axon terminals of motion-detection neurons (T4 and T5) project to the lobula plate, which comprises four regions that respond to the four cardinal directions of motion. The lobula plate thus represents a topographic map on a transverse plane. This enables us to study the coding of diagonal motion by investigating its response pattern. By using in vivo two-photon calcium imaging, we found that the axon terminals of T4 and T5 cells in the lobula plate were activated during diagonal motion. Further experiments showed that, unlike the responses to the cardinal directions of motion, the response to diagonal motion is distributed over two regions: a diagonal-motion-selective response region and a non-selective response region, which overlap with the response regions of the two vector-correlated cardinal directions of motion. Interestingly, the sizes of the non-selective response regions are linearly correlated with the angle of the diagonal motion. These results revealed that the Drosophila visual system employs a composite coding for diagonal motion that includes both independent coding and vector decomposition coding. PMID:27695103

  4. Mechanical code comparator

    DOEpatents

    Peter, Frank J.; Dalton, Larry J.; Plummer, David W.

    2002-01-01

    A new class of mechanical code comparators is described which have broad potential for application in safety, surety, and security applications. These devices can be implemented as micro-scale electromechanical systems that isolate a secure or otherwise controlled device until an access code is entered. This access code is converted into a series of mechanical inputs to the mechanical code comparator, which compares the access code to a pre-input combination, entered previously into the mechanical code comparator by an operator at the system security control point. These devices provide extremely high levels of robust security. Being totally mechanical in operation, an access control system properly based on such devices cannot be circumvented by software attack alone.

  5. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  6. cncRNAs: Bi-functional RNAs with protein coding and non-coding functions.

    PubMed

    Kumari, Pooja; Sampath, Karuna

    2015-12-01

    For many decades, the major function of mRNA was thought to be to provide the protein-coding information embedded in the genome. The advent of high-throughput sequencing has led to the discovery of pervasive transcription of eukaryotic genomes and opened the world of RNA-mediated gene regulation. Many regulatory RNAs have been found to be incapable of protein coding and are hence termed non-coding RNAs (ncRNAs). However, studies in recent years have shown that several previously annotated non-coding RNAs have the potential to encode proteins, and conversely, some coding RNAs have regulatory functions independent of the protein they encode. Such bi-functional RNAs, with both protein-coding and non-coding functions, which we term 'cncRNAs', have emerged as new players in cellular systems. Here, we describe the functions of some cncRNAs identified from bacteria to humans. Because the functions of many RNAs across genomes remain unclear, we propose that RNAs be classified as coding, non-coding, or both only after careful analysis of their functions.

  7. Industrial Computer Codes

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1996-01-01

    This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), the KTK (knife-to-knife) labyrinth seal code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia, but maintains the narrow groove theory. The KTK labyrinth seal code handles straight or stepped seals, and DYSEAL provides dynamics for the seal geometry.

  8. Embedded wavelet video coding with error concealment

    NASA Astrophysics Data System (ADS)

    Chang, Pao-Chi; Chen, Hsiao-Ching; Lu, Ta-Te

    2000-04-01

    We present an error-concealed embedded wavelet (ECEW) video coding system for transmission over the Internet or wireless networks. This system consists of two types of frames: intra (I) frames and inter, or predicted (P), frames. Inter frames are constructed from the residual frames formed by variable block-size multiresolution motion estimation (MRME). Motion vectors are compressed by arithmetic coding. The image data of intra frames and residual frames are coded by error-resilient embedded zerotree wavelet (ER-EZW) coding. The ER-EZW coding partitions the wavelet coefficients into several groups, and each group is coded independently. Therefore, the error propagation resulting from an error is confined to a single group. In EZW coding, any single error may result in a totally undecodable bitstream. To further reduce the error damage, we use error concealment at the decoding end. In intra frames, erroneous wavelet coefficients are replaced by neighbors. In inter frames, erroneous blocks of wavelet coefficients are replaced by data from the previous frame. Simulations show that the performance of ECEW is superior to ECEW without error concealment by approximately 7 to 8 dB at an error rate of 10^-3 in intra frames. The improvement is still approximately 2 to 3 dB at the higher error rate of 10^-2 in inter frames.
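The group-partitioning idea behind ER-EZW can be shown generically: if coefficients are split into independently decodable groups, a corrupted bitstream loses only its own group, and concealment fills the gap. The trivial text "codec" below is a hypothetical stand-in for the real zerotree coder, kept only to make the error-confinement property visible:

```python
# Toy illustration of error resilience via independent groups: each group of
# coefficients is coded on its own, so one corrupted group leaves the others
# intact. The comma-separated "bitstream" stands in for the real EZW coder.
def encode_groups(coeffs, n_groups):
    """Interleave coefficients into n_groups independently coded streams."""
    groups = [coeffs[i::n_groups] for i in range(n_groups)]
    return [",".join(str(c) for c in g) for g in groups]

def decode_groups(bitstreams, n_groups, length):
    """Decode each group independently; conceal corrupted groups with zeros."""
    out = [0] * length
    for i, stream in enumerate(bitstreams):
        try:
            values = [int(tok) for tok in stream.split(",")]
        except ValueError:
            continue  # corrupted group: conceal it, others are unaffected
        for j, v in enumerate(values):
            out[i + j * n_groups] = v
    return out

coeffs = [5, -3, 7, 2, 0, 1]
streams = encode_groups(coeffs, n_groups=2)
streams[1] = "##garbage##"          # corrupt one group's bitstream
print(decode_groups(streams, 2, len(coeffs)))  # -> [5, 0, 7, 0, 0, 0]
```

In the actual system the concealment step is smarter than zeroing: intra frames borrow neighboring coefficients and inter frames borrow from the previous frame.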

  9. Independent component analysis in spiking neurons.

    PubMed

    Savin, Cristina; Joshi, Prashant; Triesch, Jochen

    2010-04-22

    Although models based on independent component analysis (ICA) have been successful in explaining various properties of sensory coding in the cortex, it remains unclear how networks of spiking neurons using realistic plasticity rules can realize such computation. Here, we propose a biologically plausible mechanism for ICA-like learning with spiking neurons. Our model combines spike-timing dependent plasticity and synaptic scaling with an intrinsic plasticity rule that regulates neuronal excitability to maximize information transmission. We show that a stochastically spiking neuron learns one independent component for inputs encoded either as rates or using spike-spike correlations. Furthermore, different independent components can be recovered, when the activity of different neurons is decorrelated by adaptive lateral inhibition.
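For readers unfamiliar with ICA itself, a compact batch sketch is shown below: whitening followed by FastICA-style fixed-point iteration with a tanh nonlinearity and symmetric re-orthogonalization. This is generic ICA on mixed signals, not the paper's spiking-network learning rule; the sources and mixing matrix are arbitrary choices for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
sources = np.vstack([np.sign(np.sin(3 * t)),          # square wave
                     rng.uniform(-1, 1, t.size)])     # uniform noise
mixed = np.array([[2.0, 1.0], [1.0, 1.5]]) @ sources  # linear mixture

# Whiten the mixtures (zero mean, identity covariance).
x = mixed - mixed.mean(axis=1, keepdims=True)
cov = x @ x.T / x.shape[1]
d, e = np.linalg.eigh(cov)
white = e @ np.diag(d ** -0.5) @ e.T @ x

# FastICA fixed-point updates with symmetric decorrelation.
w = rng.standard_normal((2, 2))
for _ in range(200):
    g = np.tanh(w @ white)
    w_new = (g @ white.T) / white.shape[1] - np.diag((1 - g**2).mean(axis=1)) @ w
    u, _, vt = np.linalg.svd(w_new)
    w = u @ vt  # re-orthogonalize the unmixing matrix
recovered = w @ white

# Each recovered component should correlate strongly with one true source.
corr = np.abs(np.corrcoef(np.vstack([sources, recovered]))[:2, 2:])
print(corr.max(axis=1))
```

The model in the paper arrives at a comparable unmixing through local plasticity rules in spiking neurons rather than this global batch computation.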

  10. Topological subsystem codes

    SciTech Connect

    Bombin, H.

    2010-03-15

    We introduce a family of two-dimensional (2D) topological subsystem quantum error-correcting codes. The gauge group is generated by two-local Pauli operators, so that two-local measurements are enough to recover the error syndrome. We study the computational power of code deformation in these codes and show that boundaries cannot be introduced in the usual way. In addition, we give a general mapping connecting suitable classical statistical mechanical models to optimal error correction in subsystem stabilizer codes that suffer from depolarizing noise.

  11. FAA Smoke Transport Code

    SciTech Connect

    Domino, Stefan; Luketa-Hanlin, Anay; Gallegos, Carlos

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  12. Transonic airfoil codes

    NASA Technical Reports Server (NTRS)

    Garabedian, P. R.

    1979-01-01

    Computer codes for the design and analysis of transonic airfoils are considered. The design code relies on the method of complex characteristics in the hodograph plane to construct shockless airfoils. The analysis code uses artificial viscosity to calculate flows with weak shock waves at off-design conditions. Comparisons with experiments show that an excellent simulation of two-dimensional wind tunnel tests is obtained. The codes have been widely adopted by the aircraft industry as a tool for the development of supercritical wing technology.

  13. Investigation of the Performance of Various CVD Diamond Crystal Qualities for the Measurement of Radiation Doses from a Low Energy Mammography X-Ray Beam, Compared with MC Code (PENELOPE) Calculations

    NASA Astrophysics Data System (ADS)

    Zakari, Y. I.; Mavunda, R. D.; Nam, T. L.; Keddy, R. J.

    The tissue equivalence of diamond allows for accurate radiation dose determination without large corrections for different attenuation values in biological tissue, but its low Z value limits this advantage to lower-energy photons, such as those in mammography X-ray beams. This paper assays the performance of nine chemical vapour deposition (CVD) diamonds for use as radiation-sensing material. The specimens, fabricated in wafer form, are classified as detector grade, optical grade, and single crystal. It is well known that the presence of defects in diamonds, including CVD specimens, not only dictates but also affects the response of diamond to radiation in different ways. In this investigation, tools such as electron spin resonance (ESR), thermoluminescence (TL), Raman spectroscopy, and ultraviolet (UV) spectroscopy were used to probe each of the samples. The linearity, sensitivity, and other photon-response characteristics of the detectors were analysed from the I-V characteristics. The diamonds, four each of the so-called detector and optical grades plus a single-crystal CVD specimen, were exposed to a low X-ray peak voltage range (22 to 27 kVp) with trans-crystal polarizing fields of 0.4 kV/cm, 0.66 kV/cm, and 0.8 kV/cm. The presentation discusses the defects identifiable by the techniques used and correlates the radiation performance of the three types of crystals with their presence. The choice of each wafer as either a spectrometer or an X-ray dosimeter within the selected energy range was made. The analyses were validated with the Monte Carlo code PENELOPE.

  14. Independent technical review, handbook

    SciTech Connect

    Not Available

    1994-02-01

    Purpose: To provide an independent engineering review of the major projects being funded by the Department of Energy, Office of Environmental Restoration and Waste Management. The independent engineering review will address whether the engineering practice is sufficiently developed for a major project to be executed without significant technical problems. The independent review will focus on questions related to: (1) adequacy of development of the technical base of understanding; (2) status of development and availability of technology among the various alternatives; (3) status and availability of the industrial infrastructure to support project design, equipment fabrication, facility construction, and process and program/project operation; (4) adequacy of the design effort to provide a sound foundation to support execution of the project; (5) ability of the organization to fully integrate the system and to direct, manage, and control the execution of a complex major project.

  15. Homebirth and independent midwifery.

    PubMed

    Harris, G

    2000-07-01

    Why do women choose to give birth at home, and midwives to work independently, in a culture that does little to support these options? This article looks at the reasons childbearing women and midwives make these choices and the barriers to achieving them. The safety of the homebirth option is supported by reference to analyses of mortality and morbidity. Homebirth practices and levels of success are compared in Australia and New Zealand (NZ), in particular, and in The Netherlands, England and America. The popularity and success of homebirths are analysed in terms of socio-economic status. The current situation and challenges of independent midwifery in Darwin are described.

  16. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    PubMed Central

    Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree brings extremely high computational complexity. This paper presents innovative work on improving the coding tree to further reduce encoding time. A novel low-complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. First, the paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP), and content change (CC). Second, a CU coding tree probability model is proposed for modeling and predicting CU distribution. Finally, a CU coding tree probability update is proposed, aiming to address probabilistic-model distortion problems caused by CC. Experimental results show that the proposed low-complexity CU coding tree mechanism significantly reduces encoding time, by 27% for lossy coding and 42% for visually lossless and lossless coding. The proposed mechanism improves coding performance under various application conditions. PMID:26999741
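The probability-model idea can be illustrated with a hypothetical sketch: track, per tree depth, how often splitting a CU actually wins the rate-distortion comparison, and skip the costly split search when that probability falls below a threshold. The update rule and constants below are illustrative assumptions, not the paper's actual model:

```python
# Hypothetical sketch of probability-driven early termination for CU splitting.
class SplitModel:
    def __init__(self, threshold=0.1, decay=0.9):
        self.threshold = threshold
        self.decay = decay
        self.p_split = {}  # depth -> estimated probability that splitting wins

    def should_try_split(self, depth):
        """Skip the expensive split search when the estimate is low."""
        return self.p_split.get(depth, 1.0) >= self.threshold

    def update(self, depth, split_won):
        # Exponential update keeps the model adaptive to content change:
        # old observations fade, so a scene change re-enables split searches.
        p = self.p_split.get(depth, 1.0)
        self.p_split[depth] = self.decay * p + (1 - self.decay) * float(split_won)

model = SplitModel()
for _ in range(40):                 # flat content: splitting never wins
    model.update(depth=2, split_won=False)
print(model.should_try_split(2))    # -> False: split search is now skipped
```

The exponential decay plays the role of the paper's probability update: it limits the distortion that accumulated statistics would otherwise suffer when the content changes.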

  17. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding.

    PubMed

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree brings extremely high computational complexity. This paper presents innovative work on improving the coding tree to further reduce encoding time. A novel low-complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. First, the paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP), and content change (CC). Second, a CU coding tree probability model is proposed for modeling and predicting CU distribution. Finally, a CU coding tree probability update is proposed, aiming to address probabilistic-model distortion problems caused by CC. Experimental results show that the proposed low-complexity CU coding tree mechanism significantly reduces encoding time, by 27% for lossy coding and 42% for visually lossless and lossless coding. The proposed mechanism improves coding performance under various application conditions.

  18. Radio Losses for Concatenated Codes

    NASA Astrophysics Data System (ADS)

    Shambayati, S.

    2002-07-01

    The advent of higher powered spacecraft amplifiers and better ground receivers capable of tracking spacecraft carrier signals with narrower loop bandwidths requires better understanding of the carrier tracking loss (radio loss) mechanism of the concatenated codes used for deep-space missions. In this article, we present results of simulations performed for a (7,1/2), Reed-Solomon (255,223), interleaver depth-5 concatenated code in order to shed some light on this issue. Through these simulations, we obtained the performance of this code over an additive white Gaussian noise (AWGN) channel (the baseline performance) in terms of both its frame-error rate (FER) and its bit-error rate at the output of the Reed-Solomon decoder (RS-BER). After obtaining these results, we curve fitted the baseline performance curves for FER and RS-BER and calculated the high-rate radio losses for this code for an FER of 10^(-4) and its corresponding baseline RS-BER of 2.1 x 10^(-6) for a carrier loop signal-to-noise ratio (SNR) of 14.8 dB. This calculation revealed that even though over the AWGN channel the FER value and the RS-BER value correspond to each other (i.e., these values are obtained by the same bit SNR value), the RS-BER value has higher high-rate losses than does the FER value. Furthermore, this calculation contradicted the previous assumption that, at high data rates, concatenated codes have the same radio losses as their constituent convolutional codes. Our results showed much higher losses for the FER and the RS-BER (by as much as 2 dB) than for the corresponding baseline BER of the convolutional code. Further simulations were performed to investigate the effects of changes in the data rate on the code's radio losses. It was observed that as the data rate increased the radio losses for both the FER and the RS-BER approached their respective calculated high-rate values.
Furthermore, these simulations showed that a simple two-parameter function could model the increase in the

  19. Dress Codes for Teachers?

    ERIC Educational Resources Information Center

    Million, June

    2004-01-01

    In this article, the author discusses an e-mail survey of principals from across the country regarding whether or not their school had a formal staff dress code. The results indicate that most did not have a formal dress code, but agreed that professional dress for teachers was not only necessary, but showed respect for the school and had a…

  20. Lichenase and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2000-08-15

    The present invention provides a fungal lichenase, i.e., an endo-1,3-1,4-β-D-glucanohydrolase, its coding sequence, recombinant DNA molecules comprising the lichenase coding sequences, recombinant host cells and methods for producing same. The present lichenase is from Orpinomyces PC-2.

  1. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts. 3. Development of a code generator for performance prediction. 4. Automated partitioning. 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  2. Coding Acoustic Metasurfaces.

    PubMed

    Xie, Boyang; Tang, Kun; Cheng, Hua; Liu, Zhengyou; Chen, Shuqi; Tian, Jianguo

    2017-02-01

    Coding acoustic metasurfaces can combine simple logical bits to achieve sophisticated wave-control functions. The acoustic logical bits provide a phase difference of exactly π and a perfect amplitude match for the transmitted waves. By programming the coding sequences, acoustic metasurfaces with various functions, including the creation of unusual antenna patterns and wave focusing, have been demonstrated.

  3. Computerized mega code recording.

    PubMed

    Burt, T W; Bock, H C

    1988-04-01

    A system has been developed to facilitate recording of advanced cardiac life support mega code testing scenarios. By scanning a paper "keyboard" using a bar code wand attached to a portable microcomputer, the person assigned to record the scenario can easily generate an accurate, complete, timed, and typewritten record of the given situations and the obtained responses.

  4. Pseudonoise code tracking loop

    NASA Technical Reports Server (NTRS)

    Laflame, D. T. (Inventor)

    1980-01-01

    A delay-locked loop is presented for tracking a pseudonoise (PN) reference code in an incoming communication signal. The loop is less sensitive to gain imbalances, which can otherwise introduce timing errors in the PN reference code formed by the loop.
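    A delay-locked loop steers its local code replica by comparing early and late correlations of the incoming signal against the reference code. A minimal sketch of that discriminator, using a random ±1 chip sequence as a stand-in for a real PN code (all names and parameter values here are illustrative, not taken from the patent):

```python
import numpy as np

rng = np.random.default_rng(0)
pn = rng.choice([-1.0, 1.0], size=127)   # stand-in for a real PN chip sequence

def circ_corr(received, reference, shift):
    """Normalized circular correlation of the received chips against a
    shifted copy of the local reference code."""
    return float(np.dot(received, np.roll(reference, shift))) / len(reference)

def el_discriminator(received, reference, guess):
    """Early-late discriminator: its sign tells the loop which way to slew
    the local replica; it crosses zero near code alignment."""
    early = circ_corr(received, reference, guess - 1)
    late = circ_corr(received, reference, guess + 1)
    return early - late

true_delay = 5
received = np.roll(pn, true_delay)   # incoming signal: a delayed replica
```

    Searching circ_corr over all shifts locates the correlation peak at the true delay; once acquired, the discriminator sign keeps the loop centered without a full search.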

  5. Utilizing GPUs to Accelerate Turbomachinery CFD Codes

    NASA Technical Reports Server (NTRS)

    MacCalla, Weylin; Kulkarni, Sameer

    2016-01-01

    GPU computing has established itself as a way to accelerate parallel codes in the high performance computing world. This work focuses on speeding up APNASA, a legacy CFD code used at NASA Glenn Research Center, while also drawing conclusions about the nature of GPU computing and the requirements to make GPGPU worthwhile on legacy codes. Rewriting and restructuring of the source code was avoided to limit the introduction of new bugs. The code was profiled and investigated for parallelization potential, then OpenACC directives were used to indicate parallel parts of the code. The use of OpenACC directives was not able to reduce the runtime of APNASA on either the NVIDIA Tesla discrete graphics card or the AMD accelerated processing unit. Additionally, it was found that in order to justify the use of GPGPU, the amount of parallel work being done within a kernel would have to greatly exceed the work being done by any one portion of the APNASA code. It was determined that in order for an application like APNASA to be accelerated on the GPU, it should not be modular in nature, and the parallel portions of the code must contain a large portion of the code's computation time.

  6. Distributed joint source-channel coding in wireless sensor networks.

    PubMed

    Zhu, Xuqi; Liu, Yu; Zhang, Lin

    2009-01-01

    Because sensors are energy-limited and wireless channel conditions in wireless sensor networks are harsh, there is an urgent need for a low-complexity coding method with a high compression ratio and noise-resistant features. This paper reviews the progress made in distributed joint source-channel coding, which can address this issue. The main existing deployments, from theory to practice, of distributed joint source-channel coding over independent channels, multiple access channels, and broadcast channels are introduced. To this end, we also present a practical scheme for compressing multiple correlated sources over independent channels. The simulation results demonstrate the desired efficiency.

  7. Postcard from Independence, Mo.

    ERIC Educational Resources Information Center

    Archer, Jeff

    2004-01-01

    This article reports results showing that the Independence, Missouri school district failed to meet almost every one of its improvement goals under the No Child Left Behind Act. The state accreditation system stresses improvement over past scores, while the federal law demands specified amounts of annual progress toward the ultimate goal of 100…

  8. Native American Independent Living.

    ERIC Educational Resources Information Center

    Clay, Julie Anna

    1992-01-01

    Examines features of independent living philosophy with regard to compatibility with Native American cultures, including definition or conceptualization of disability; self-advocacy; systems advocacy; peer counseling; and consumer control and involvement. Discusses an actualizing process as one method of resolving cultural conflicts and…

  9. Touchstones of Independence.

    ERIC Educational Resources Information Center

    Roha, Thomas Arden

    1999-01-01

    Foundations affiliated with public higher education institutions can avoid having to open records for public scrutiny by having independent boards of directors, occupying leased office space or paying market value for university space, using only foundation personnel, retaining legal counsel, being forthcoming with information and use of public…

  10. Independent Video in Britain.

    ERIC Educational Resources Information Center

    Stewart, David

    Maintaining the status quo, as well as the attitude toward cultural funding and development that it imposes on video, is detrimental to the formation of a thriving video network, and also out of key with the present social and political situation in Britain. Independent video has some quite specific advantages as a medium for cultural production…

  11. Caring about Independent Lives

    ERIC Educational Resources Information Center

    Christensen, Karen

    2010-01-01

    With the rhetoric of independence, new cash for care systems were introduced in many developed welfare states at the end of the 20th century. These systems allow local authorities to pay people who are eligible for community care services directly, to enable them to employ their own careworkers. Despite the obvious importance of the careworker's…

  12. Independence and Survival.

    ERIC Educational Resources Information Center

    James, H. Thomas

    Independent schools that are of viable size, well managed, and strategically located to meet competition will survive and prosper past the current financial crisis. We live in a complex technological society with insatiable demands for knowledgeable people to keep it running. The future will be marked by the orderly selection of qualified people,…

  13. Evolving genetic code

    PubMed Central

    OHAMA, Takeshi; INAGAKI, Yuji; BESSHO, Yoshitaka; OSAWA, Syozo

    2008-01-01

    In 1985, we reported that a bacterium, Mycoplasma capricolum, used a deviant genetic code: UGA, a “universal” stop codon, was read as tryptophan. This finding, together with the deviant nuclear genetic codes found in quite a few organisms and in many mitochondria, shows that the genetic code is not universal and is in a state of evolution. To account for the changes in codon meanings, we proposed the codon capture theory, which states that all code changes are non-disruptive, occurring without accompanying changes to the amino acid sequences of proteins. Supporting evidence for the theory is presented in this review. A possible evolutionary process from the ancient to the present-day genetic code is also discussed. PMID:18941287

  14. Combustion chamber analysis code

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.

    1993-01-01

    A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.

  15. GALPROP: New Developments in CR Propagation Code

    NASA Technical Reports Server (NTRS)

    Moskalenko, I. V.; Jones, F. C.; Mashnik, S. G.; Strong, A. W.; Ptuskin, V. S.

    2003-01-01

    The numerical Galactic CR propagation code GALPROP has been shown to reproduce simultaneously observational data of many kinds related to CR origin and propagation. It has been validated on direct measurements of nuclei, antiprotons, electrons, positrons as well as on astronomical measurements of gamma rays and synchrotron radiation. Such data provide many independent constraints on model parameters while revealing some contradictions in the conventional view of Galactic CR propagation. Using a new version of GALPROP we study new effects such as processes of wave-particle interactions in the interstellar medium. We also report about other developments in the CR propagation code.

  16. Azerbaijani-Russian Code-Switching and Code-Mixing: Form, Function, and Identity

    ERIC Educational Resources Information Center

    Zuercher, Kenneth

    2009-01-01

    From its incorporation into the Russian Empire in 1828 through the collapse of the U.S.S.R. in 1991, governmental language policies and other socio-political forces influenced the Turkic population of the Republic of Azerbaijan to speak Russian. Even with changes since independence, Russian use--including various kinds of code-switching and…

  17. P-code enhanced method for processing encrypted GPS signals without knowledge of the encryption code

    NASA Technical Reports Server (NTRS)

    Meehan, Thomas K. (Inventor); Thomas, Jr., Jess Brooks (Inventor); Young, Lawrence E. (Inventor)

    2000-01-01

    In the preferred embodiment, an encrypted GPS signal is down-converted from RF to baseband to generate two quadrature components for each RF signal (L1 and L2). Separately and independently for each RF signal and each quadrature component, the four down-converted signals are counter-rotated with a respective model phase, correlated with a respective model P code, and then successively summed and dumped over presum intervals substantially coincident with chips of the respective encryption code. Without knowledge of the encryption-code signs, the effect of encryption-code sign flips is then substantially reduced by selected combinations of the resulting presums between associated quadrature components for each RF signal, separately and independently for the L1 and L2 signals. The resulting combined presums are then summed and dumped over longer intervals and further processed to extract amplitude, phase, and delay for each RF signal. Precision of the resulting phase and delay values is approximately four times better than that obtained from straight cross-correlation of L1 and L2. This improved method provides the following options: separate and independent tracking of the L1-Y and L2-Y channels; separate and independent measurement of amplitude, phase, and delay for the L1-Y channel; and removal of the half-cycle ambiguity in L1-Y and L2-Y carrier phase.

  18. Hardware independence checkout software

    NASA Technical Reports Server (NTRS)

    Cameron, Barry W.; Helbig, H. R.

    1990-01-01

    ACSI has developed a program utilizing CLIPS to assess compliance with various programming standards. Essentially, the program parses C code to extract the names of all function calls. These are asserted as CLIPS facts, which also include information about line numbers, source file names, and called functions. Rules have been devised to identify called functions that are not defined in any of the parsed source. These are compared against lists of standards (represented as facts) using rules that check intersections and/or unions of these lists. By piping the output into other processes, the source is appropriately commented by generating and executing scripts.
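    The parse-and-assert pipeline described above can be sketched in Python. The C snippet, the regular expressions, and the `(external-call ...)` fact template below are illustrative assumptions, not ACSI's actual implementation:

```python
import re

C_SOURCE = r'''
#include <stdio.h>
static void helper(int x) { printf("%d\n", x); }
int main(void) { helper(1); undocumented_io(2); return 0; }
'''

CALL_RE = re.compile(r'\b([A-Za-z_]\w*)\s*\(')               # identifier followed by '('
DEF_RE = re.compile(r'\b([A-Za-z_]\w*)\s*\([^;{)]*\)\s*\{')  # crude definition pattern
KEYWORDS = {'if', 'for', 'while', 'switch', 'return', 'sizeof'}

def extract_calls(src):
    """Collect (name, line-number) pairs for every call-like occurrence."""
    calls = set()
    for lineno, line in enumerate(src.splitlines(), 1):
        for name in CALL_RE.findall(line):
            if name not in KEYWORDS:
                calls.add((name, lineno))
    return calls

def defined_functions(src):
    """Heuristically find functions defined (not just called) in the source."""
    return {m.group(1) for m in DEF_RE.finditer(src) if m.group(1) not in KEYWORDS}

# functions called but never defined in the parsed source, as CLIPS-style facts
undefined = {n for n, _ in extract_calls(C_SOURCE)} - defined_functions(C_SOURCE)
facts = [f'(external-call (name {n}))' for n in sorted(undefined)]
```

    A rule base could then compare these asserted facts against the lists of sanctioned library functions.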

  19. Pyramid image codes

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1990-01-01

    All vision systems, both human and machine, transform the spatial image into a coded representation. Particular codes may be optimized for efficiency or to extract useful image features. Researchers explored image codes based on primary visual cortex in man and other primates. Understanding these codes will advance the art in image coding, autonomous vision, and computational human factors. In cortex, imagery is coded by features that vary in size, orientation, and position. Researchers have devised a mathematical model of this transformation, called the Hexagonal oriented Orthogonal quadrature Pyramid (HOP). In a pyramid code, features are segregated by size into layers, with fewer features in the layers devoted to large features. Pyramid schemes provide scale invariance, and are useful for coarse-to-fine searching and for progressive transmission of images. The HOP Pyramid is novel in three respects: (1) it uses a hexagonal pixel lattice, (2) it uses oriented features, and (3) it accurately models most of the prominent aspects of primary visual cortex. The transform uses seven basic features (kernels), which may be regarded as three oriented edges, three oriented bars, and one non-oriented blob. Application of these kernels to non-overlapping seven-pixel neighborhoods yields six oriented, high-pass pyramid layers, and one low-pass (blob) layer.
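    The size-segregated layering of a pyramid code can be illustrated with a simple square-lattice decomposition based on 2x2 block averaging. This is a stand-in sketch of the layering idea, not the hexagonal, oriented HOP transform itself:

```python
import numpy as np

def pyramid_layers(img, levels):
    """Split an image into size-segregated layers: each level keeps the
    detail removed by 2x2 block averaging; the final layer is the
    low-pass (blob-like) residue. Fewer coefficients sit in the layers
    devoted to large features, as in any pyramid code."""
    layers = []
    cur = img.astype(float)
    for _ in range(levels):
        h, w = cur.shape
        coarse = cur.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        detail = cur - np.kron(coarse, np.ones((2, 2)))   # high-pass layer
        layers.append(detail)
        cur = coarse
    layers.append(cur)                                    # low-pass layer
    return layers

def reconstruct(layers):
    """Invert the decomposition by upsampling and adding back the detail."""
    cur = layers[-1]
    for detail in reversed(layers[:-1]):
        cur = np.kron(cur, np.ones((2, 2))) + detail
    return cur

img = np.arange(16.0).reshape(4, 4)
pyr = pyramid_layers(img, 2)
```

    The decomposition is exactly invertible, which is the property that makes pyramid codes usable for progressive transmission.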

  20. Report number codes

    SciTech Connect

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute standard "Standard Technical Report Number (STRN): Format and Creation" (Z39.23-1983). The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: the report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.
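    Splitting an STRN into its two parts can be sketched as follows. The rule that the sequential number is the trailing digit group is a simplifying assumption (real STRNs take more varied forms), and the sample numbers are illustrative:

```python
import re

def split_strn(strn):
    """Split a Standard Technical Report Number into (report code,
    sequential number). Assumes the sequential number is the trailing
    digit group -- a simplification of the Z39.23 format."""
    m = re.fullmatch(r'(.*?)[-/]?(\d+)', strn)
    if not m:
        raise ValueError(f'not a recognizable STRN: {strn!r}')
    return m.group(1).rstrip('-/'), m.group(2)
```

    Under this rule, a number such as 'NASA-TM-101234' splits into the report code 'NASA-TM' and the sequential number '101234'.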

  1. Embedded foveation image coding.

    PubMed

    Wang, Z; Bovik, A C

    2001-01-01

    The human visual system (HVS) is highly space-variant in sampling, coding, processing, and understanding. The spatial resolution of the HVS is highest around the point of fixation (foveation point) and decreases rapidly with increasing eccentricity. By taking advantage of this fact, it is possible to remove considerable high-frequency information redundancy from the peripheral regions and still reconstruct a perceptually good quality image. Great success has been obtained previously by a class of embedded wavelet image coding algorithms, such as the embedded zerotree wavelet (EZW) and the set partitioning in hierarchical trees (SPIHT) algorithms. Embedded wavelet coding not only provides very good compression performance, but also has the property that the bitstream can be truncated at any point and still be decoded to recreate a reasonably good quality image. In this paper, we propose an embedded foveation image coding (EFIC) algorithm, which orders the encoded bitstream to optimize foveated visual quality at arbitrary bit-rates. A foveation-based image quality metric, namely, foveated wavelet image quality index (FWQI), plays an important role in the EFIC system. We also developed a modified SPIHT algorithm to improve the coding efficiency. Experiments show that EFIC integrates foveation filtering with foveated image coding and demonstrates very good coding performance and scalability in terms of foveated image quality measurement.
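    The eccentricity-dependent falloff that foveated coding exploits can be sketched as a per-pixel weight map. The functional form and the constants below are illustrative placeholders, not the parameters of the FWQI metric:

```python
import numpy as np

def foveation_weights(h, w, fix_row, fix_col, half_res_ecc=2.3, px_per_deg=32.0):
    """Per-pixel visual-sensitivity weights that fall off with eccentricity
    from the fixation point, after the common cortical model in which
    resolution roughly halves every `half_res_ecc` degrees. Both parameter
    values are illustrative assumptions."""
    rows, cols = np.mgrid[0:h, 0:w].astype(float)
    dist_px = np.hypot(rows - fix_row, cols - fix_col)   # pixels from fixation
    ecc_deg = dist_px / px_per_deg                       # eccentricity in degrees
    return half_res_ecc / (half_res_ecc + ecc_deg)       # 1.0 at fixation

wmap = foveation_weights(64, 64, 32, 32)
```

    An embedded coder can use such a map to order its bitstream so that bits spent early buy the most foveated quality.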

  2. Distributed transform coding via source-splitting

    NASA Astrophysics Data System (ADS)

    Yahampath, Pradeepa

    2012-12-01

    Transform coding (TC) is one of the best known practical methods for quantizing high-dimensional vectors. In this article, a practical approach to distributed TC of jointly Gaussian vectors is presented. This approach, referred to as source-split distributed transform coding (SP-DTC), can be used to easily implement two terminal transform codes for any given rate-pair. The main idea is to apply source-splitting using orthogonal-transforms, so that only Wyner-Ziv (WZ) quantizers are required for compression of transform coefficients. This approach however requires optimizing the bit allocation among dependent sets of WZ quantizers. In order to solve this problem, a low-complexity tree-search algorithm based on analytical models for transform coefficient quantization is developed. A rate-distortion (RD) analysis of SP-DTCs for jointly Gaussian sources is presented, which indicates that these codes can significantly outperform the practical alternative of independent TC of each source, whenever there is a strong correlation between the sources. For practical implementation of SP-DTCs, the idea of using conditional entropy constrained (CEC) quantizers followed by Slepian-Wolf coding is explored. Experimental results obtained with SP-DTC designs based on both CEC scalar quantizers and CEC trellis-coded quantizers demonstrate that actual implementations of SP-DTCs can achieve RD performance close to the analytically predicted limits.
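    The bit-allocation problem among a set of quantizers can be illustrated with a generic greedy scheme under the standard high-rate distortion model. This is a sketch of the underlying allocation problem only, not the paper's tree-search algorithm for dependent WZ quantizers:

```python
import heapq

def allocate_bits(variances, total_bits):
    """Greedy bit allocation across transform-coefficient quantizers using
    the high-rate model D_i = var_i * 4**(-b_i): each successive bit goes
    to the quantizer whose distortion it reduces most."""
    bits = [0] * len(variances)
    # max-heap (via negation) keyed by distortion reduction of one more bit
    heap = [(-(v - v / 4.0), i) for i, v in enumerate(variances)]
    heapq.heapify(heap)
    for _ in range(total_bits):
        _, i = heapq.heappop(heap)
        bits[i] += 1
        d = variances[i] * 4.0 ** (-bits[i])
        heapq.heappush(heap, (-(d - d / 4.0), i))
    return bits

bits = allocate_bits([16.0, 4.0, 1.0, 0.25], 6)
```

    For the sample variances the greedy result matches the classical log-variance (reverse water-filling) allocation, spending more bits on higher-variance coefficients.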

  3. MHDust: A 3-fluid dusty plasma code

    NASA Astrophysics Data System (ADS)

    Lazerson, Samuel

    MHDust is a next generation 3-fluid magnetized dusty plasma code, treating the inertial dynamics of both the dust and ion components. The code is written in ANSI C, and the numerical method employs Leap-Frog and Dufort-Frankel integration schemes. Features include: nonlinear collisional terms, quasi-neutrality or continuity based electron densities, and dynamical dust charge number. Tests of wave-mode propagation (acoustic and electromagnetic) allow a comparison to linear wave mode theory. Additional nonlinear phenomena are presented, including magnetic reconnection and shear-flow instabilities. Relevant parameters for the space environment are considered, allowing a comparison to be made with previous dusty plasma codes (DENISIS). The utility of the code is expanded through the possibility of small dust mass. This allows MHDust to be used as a 2-ion plasma code. MHDust considerably expands the range of numerical investigations into nonlinear phenomena in the field of astrophysical dusty plasmas.

  4. Irregular Repeat-Accumulate Codes for Volume Holographic Memory Systems

    NASA Astrophysics Data System (ADS)

    Pishro-Nik, Hossein; Fekri, Faramarz

    2004-09-01

    We investigate the application of irregular repeat-accumulate (IRA) codes in volume holographic memory (VHM) systems. We introduce methodologies to design efficient IRA codes. We show that a judiciously designed IRA code for a typical VHM can be as good as the optimized irregular low-density-parity-check codes while having the additional advantage of lower encoding complexity. Moreover, we present a method to reduce the error-floor effect of the IRA codes in the VHM systems. This method explores the structure of the noise pattern in holographic memories. Finally, we explain why IRA codes are good candidates for the VHM systems.

  5. Agent independent task planning

    NASA Technical Reports Server (NTRS)

    Davis, William S.

    1990-01-01

    Agent-Independent Planning is a technique that allows the construction of activity plans without regard to the agent that will perform them. Once generated, a plan is then validated and translated into instructions for a particular agent, whether a robot, crewmember, or software-based control system. Because Space Station Freedom (SSF) is planned for orbital operations for approximately thirty years, it will almost certainly experience numerous enhancements and upgrades, including upgrades in robotic manipulators. Agent-Independent Planning provides the capability to construct plans for SSF operations, independent of specific robotic systems, by combining techniques of object oriented modeling, nonlinear planning and temporal logic. Since a plan is validated using the physical and functional models of a particular agent, new robotic systems can be developed and integrated with existing operations in a robust manner. This technique also provides the capability to generate plans for crewmembers with varying skill levels, and later apply these same plans to more sophisticated robotic manipulators made available by evolutions in technology.

  6. International exploration by independents

    SciTech Connect

    Bertragne, R.G.

    1992-04-01

    Recent industry trends indicate that the smaller U.S. independents are looking at foreign exploration opportunities as one of the alternatives for growth in the new age of exploration. Foreign finding costs per barrel usually are accepted to be substantially lower than domestic costs because of the large reserve potential of international plays. To get involved in overseas exploration, however, requires the explorationist to adapt to different cultural, financial, legal, operational, and political conditions. Generally, foreign exploration proceeds at a slower pace than domestic exploration because concessions are granted by a country's government, or are explored in partnership with a national oil company. First, the explorationist must prepare a mid- to long-term strategy, tailored to the goals and the financial capabilities of the company; next, is an ongoing evaluation of quality prospects in various sedimentary basins, and careful planning and conduct of the operations. To successfully explore overseas also requires the presence of a minimum number of explorationists and engineers thoroughly familiar with the various exploratory and operational aspects of foreign work. Ideally, these team members will have had a considerable amount of on-site experience in various countries and climates. Independents best suited for foreign expansion are those who have been financially successful in domestic exploration. When properly approached, foreign exploration is well within the reach of smaller U.S. independents, and presents essentially no greater risk than domestic exploration; however, the reward can be much larger and can catapult the company into the 'big leagues.'

  7. International exploration by independents

    SciTech Connect

    Bertagne, R.G.

    1991-03-01

    Recent industry trends indicate that the smaller US independents are looking at foreign exploration opportunities as one of the alternatives for growth in the new age of exploration. It is usually accepted that foreign finding costs per barrel are substantially lower than domestic because of the large reserve potential of international plays. To get involved overseas requires, however, an adaptation to different cultural, financial, legal, operational, and political conditions. Generally foreign exploration proceeds at a slower pace than domestic because concessions are granted by the government, or are explored in partnership with the national oil company. First, a mid- to long-term strategy, tailored to the goals and the financial capabilities of the company, must be prepared; it must be followed by an ongoing evaluation of quality prospects in various sedimentary basins, and a careful planning and conduct of the operations. To successfully explore overseas also requires the presence on the team of a minimum number of explorationists and engineers thoroughly familiar with the various exploratory and operational aspects of foreign work, having had a considerable amount of onsite experience in various geographical and climatic environments. Independents that are best suited for foreign expansion are those that have been financially successful domestically, and have a good discovery track record. When properly approached foreign exploration is well within the reach of smaller US independents and presents essentially no greater risk than domestic exploration; the reward, however, can be much larger and can catapult the company into the big leagues.

  8. 32 CFR 634.25 - Installation traffic codes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 4 2011-07-01 2011-07-01 false Installation traffic codes. 634.25 Section 634... ENFORCEMENT AND CRIMINAL INVESTIGATIONS MOTOR VEHICLE TRAFFIC SUPERVISION Traffic Supervision § 634.25 Installation traffic codes. (a) Installation or activity commanders will establish a traffic code for...

  9. 32 CFR 634.25 - Installation traffic codes.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 4 2013-07-01 2013-07-01 false Installation traffic codes. 634.25 Section 634... ENFORCEMENT AND CRIMINAL INVESTIGATIONS MOTOR VEHICLE TRAFFIC SUPERVISION Traffic Supervision § 634.25 Installation traffic codes. (a) Installation or activity commanders will establish a traffic code for...

  10. "Hour of Code": Can It Change Students' Attitudes toward Programming?

    ERIC Educational Resources Information Center

    Du, Jie; Wimmer, Hayden; Rada, Roy

    2016-01-01

    The Hour of Code is a one-hour introduction to computer science organized by Code.org, a non-profit dedicated to expanding participation in computer science. This study investigated the impact of the Hour of Code on students' attitudes towards computer programming and their knowledge of programming. A sample of undergraduate students from two…

  11. Axisymmetric generalized harmonic evolution code

    SciTech Connect

    Sorkin, Evgeny

    2010-04-15

    We describe the first axisymmetric numerical code based on the generalized harmonic formulation of the Einstein equations, which is regular at the axis. We test the code by investigating gravitational collapse of distributions of complex scalar field in a Kaluza-Klein spacetime. One of the key issues of the harmonic formulation is the choice of the gauge source functions, and we conclude that a damped-wave gauge is remarkably robust in this case. Our preliminary study indicates that evolution of regular initial data leads to formation of black holes with both spherical and cylindrical horizon topologies. Intriguingly, we find evidence that near the threshold for black hole formation the number of outcomes proliferates. Specifically, the collapsing matter splits into individual pulses, two of which travel in opposite directions along the compact dimension and one of which is ejected radially from the axis. Depending on the initial conditions, a curvature singularity develops inside the pulses.

  12. Compressible Astrophysics Simulation Code

    SciTech Connect

    Howell, L.; Singer, M.

    2007-07-18

    This is an astrophysics simulation code involving a radiation diffusion module developed at LLNL coupled to compressible hydrodynamics and adaptive mesh infrastructure developed at LBNL. One intended application is to neutrino diffusion in core collapse supernovae.

  13. 3D neutronic codes coupled with thermal-hydraulic system codes for PWR, and BWR and VVER reactors

    SciTech Connect

    Langenbuch, S.; Velkov, K.; Lizorkin, M.

    1997-07-01

    This paper describes the objectives of code development for coupling 3D neutronics codes with thermal-hydraulic system codes. The present status of coupling ATHLET with three 3D neutronics codes for VVER and LWR reactors is presented. After describing the basic features of the 3D neutronics codes BIPR-8 from the Kurchatov Institute, DYN3D from Research Center Rossendorf, and QUABOX/CUBBOX from GRS, first applications of the coupled codes to different transient and accident scenarios are presented. The need for further investigation is discussed.

  14. Coding Strategies for X-ray Tomography

    NASA Astrophysics Data System (ADS)

    Holmgren, Andrew

    This work focuses on the construction and application of coded apertures to compressive X-ray tomography. Coded apertures can be made in a number of ways, each method having an impact on system background and signal contrast. Methods of constructing coded apertures for structuring X-ray illumination and scatter are compared and analyzed. Apertures can create structured X-ray bundles that investigate specific sets of object voxels. The tailored bundles of rays form a code (or pattern) and are later estimated through computational inversion. Structured illumination can be used to subsample object voxels and make inversion feasible for low dose computed tomography (CT) systems, or it can be used to reduce background in limited angle CT systems. On the detection side, coded apertures modulate X-ray scatter signals to determine the position and radiance of scatter points. By forming object dependent projections in measurement space, coded apertures multiplex modulated scatter signals onto a detector. The multiplexed signals can be inverted with knowledge of the code pattern and system geometry. This work shows two systems capable of determining object position and type in a 2D plane by illuminating objects with an X-ray 'fan beam', using coded apertures and compressive measurements. Scatter tomography can help identify materials in security and medicine that may be ambiguous with transmission tomography alone.
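    The multiplex-then-invert idea can be sketched as a linear model y = A x, where A encodes the aperture pattern and x holds the radiances of candidate scatter points. The toy dimensions and the random 0/1 code below are illustrative, not a real aperture design:

```python
import numpy as np

rng = np.random.default_rng(1)
n_vox, n_meas = 12, 24                 # overdetermined toy problem
# each row of A records which voxels a detector element sees through the code
A = rng.integers(0, 2, size=(n_meas, n_vox)).astype(float)

x_true = np.zeros(n_vox)
x_true[[2, 7]] = [1.0, 0.5]            # two scatter points in the object
y = A @ x_true                         # multiplexed detector measurements

# demultiplex with knowledge of the code pattern
x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
```

    In a compressive system n_meas would be smaller than n_vox and the inversion would need a sparsity prior; the overdetermined least-squares case above just shows the multiplexing mechanics.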

  15. Trellis coding with multidimensional QAM signal sets

    NASA Technical Reports Server (NTRS)

    Pietrobon, Steven S.; Costello, Daniel J.

    1993-01-01

    Trellis coding using multidimensional QAM signal sets is investigated. Finite-size 2D signal sets are presented that have minimum average energy, are 90-deg rotationally symmetric, and have from 16 to 1024 points. The best trellis codes using the finite 16-QAM signal set with two, four, six, and eight dimensions are found by computer search (the multidimensional signal set is constructed from the 2D signal set). The best moderate complexity trellis codes for infinite lattices with two, four, six, and eight dimensions are also found. The minimum free squared Euclidean distance and number of nearest neighbors for these codes were used as the selection criteria. Many of the multidimensional codes are fully rotationally invariant and give asymptotic coding gains up to 6.0 dB. From the infinite lattice codes, the best codes for transmitting J, J + 1/4, J + 1/3, J + 1/2, J + 2/3, and J + 3/4 bit/sym (J an integer) are presented.
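    The energy bookkeeping for such signal sets is easy to check numerically. The sketch below computes the average energy of square 16-QAM and of a 4D set formed as the Cartesian product of two 2D symbols, a simple instance of the multidimensional construction rather than one of the searched codes:

```python
import itertools
import numpy as np

# Square 16-QAM: in-phase/quadrature levels {-3,-1,1,3}. This grid is
# 90-degree rotationally symmetric and has minimum average energy among
# 16-point subsets of the odd-integer lattice.
levels = [-3, -1, 1, 3]
qam16 = [(i, q) for i in levels for q in levels]
avg_energy_2d = np.mean([i * i + q * q for i, q in qam16])

# A 4D signal set built as the Cartesian product of two 2D symbols;
# its average energy is simply the sum of the per-symbol averages.
qam16_4d = list(itertools.product(qam16, qam16))
avg_energy_4d = np.mean([i * i + q * q + u * u + v * v
                         for (i, q), (u, v) in qam16_4d])
```

    Trellis codes over such product sets then pick subsets and sequences of these 256 four-dimensional points to maximize free squared Euclidean distance.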

  16. Seals Flow Code Development

    NASA Technical Reports Server (NTRS)

    1991-01-01

    In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.

  17. Parallel CARLOS-3D code development

    SciTech Connect

    Putnam, J.M.; Kotulski, J.D.

    1996-02-01

    CARLOS-3D is a three-dimensional scattering code which was developed under the sponsorship of the Electromagnetic Code Consortium, and is currently used by over 80 aerospace companies and government agencies. The code has been extensively validated and runs on both serial workstations and parallel supercomputers such as the Intel Paragon. CARLOS-3D is a three-dimensional surface integral equation scattering code based on a Galerkin method of moments formulation employing Rao-Wilton-Glisson roof-top basis functions for triangular faceted surfaces. Fully arbitrary 3D geometries composed of multiple conducting and homogeneous bulk dielectric materials can be modeled. This presentation describes some of the extensions to the CARLOS-3D code, and how the operator structure of the code facilitated these improvements. Body of revolution (BOR) and two-dimensional geometries were incorporated by simply including new input routines, and the appropriate Galerkin matrix operator routines. Some additional modifications were required in the combined field integral equation matrix generation routine due to the symmetric nature of the BOR and 2D operators. Quadrilateral patched surfaces with linear roof-top basis functions were also implemented in the same manner. Quadrilateral facets and triangular facets can be used in combination to more efficiently model geometries with both large smooth surfaces and surfaces with fine detail such as gaps and cracks. Since the parallel implementation in CARLOS-3D is at a high level, these changes were independent of the computer platform being used. This approach minimizes code maintenance, while providing capabilities with little additional effort. Results are presented showing the performance and accuracy of the code for some large scattering problems. Comparisons between triangular faceted and quadrilateral faceted geometry representations will be shown for some complex scatterers.

  18. Region-based fractal video coding

    NASA Astrophysics Data System (ADS)

    Zhu, Shiping; Belloulata, Kamel

    2008-10-01

    A novel video sequence compression scheme is proposed in order to realize the efficient and economical transmission of video sequences, and also the region-based functionality of MPEG-4. The CPM and NCIM fractal coding scheme is applied to each region independently, based on a prior image segmentation map (alpha plane) which is exactly the same as defined in MPEG-4. The first n frames of the video sequence are encoded as a "set" using the Circular Prediction Mapping (CPM), and the remaining frames are encoded using the Non Contractive Interframe Mapping (NCIM). The CPM and NCIM accomplish the motion estimation and compensation, which can exploit the high temporal correlations between adjacent frames of the video sequence. The experimental results with monocular video sequences provide promising performance at low bit rate coding, such as in video conferencing applications. We believe the proposed fractal video codec will be a powerful and efficient technique for region-based video sequence coding.
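    The interframe matching at the heart of such fractal schemes can be sketched as follows: each range block of the current frame is approximated by an intensity-scaled domain block from a reference frame. The block size, the coarse search grid, and the use of same-size (uncontracted) domain blocks below are simplifications for illustration, not the CPM/NCIM algorithms themselves.

```python
import numpy as np

# Minimal sketch of fractal-style interframe block matching: each range
# block of the current frame is approximated by an affine-scaled domain
# block from a reference frame. Simplified for illustration.
def encode_block(range_block, ref_frame, step=4):
    """Return (err, y, x, s, o) for the best affine match R ~ s*D + o
    over a coarse grid of candidate domain blocks in ref_frame."""
    B = range_block.shape[0]
    r = range_block.ravel()
    best = None
    for y in range(0, ref_frame.shape[0] - B + 1, step):
        for x in range(0, ref_frame.shape[1] - B + 1, step):
            d = ref_frame[y:y + B, x:x + B].ravel()
            M = np.stack([d, np.ones_like(d)], axis=1)
            sol, *_ = np.linalg.lstsq(M, r, rcond=None)   # fit r ~ s*d + o
            s, o = sol
            err = float(np.sum((s * d + o - r) ** 2))
            if best is None or err < best[0]:
                best = (err, y, x, s, o)
    return best

rng = np.random.default_rng(3)
prev = rng.random((32, 32))                  # reference ("previous") frame
curr = np.roll(prev, (4, 8), axis=(0, 1))    # current frame: a pure shift
err, y, x, s, o = encode_block(curr[8:16, 8:16], prev)
```

Because the current frame is a pure translation of the reference, the search finds the exactly matching domain block with unit scale and zero offset, which is the situation motion estimation and compensation exploit.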

  19. Blockiness in JPEG-coded images

    NASA Astrophysics Data System (ADS)

    Meesters, Lydia; Martens, Jean-Bernard

    1999-05-01

    In two experiments, dissimilarity data and numerical scaling data were obtained to determine the underlying attributes of image quality in baseline sequential JPEG coded images. Although several distortions were perceived, i.e., blockiness, ringing and blur, the subjective data for all attributes were highly correlated, so that image quality could approximately be described by one independent attribute. We therefore proceeded by developing an instrumental measure for one of these distortions, i.e., blockiness. In this paper a single-ended blockiness measure is proposed, i.e., one that uses only the coded image. Our approach is therefore fundamentally different from most image quality models that use both the original and the degraded image. The measure is based on detecting the low-amplitude edges that result from blocking and estimating their amplitudes. Because of the approximately one-dimensional nature of the underlying psychological space, the proposed blockiness measure also predicts the image quality of sequential baseline coded JPEG images.
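    A simplified single-ended block-boundary measure (not the edge-detection measure developed in the paper) can be sketched as the ratio of luminance steps across 8x8 block boundaries to steps elsewhere:

```python
import numpy as np

def blockiness(img, block=8):
    """Toy single-ended score: ratio of the mean absolute luminance step
    across vertical 8x8 block boundaries to the mean step elsewhere.
    Values near 1 mean no block structure; larger values mean blocking.
    (Illustrative only; not the paper's edge-detection measure.)"""
    img = np.asarray(img, dtype=float)
    dh = np.abs(np.diff(img, axis=1))              # steps between columns
    cols = np.arange(dh.shape[1])
    at_boundary = (cols % block) == (block - 1)    # steps crossing a boundary
    return dh[:, at_boundary].mean() / max(dh[:, ~at_boundary].mean(), 1e-12)

# synthetic test images: a blocky image (constant 8x8 tiles) and a ramp
rng = np.random.default_rng(1)
blocky = np.kron(rng.integers(0, 255, size=(4, 4)), np.ones((8, 8)))
smooth = np.tile(np.linspace(0, 255, 32), (32, 1))
```

Only the coded image is needed, which is what distinguishes a single-ended measure from the double-ended quality models mentioned above.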

  20. Robust Nonlinear Neural Codes

    NASA Astrophysics Data System (ADS)

    Yang, Qianli; Pitkow, Xaq

    2015-03-01

    Most interesting natural sensory stimuli are encoded in the brain in a form that can only be decoded nonlinearly. But despite being a core function of the brain, nonlinear population codes are rarely studied and poorly understood. Interestingly, the few existing models of nonlinear codes are inconsistent with known architectural features of the brain. In particular, these codes have information content that scales with the size of the cortical population, even if that violates the data processing inequality by exceeding the amount of information entering the sensory system. Here we provide a valid theory of nonlinear population codes by generalizing recent work on information-limiting correlations in linear population codes. Although these generalized, nonlinear information-limiting correlations bound the performance of any decoder, they also make decoding more robust to suboptimal computation, allowing many suboptimal decoders to achieve nearly the same efficiency as an optimal decoder. Although these correlations are extremely difficult to measure directly, particularly for nonlinear codes, we provide a simple, practical test by which one can use choice-related activity in small populations of neurons to determine whether decoding is suboptimal or optimal and limited by correlated noise. We conclude by describing an example computation in the vestibular system where this theory applies. QY and XP were supported by a grant from the McNair Foundation.

  1. Honor Codes and Other Contextual Influences on Academic Integrity: A Replication and Extension to Modified Honor Code Settings.

    ERIC Educational Resources Information Center

    McCabe, Donald L.; Trevino, Linda Klebe; Butterfield, Kenneth D.

    2002-01-01

    Investigated the influence of modified honor codes, an alternative to traditional codes that is gaining popularity on larger campuses. Also tested the model of student academic dishonesty previously suggested by McCabe and Trevino. Found that modified honor codes are associated with lower levels of student dishonesty and that the McCabe Trevino…

  2. Multi-level bandwidth efficient block modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    1989-01-01

    The multilevel technique is investigated for combining block coding and modulation. There are four parts. In the first part, a formulation is presented for signal sets on which modulation codes are to be constructed. Distance measures on a signal set are defined and their properties are developed. In the second part, a general formulation is presented for multilevel modulation codes in terms of component codes with appropriate Euclidean distances. The distance properties, Euclidean weight distribution and linear structure of multilevel modulation codes are investigated. In the third part, several specific methods for constructing multilevel block modulation codes with interdependency among component codes are proposed. Given a multilevel block modulation code C with no interdependency among the binary component codes, the proposed methods give a multilevel block modulation code C' which has the same rate as C, a minimum squared Euclidean distance not less than that of C, a trellis diagram with the same number of states as that of C, and a smaller number of nearest neighbor codewords than C. In the last part, error performance of block modulation codes is analyzed for an AWGN channel based on soft-decision maximum likelihood decoding. Error probabilities of some specific codes are evaluated based on their Euclidean weight distributions and simulation results.
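    The role of the component codes' Hamming distances in the resulting squared Euclidean distance can be illustrated with a toy two-level construction; the component codes and mapping below are illustrative, not the paper's constructions.

```python
import itertools
import numpy as np

# Toy two-level construction: a (3,1) repetition code (d_H = 3) at the
# first level and a (3,2) single-parity-check code (d_H = 2) at the
# second, mapped onto unit-energy QPSK by set partitioning. These toy
# component codes are illustrative, not the paper's constructions.
rep = [(0, 0, 0), (1, 1, 1)]
spc = [c + (sum(c) % 2,) for c in itertools.product((0, 1), repeat=2)]

def modulate(c1, c2):
    # flipping the level-1 bit moves to an adjacent QPSK point (d^2 = 2);
    # flipping the level-2 bit moves to the antipodal point (d^2 = 4)
    return [np.exp(1j * np.pi / 2 * (b0 + 2 * b1)) for b0, b1 in zip(c1, c2)]

codewords = [modulate(c1, c2) for c1 in rep for c2 in spc]
msed = min(sum(abs(a - b) ** 2 for a, b in zip(u, v))
           for u, v in itertools.combinations(codewords, 2))
# multilevel lower bound: min(d_H1 * 2, d_H2 * 4) = min(6, 8) = 6,
# met with equality by this construction
```

Exhaustive enumeration confirms the multilevel distance bound: the minimum squared Euclidean distance of the 8-codeword set is exactly min(3·2, 2·4) = 6.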

  3. Applications of numerical codes to space plasma problems

    NASA Technical Reports Server (NTRS)

    Northrop, T. G.; Birmingham, T. J.; Jones, F. C.; Wu, C. S.

    1975-01-01

    Solar wind, earth's bowshock, and magnetospheric convection and substorms were investigated. Topics discussed include computational physics, multifluid codes, ionospheric irregularities, and modeling laser plasmas.

  4. A robust low-rate coding scheme for packet video

    NASA Technical Reports Server (NTRS)

    Chen, Y. C.; Sayood, Khalid; Nelson, D. J.; Arikan, E. (Editor)

    1991-01-01

    Due to the rapidly evolving fields of image processing and networking, video information promises to be an important part of telecommunication systems. Although video transmission has up to now been carried mainly over circuit-switched networks, it is likely that packet-switched networks will dominate the communication world in the near future. Asynchronous transfer mode (ATM) techniques in broadband-ISDN can provide a flexible, independent and high performance environment for video communication. For this paper, the network simulator was used only as a channel. Mixture block coding with progressive transmission (MBCPT) has been investigated for use over packet networks and has been found to provide a high compression rate with good visual performance, robustness to packet loss, tractable integration with network mechanics and simplicity in parallel implementation.

  5. A cascaded coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Shu, L.; Kasami, T.

    1985-01-01

    A cascade coding scheme for error control is investigated. The scheme employs a combination of hard and soft decisions in decoding. Error performance is analyzed. If the inner and outer codes are chosen properly, extremely high reliability can be attained even for a high channel bit-error-rate. Some example schemes are evaluated. They seem to be quite suitable for satellite down-link error control.

  6. Cary Potter on Independent Education

    ERIC Educational Resources Information Center

    Potter, Cary

    1978-01-01

    Cary Potter was President of the National Association of Independent Schools from 1964-1978. As he leaves NAIS he gives his views on education, on independence, on the independent school, on public responsibility, on choice in a free society, on educational change, and on the need for collective action by independent schools. (Author/RK)

  7. Myth or Truth: Independence Day.

    ERIC Educational Resources Information Center

    Gardner, Traci

    Most Americans think of the Fourth of July as Independence Day, but is it really the day the U.S. declared and celebrated independence? By exploring myths and truths surrounding Independence Day, this lesson asks students to think critically about commonly believed stories regarding the beginning of the Revolutionary War and the Independence Day…

  8. Population coding of affect across stimuli, modalities and individuals

    PubMed Central

    Chikazoe, Junichi; Lee, Daniel H.; Kriegeskorte, Nikolaus; Anderson, Adam K.

    2014-01-01

    It remains unclear how the brain represents external objective sensory events alongside our internal subjective impressions of them—affect. Representational mapping of population-level activity evoked by complex scenes and basic tastes uncovered a neural code supporting a continuous axis of pleasant-to-unpleasant valence. This valence code was distinct from low-level physical and high-level object properties. While ventral temporal and anterior insular cortices supported valence codes specific to vision and taste, both the medial and lateral orbitofrontal cortices (OFC) maintained a valence code independent of sensory origin. Further, only the OFC code could classify experienced affect across participants. The entire valence spectrum is represented as a collective pattern in regional neural activity as sensory-specific and abstract codes, whereby the subjective quality of affect can be objectively quantified across stimuli, modalities, and people. PMID:24952643

  9. Methodology, status and plans for development and assessment of the code ATHLET

    SciTech Connect

    Teschendorff, V.; Austregesilo, H.; Lerchl, G.

    1997-07-01

    The thermal-hydraulic computer code ATHLET (Analysis of THermal-hydraulics of LEaks and Transients) is being developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) for the analysis of anticipated and abnormal plant transients, small and intermediate leaks as well as large breaks in light water reactors. The aim of the code development is to cover the whole spectrum of design basis and beyond design basis accidents (without core degradation) for PWRs and BWRs with only one code. The main code features are: advanced thermal-hydraulics; modular code architecture; separation between physical models and numerical methods; pre- and post-processing tools; portability. The code has features that are of special interest for applications to small leaks and transients with accident management, e.g. initialization by a steady-state calculation, full-range drift-flux model, dynamic mixture level tracking. The General Control Simulation Module of ATHLET is a flexible tool for the simulation of the balance-of-plant and control systems including the various operator actions in the course of accident sequences with AM measures. The code development is accompanied by a systematic and comprehensive validation program. A large number of integral experiments and separate effect tests, including the major International Standard Problems, have been calculated by GRS and by independent organizations. The ATHLET validation matrix is a well balanced set of integral and separate effects tests derived from the CSNI proposal emphasizing, however, the German combined ECC injection system which was investigated in the UPTF, PKL and LOBI test facilities.

  10. Coded source neutron imaging

    SciTech Connect

    Bingham, Philip R; Santos-Villalobos, Hector J

    2011-01-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.
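    The tilted-edge procedure mentioned above reduces, in one dimension, to differentiating an edge spread function and Fourier-transforming the resulting line spread function. The Gaussian blur below is a hypothetical stand-in for the simulated system blur.

```python
import numpy as np

# 1-D sketch of the LSF -> MTF computation: differentiate a simulated
# edge spread function, then take the normalized Fourier magnitude of
# the resulting line spread function. The Gaussian blur width is a
# hypothetical stand-in for the simulated system blur.
x = np.arange(256)
sigma = 2.0                                  # assumed blur width, pixels
esf = np.cumsum(np.exp(-(x - 128) ** 2 / (2 * sigma ** 2)))
esf /= esf[-1]                               # normalized edge profile

lsf = np.diff(esf)                           # line spread function
mtf = np.abs(np.fft.fft(lsf))
mtf /= mtf[0]                                # DC normalized to 1
freqs = np.fft.fftfreq(lsf.size)             # cycles per pixel
```

For a Gaussian blur the computed curve tracks the analytic MTF exp(-2·pi²·sigma²·f²), so the numerical pipeline can be checked against a closed form before applying it to simulated tilted-edge data.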

  11. Error coding simulations

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1993-01-01

    There are various elements such as radio frequency interference (RFI) which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal-to-noise ratio needed for a certain desired bit error rate. The use of concatenated coding, e.g. inner convolutional code and outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.
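    The 16-bit CRC mentioned above uses the CCITT generator polynomial x^16 + x^12 + x^5 + 1. Below is a bitwise sketch of the common all-ones-preset variant; consult the CCSDS recommendation itself for the exact conventions it specifies.

```python
def crc16_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
    """Bitwise CRC-16 with the CCITT generator polynomial
    x^16 + x^12 + x^5 + 1 (0x1021) and all-ones preset. This is the
    common CCITT variant; the CCSDS recommendation defines the exact
    preset and placement conventions used on space links."""
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc
```

Any error burst of 16 bits or fewer changes the checksum, which is what makes a short CRC attractive for frame-level error detection alongside the RS and convolutional codes used for correction.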

  12. Coded source neutron imaging

    NASA Astrophysics Data System (ADS)

    Bingham, Philip; Santos-Villalobos, Hector; Tobin, Ken

    2011-03-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.

  13. The design of wavefront coded imaging system

    NASA Astrophysics Data System (ADS)

    Lan, Shun; Cen, Zhaofeng; Li, Xiaotong

    2016-10-01

    Wavefront Coding is a new method to extend the depth of field, which combines optical design and signal processing together. Using the optical design software ZEMAX, we designed a practical wavefront coded imaging system based on a conventional Cooke triplet system. Unlike a conventional optical system, the wavefront of this new system is modulated by a specially designed phase mask, which makes the point spread function (PSF) of the optical system insensitive to defocus. Therefore, a series of identically blurred images is obtained at the image plane. In addition, the optical transfer function (OTF) of the wavefront coded imaging system is independent of focus: it is nearly constant with misfocus and has no regions of zeros. All object information can be completely recovered through digital filtering at different defocus positions. The focus invariance of the MTF is selected as the merit function in this design, and the coefficients of the phase mask are set as optimization goals. Compared to a conventional optical system, the wavefront coded imaging system obtains better quality images under different object distances. Some deficiencies appear in the restored images due to the influence of the digital filtering algorithm, which are also analyzed in this paper. The depth of field of the designed wavefront coded imaging system is about 28 times larger than that of the initial optical system, while keeping higher optical power and resolution at the image plane.
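    The defocus invariance described above can be reproduced in a 1-D pupil sketch: adding a cubic phase term removes the nulls that defocus otherwise introduces into the MTF. The mask strength and defocus values below are illustrative assumptions, not the designed system's parameters.

```python
import numpy as np

# 1-D pupil-plane sketch of the effect described above: a cubic phase
# mask makes the MTF nearly invariant to defocus. alpha (mask strength)
# and psi (defocus, radians at the pupil edge) are illustrative values.
N = 512
u = np.linspace(-1, 1, N)                      # normalized pupil coordinate

def mtf(phase):
    """Incoherent MTF of a 1-D unit pupil with the given phase (radians)."""
    pupil = np.exp(1j * phase)
    psf = np.abs(np.fft.fft(pupil, 4 * N)) ** 2   # zero-padded PSF
    otf = np.abs(np.fft.fft(psf))
    return otf / otf[0]

alpha, psi = 60.0, 10.0                        # assumed mask strength, defocus
plain_focus = mtf(np.zeros(N))                 # conventional, in focus
plain_defoc = mtf(psi * u ** 2)                # conventional, defocused
coded_focus = mtf(alpha * u ** 3)              # wavefront coded, in focus
coded_defoc = mtf(alpha * u ** 3 + psi * u ** 2)

band = slice(64, 256)                          # mid spatial frequencies
```

The defocused conventional MTF passes through nulls in the mid band, destroying information, while the coded MTF stays bounded away from zero and nearly unchanged between focus states, so a single digital filter can restore images at any defocus.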

  14. Code query by example

    NASA Astrophysics Data System (ADS)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to automatically detect potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but rather domain experts. Hence, they require a simple language to express custom rules.

  15. SAC: Sheffield Advanced Code

    NASA Astrophysics Data System (ADS)

    Griffiths, Mike; Fedun, Viktor; Mumford, Stuart; Gent, Frederick

    2013-06-01

    The Sheffield Advanced Code (SAC) is a fully non-linear MHD code designed for simulations of linear and non-linear wave propagation in gravitationally strongly stratified magnetized plasma. It was developed primarily for the forward modelling of helioseismological processes and for the coupling processes in the solar interior, photosphere, and corona; it is built on the well-known VAC platform that allows robust simulation of the macroscopic processes in gravitationally stratified (non-)magnetized plasmas. The code has no limitations of simulation length in time imposed by complications originating from the upper boundary, nor does it require implementation of special procedures to treat the upper boundaries. SAC inherited its modular structure from VAC, thereby allowing modification to easily add new physics.

  16. Autocatalysis, information and coding.

    PubMed

    Wills, P R

    2001-01-01

    Autocatalytic self-construction in macromolecular systems requires the existence of a reflexive relationship between structural components and the functional operations they perform to synthesise themselves. The possibility of reflexivity depends on formal, semiotic features of the catalytic structure-function relationship, that is, the embedding of catalytic functions in the space of polymeric structures. Reflexivity is a semiotic property of some genetic sequences. Such sequences may serve as the basis for the evolution of coding as a result of autocatalytic self-organisation in a population of assignment catalysts. Autocatalytic selection is a mechanism whereby matter becomes differentiated in primitive biochemical systems. In the case of coding self-organisation, it corresponds to the creation of symbolic information. Prions are present-day entities whose replication through autocatalysis reflects aspects of biological semiotics less obvious than genetic coding.

  17. Code inspection instructional validation

    NASA Technical Reports Server (NTRS)

    Orr, Kay; Stancil, Shirley

    1992-01-01

    The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine if this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive) for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. Following are conclusions of this study. (1) The course is instructionally effective. (2) The simulation has a positive effect on students' confidence in their ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option which involves creation of new source code and data files, but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes this option

  18. Temporal Coding of Volumetric Imagery

    NASA Astrophysics Data System (ADS)

    Llull, Patrick Ryan

    'Image volumes' refer to realizations of images in other dimensions such as time, spectrum, and focus. Recent advances in scientific, medical, and consumer applications demand improvements in image volume capture. Though image volume acquisition continues to advance, it maintains the same sampling mechanisms that have been used for decades; every voxel must be scanned and is presumed independent of its neighbors. Under these conditions, improving performance comes at the cost of increased system complexity, data rates, and power consumption. This dissertation explores systems and methods capable of efficiently improving sensitivity and performance for image volume cameras, and specifically proposes several sampling strategies that utilize temporal coding to improve imaging system performance and enhance our awareness for a variety of dynamic applications. Video cameras and camcorders sample the video volume (x,y,t) at fixed intervals to gain understanding of the volume's temporal evolution. Conventionally, one must reduce the spatial resolution to increase the frame rate of such cameras. Using temporal coding via physical translation of an optical element known as a coded aperture, the compressive temporal imaging (CACTI) camera demonstrates a method with which to embed the temporal dimension of the video volume into spatial (x,y) measurements, thereby greatly improving temporal resolution with minimal loss of spatial resolution. This technique, which is among a family of compressive sampling strategies developed at Duke University, temporally codes the exposure readout functions at the pixel level. Since video cameras nominally integrate the remaining image volume dimensions (e.g. spectrum and focus) at capture time, spectral (x,y,t,lambda) and focal (x,y,t,z) image volumes are traditionally captured via sequential changes to the spectral and focal state of the system, respectively.
The CACTI camera's ability to embed video volumes into images leads to exploration
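    The per-pixel temporal coding idea can be sketched as a forward model: a binary mask, translated between frames, modulates each temporal slice before the detector integrates them into one coded snapshot. The sizes and the shifting code below are illustrative assumptions, not the CACTI hardware parameters.

```python
import numpy as np

# Toy forward model for coded-exposure temporal imaging: a binary mask,
# translated by one pixel per frame, modulates each temporal slice of
# the video volume before the detector integrates them into a single
# coded snapshot. Sizes and the shifting code are illustrative.
rng = np.random.default_rng(2)
H, W, T = 32, 32, 8                       # video volume dimensions (y, x, t)
video = rng.random((T, H, W))             # the scene's temporal slices

base_code = (rng.random((H, W)) < 0.5).astype(float)
# translate the mask circularly by one row per frame (a toy stand-in
# for the physical translation of the coded aperture)
codes = np.stack([np.roll(base_code, t, axis=0) for t in range(T)])

# the detector integrates the per-frame modulated slices: y = sum_t C_t * x_t
snapshot = (codes * video).sum(axis=0)
# each pixel now carries its own temporal code, which is what makes the
# temporal dimension recoverable by compressive inversion
```

A single (x,y) measurement thus encodes all T temporal slices, trading a known, structured modulation for the conventional loss of spatial resolution at high frame rates.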

  19. Polar Code Validation

    DTIC Science & Technology

    1989-09-30

    [DTIC report front matter; distribution statement: Approved for public release. Recoverable table-of-contents entries: Summary of POLAR Achievements; POLAR Code Physical Models; Structure of the Bipolar Plasma Sheath Generated by SPEAR I; The POLAR Code Wake Model: Comparison with In Situ Observations.]

  20. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semi-empirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor

  1. Securing mobile code.

    SciTech Connect

    Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas; Campbell, Philip LaRoche; Beaver, Cheryl Lynn; Pierson, Lyndon George; Anderson, William Erik

    2004-10-01

    If software is designed so that the software can issue functions that will move that software from one computing platform to another, then the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinions regarding how to secure mobile code. There are those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates by decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes that neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation to render an entire program or a data segment on which a program depends incomprehensible. The hope is to prevent or at least slow down reverse engineering efforts and to prevent goal-oriented attacks on the software and execution. The field of obfuscation is still in a state of development with the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in depth analysis of a technique called 'white-boxing'. 
We put forth some new attacks and improvements

  2. Identifying personal microbiomes using metagenomic codes.

    PubMed

    Franzosa, Eric A; Huang, Katherine; Meadow, James F; Gevers, Dirk; Lemon, Katherine P; Bohannan, Brendan J M; Huttenhower, Curtis

    2015-06-02

    Community composition within the human microbiome varies across individuals, but it remains unknown whether this variation is sufficient to uniquely identify individuals within large populations, or stable enough to identify them over time. We investigated this by developing a hitting-set-based coding algorithm and applying it to the Human Microbiome Project population. Our approach defined body site-specific metagenomic codes: sets of microbial taxa or genes prioritized to uniquely and stably identify individuals. Codes capturing strain variation in clade-specific marker genes were able to distinguish among hundreds of individuals at an initial sampling time point. In comparisons with follow-up samples collected 30-300 days later, ∼30% of individuals could still be uniquely pinpointed using metagenomic codes from a typical body site; coincidental (false positive) matches were rare. Codes based on the gut microbiome were exceptionally stable and pinpointed >80% of individuals. The failure of a code to match its owner at a later time point was largely explained by the loss of specific microbial strains (at current limits of detection) and was only weakly associated with the length of the sampling interval. In addition to highlighting patterns of temporal variation in the ecology of the human microbiome, this work demonstrates the feasibility of microbiome-based identifiability, a result with important ethical implications for microbiome study design. The datasets and code used in this work are available for download from huttenhower.sph.harvard.edu/idability.
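    A minimal sketch of the hitting-set idea behind such codes (a greedy heuristic for illustration only, not the authors' implementation): pick a small set of the target individual's features such that every other individual lacks at least one of them.

```python
def greedy_code(target, others):
    """Greedily pick features of `target` (a set) until every feature set
    in `others` is missing at least one picked feature (i.e., is excluded).
    Returns the code, or None if the target cannot be distinguished."""
    code, remaining = set(), list(others)
    candidates = set(target)
    while remaining:
        if not candidates:
            return None
        # pick the target feature absent from the most not-yet-excluded others
        best = max(candidates, key=lambda f: sum(f not in o for o in remaining))
        if all(best in o for o in remaining):
            return None  # no remaining feature excludes anyone
        code.add(best)
        candidates.discard(best)
        remaining = [o for o in remaining if best in o]
    return code
```

    A feature shared with everyone can never contribute, which is why the greedy step favors rare features; the paper's real codes additionally optimize for stability over time.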

  3. Modular optimization code package: MOZAIK

    NASA Astrophysics Data System (ADS)

    Bekar, Kursat B.

    This dissertation addresses the development of a modular optimization code package, MOZAIK, for geometric shape optimization problems in nuclear engineering applications. MOZAIK's first mission, determining the optimal shape of the D2O moderator tank for the current and new beam tube configurations for the Penn State Breazeale Reactor's (PSBR) beam port facility, is used to demonstrate its capabilities and test its performance. MOZAIK was designed as a modular optimization sequence comprising three independent modules: the initializer, the physics module, and the optimizer, each with a specific task. By using fixed interface blocks among the modules, the code attains its two most important characteristics: generic form and modularity. The benefit of this modular structure is that the contents of the modules can be switched depending on the requirements for accuracy, computational efficiency, or compatibility with the other modules. Oak Ridge National Laboratory's discrete ordinates transport code TORT was selected as the transport solver in the physics module of MOZAIK, and two different optimizers, Min-max and Genetic Algorithms (GA), were implemented in the optimizer module of the code package. Distributed-memory parallelism was also applied to MOZAIK via MPI (Message Passing Interface) to execute the physics module concurrently on a number of processors for various states in the same search. Moreover, dynamic scheduling was enabled to enhance load balance among the processors while running MOZAIK's physics module, thus improving the parallel speedup and efficiency. In this way, the total computation time consumed by the physics module is reduced by a factor close to M, where M is the number of processors. This capability also encourages the use of MOZAIK for shape optimization problems in nuclear applications, because many traditional codes related to radiation transport do not have parallel execution capability. A set of computational models based on the

  4. Coding capacity of complementary DNA strands.

    PubMed Central

    Casino, A; Cipollaro, M; Guerrini, A M; Mastrocinque, G; Spena, A; Scarlato, V

    1981-01-01

    A Fortran computer algorithm has been used to analyze the nucleotide sequence of several structural genes. The analysis performed on both coding and complementary DNA strands shows that whereas open reading frames shorter than 100 codons are randomly distributed on both DNA strands, open reading frames longer than 100 codons ("virtual genes") are significantly more frequent on the complementary DNA strand than on the coding one. These "virtual genes" were further investigated by looking at intron sequences, splicing points, and signal sequences, and by analyzing gene mutations. On the basis of this analysis, coding and complementary DNA strands of several eukaryotic structural genes cannot be distinguished. In particular, we suggest that the complementary DNA strand of the human epsilon-globin gene might indeed code for a protein. PMID:7015290
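    The kind of analysis described, counting open reading frames on both strands, can be sketched as follows (a simplified illustration using the standard ATG start and TAA/TAG/TGA stop codons; thresholds and conventions are not taken from the paper):

```python
def revcomp(seq):
    """Reverse complement of a DNA string."""
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq))

def orfs(seq, min_codons=100):
    """Return (start, end) spans of ATG-to-stop open reading frames with at
    least `min_codons` codons between start and stop, in all three frames."""
    stops = {"TAA", "TAG", "TGA"}
    found = []
    for frame in range(3):
        start = None
        for i in range(frame, len(seq) - 2, 3):
            codon = seq[i:i + 3]
            if start is None:
                if codon == "ATG":
                    start = i
            elif codon in stops:
                if (i - start) // 3 >= min_codons:
                    found.append((start, i + 3))
                start = None
    return found

def both_strands(seq, min_codons=100):
    """ORFs on the coding strand and on the complementary strand."""
    return orfs(seq, min_codons), orfs(revcomp(seq), min_codons)
```

    Comparing the two returned lists for many genes is the essence of the reported frequency comparison between strands.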

  5. Numerical MHD codes for modeling astrophysical flows

    NASA Astrophysics Data System (ADS)

    Koldoba, A. V.; Ustyugova, G. V.; Lii, P. S.; Comins, M. L.; Dyda, S.; Romanova, M. M.; Lovelace, R. V. E.

    2016-05-01

    We describe a Godunov-type magnetohydrodynamic (MHD) code based on the Miyoshi and Kusano (2005) solver which can be used to solve various astrophysical hydrodynamic and MHD problems. The energy equation is in the form of entropy conservation. The code has been implemented on several different coordinate systems: 2.5D axisymmetric cylindrical coordinates, 2D Cartesian coordinates, 2D plane polar coordinates, and fully 3D cylindrical coordinates. Viscosity and diffusivity are implemented in the code to control the accretion rate in the disk and the rate of penetration of the disk matter through the magnetic field lines. The code has been utilized for the numerical investigations of a number of different astrophysical problems, several examples of which are shown.

  6. Perceptually-Based Adaptive JPEG Coding

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.; Rosenholtz, Ruth; Null, Cynthia H. (Technical Monitor)

    1996-01-01

    An extension to the JPEG standard (ISO/IEC DIS 10918-3) allows spatial adaptive coding of still images. As with baseline JPEG coding, one quantization matrix applies to an entire image channel, but in addition the user may specify a multiplier for each 8 x 8 block, which multiplies the quantization matrix, yielding the new matrix for the block. MPEG 1 and 2 use much the same scheme, except there the multiplier changes only on macroblock boundaries. We propose a method for perceptual optimization of the set of multipliers. We compute the perceptual error for each block based upon DCT quantization error adjusted according to contrast sensitivity, light adaptation, and contrast masking, and pick the set of multipliers which yield maximally flat perceptual error over the blocks of the image. We investigate the bitrate savings due to this adaptive coding scheme and the relative importance of the different sorts of masking on adaptive coding.
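    The per-block multiplier mechanism can be illustrated in a few lines (a sketch; the flat quantization values below are illustrative, not a JPEG table):

```python
import numpy as np

def quantize(dct_block, qmatrix, multiplier):
    """Quantize one 8x8 DCT block with the base quantization matrix scaled
    by the block's multiplier, as in spatially adaptive JPEG."""
    return np.round(dct_block / (qmatrix * multiplier)).astype(int)

def dequantize(qblock, qmatrix, multiplier):
    """Reconstruct DCT coefficients from quantized values."""
    return qblock * qmatrix * multiplier
```

    Larger multipliers coarsen quantization (fewer bits, larger error) in blocks where contrast masking makes the error less visible, which is exactly what the perceptual optimization exploits.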

  7. Coding for urologic office procedures.

    PubMed

    Dowling, Robert A; Painter, Mark

    2013-11-01

    This article summarizes current best practices for documenting, coding, and billing common office-based urologic procedures. Topics covered include general principles, basic and advanced urologic coding, creation of medical records that support compliant coding practices, bundled codes and unbundling, global periods, modifiers for procedure codes, when to bill for evaluation and management services during the same visit, coding for supplies, and laboratory and radiology procedures pertinent to urology practice. Detailed information is included for the most common urology office procedures, and suggested resources and references are provided. This information is of value to physicians, office managers, and their coding staff.

  8. Accumulate Repeat Accumulate Coded Modulation

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative coded modulation scheme called 'Accumulate Repeat Accumulate Coded Modulation' (ARA coded modulation). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low-Density Parity-Check (LDPC) codes combined with high-level modulation. Thus, belief propagation can be used at the decoder for iterative decoding of ARA coded modulation on a graph, provided a demapper transforms the received in-phase and quadrature samples into bit reliabilities.

  9. Dress Codes. Legal Brief.

    ERIC Educational Resources Information Center

    Zirkel, Perry A.

    2000-01-01

    As illustrated by two recent decisions, the courts in the past decade have demarcated wide boundaries for school officials considering dress codes, whether in the form of selective prohibitions or required uniforms. Administrators must warn the community, provide legitimate justification and reasonable clarity, and comply with state law. (MLH)

  10. Dress Codes and Uniforms.

    ERIC Educational Resources Information Center

    Lumsden, Linda; Miller, Gabriel

    2002-01-01

    Students do not always make choices that adults agree with when choosing their school dress. Dress-code issues are explored in this Research Roundup, and guidance is offered to principals seeking to maintain a positive school climate. In "Do School Uniforms Fit?" Kerry White discusses arguments for and against school uniforms and summarizes the…

  11. Building Codes and Regulations.

    ERIC Educational Resources Information Center

    Fisher, John L.

    The hazard of fire is of great concern to libraries due to combustible books and new plastics used in construction and interiors. Building codes and standards can offer architects and planners guidelines to follow but these standards should be closely monitored, updated, and researched for fire prevention. (DS)

  12. Student Dress Codes.

    ERIC Educational Resources Information Center

    Uerling, Donald F.

    School officials see a need for regulations that prohibit disruptive and inappropriate forms of expression and attire; students see these regulations as unwanted restrictions on their freedom. This paper reviews court litigation involving constitutional limitations on school authority, dress and hair codes, state law constraints, and school…

  13. Video Coding for ESL.

    ERIC Educational Resources Information Center

    King, Kevin

    1992-01-01

    Coding tasks, a valuable technique for teaching English as a Second Language, are presented that enable students to look at patterns and structures of marital communication as well as objectively evaluate the degree of happiness or distress in the marriage. (seven references) (JL)

  14. Electrical Circuit Simulation Code

    SciTech Connect

    Wix, Steven D.; Waters, Arlon J.; Shirley, David

    2001-08-09

    CHILESPICE is a massively-parallel, distributed-memory electrical circuit simulation tool that contains many enhanced radiation, time-based, and thermal features and models. It targets large-scale electronic circuit simulation, with shared-memory parallel processing, enhanced convergence, and Sandia-specific device models.

  15. Multiple trellis coded modulation

    NASA Technical Reports Server (NTRS)

    Simon, Marvin K. (Inventor); Divsalar, Dariush (Inventor)

    1990-01-01

    A technique for designing trellis codes to minimize bit error performance for a fading channel. The invention provides a criterion which may be used in the design of such codes and which is significantly different from that used for additive white Gaussian noise channels. The method of multiple trellis coded modulation of the present invention comprises the steps of: (a) coding b bits of input data into s intermediate outputs; (b) grouping said s intermediate outputs into k groups of s.sub.i intermediate outputs each, where the summation of all s.sub.i's is equal to s and k is equal to at least 2; (c) mapping each of said k groups of intermediate outputs into one of a plurality of symbols in accordance with a plurality of modulation schemes, one for each group, such that the first group is mapped in accordance with a first modulation scheme and the second group is mapped in accordance with a second modulation scheme; and (d) outputting each of said symbols to provide k output symbols for each b bits of input data.
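    Steps (b) through (d), grouping the coder outputs and mapping each group through its own modulation scheme, can be sketched as follows (the BPSK/QPSK constellations and all names are illustrative choices, not taken from the patent):

```python
# illustrative constellations (not from the patent)
BPSK = {(0,): 1.0, (1,): -1.0}
QPSK = {(0, 0): 1 + 1j, (0, 1): -1 + 1j, (1, 1): -1 - 1j, (1, 0): 1 - 1j}

def map_groups(bits, groups):
    """bits: the s intermediate coder outputs; groups: list of
    (size, mapper) pairs. Returns one symbol per group, each mapped
    by its own modulation scheme."""
    symbols, i = [], 0
    for size, mapper in groups:
        symbols.append(mapper[tuple(bits[i:i + size])])
        i += size
    assert i == len(bits)  # the group sizes must sum to s
    return symbols
```

    With k = 2 groups, each block of s coder outputs yields two channel symbols, one per modulation scheme, matching the k output symbols per b input bits described above.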

  16. A Pseudorandom Code Modulated LIDAR

    NASA Astrophysics Data System (ADS)

    Hunt, K. P.; Eichinger, W. E.; Kruger, A.

    2009-12-01

    Typical Light Detection and Ranging (LIDAR) uses high-power pulsed lasers to ensure a detectable return signal. For short ranges, modulated diode lasers offer an attractive alternative, particularly in the areas of size, weight, cost, eye safety and use of energy. Flexible electronic modulation of the laser diode allows the development of pseudorandom code (PRC) LIDAR systems that can overcome the disadvantage of low output power and thus low signal-to-noise ratios. Different PRCs have been proposed. For example, so-called M-sequences can be generated simply, but are unbalanced: they have more ones than zeros, which results in a residual noise component. Other sequences, such as the A1 and A2 sequences, are balanced but have two autocorrelation peaks, resulting in undesirable pickup of signals from different ranges. In this work, we investigate a new code: an M-sequence with a zero added at the end. The result is still easily generated and has a single autocorrelation peak, but is now balanced. We loaded these sequences into a commercial arbitrary waveform generator (ARB), an Agilent 33250A, which then modulates the laser diode. This allows sequences to be changed quickly and easily, permitting us to design and investigate a wide range of PRC sequences with desirable properties. The ARB modulates a Melles Griot 56ICS near-infrared laser diode at a 10 MHz chip rate. Backscatter is collected and focused by a telescope, and the detected signal is sampled and correlated with the known PRC. We have gathered data from this LIDAR system and experimentally assessed the performance of this new class of codes.
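    The padded-sequence idea can be checked numerically (a sketch; the LFSR taps below correspond to the primitive polynomial x^4 + x^3 + 1, a standard textbook choice rather than the authors' sequence):

```python
def mseq(taps, n):
    """Fibonacci LFSR producing a maximal-length (M-) sequence of
    length 2**n - 1; `taps` are 1-based stage indices."""
    state, out = [1] * n, []
    for _ in range(2 ** n - 1):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        state = [fb] + state[:-1]
    return out

m = mseq([4, 3], 4)    # 15 chips: 8 ones, 7 zeros (unbalanced)
padded = m + [0]       # 16 chips: balanced, as proposed in this work

# circular autocorrelation of the bipolar (+1/-1) padded sequence
b = [1 if c else -1 for c in padded]
N = len(b)
R = [sum(b[i] * b[(i + k) % N] for i in range(N)) for k in range(N)]
# R has its single peak at zero shift; all other shifts stay below it
```

    Balance means the bipolar chips sum to zero, so constant background light correlates to nothing, while the single peak avoids the range ambiguity of the A1/A2 sequences.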

  17. A Fast Sphere Decoding Algorithm for Space-Frequency Block Codes

    NASA Astrophysics Data System (ADS)

    Safar, Zoltan; Su, Weifeng; Liu, K. J. Ray

    2006-12-01

    The recently proposed space-frequency-coded MIMO-OFDM systems promise considerable performance improvement over single-antenna systems. However, in order to make multiantenna OFDM systems an attractive choice for practical applications, implementation issues such as decoding complexity must be addressed successfully. In this paper, we propose a computationally efficient decoding algorithm for space-frequency block codes. The central part of the algorithm is a modulation-independent sphere decoding framework formulated in the complex domain. We develop three decoding approaches: a modulation-independent approach applicable to any memoryless modulation method, and QAM-specific and PSK-specific fast decoding algorithms performing nearest-neighbor signal-point search. The computational complexity of the algorithms is investigated via both analysis and simulation. The simulation results demonstrate that the proposed algorithm can significantly reduce the decoding complexity. We observe up to 75% reduction in the required FLOP count per code block compared to previously existing methods, without noticeable performance degradation.
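    For readers unfamiliar with the technique, here is a minimal depth-first sphere decoder for a real-valued model y = Hs + n (a generic sketch of sphere decoding, not the paper's complex-domain framework):

```python
import numpy as np

def sphere_decode(H, y, alphabet):
    """Maximum-likelihood decoding of y = H s + n over a finite alphabet
    by depth-first search, pruning branches whose accumulated partial
    distance already exceeds the best solution found so far."""
    Q, R = np.linalg.qr(H)
    z = Q.T @ y                      # ||y - Hs|| == ||z - Rs||
    n = H.shape[1]
    best = {"dist": np.inf, "s": None}
    s = np.zeros(n)

    def dfs(level, dist):
        if dist >= best["dist"]:
            return                   # prune: cannot beat current best
        if level < 0:
            best["dist"], best["s"] = dist, s.copy()
            return
        for a in alphabet:
            s[level] = a
            r = z[level] - R[level, level:] @ s[level:]
            dfs(level - 1, dist + r * r)

    dfs(n - 1, 0.0)
    return best["s"]
```

    The triangular structure of R is what lets the search fix symbols one at a time from the last coordinate upward, which is the source of the complexity savings over exhaustive enumeration.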

  18. Light curves for bump Cepheids computed with a dynamically zoned pulsation code

    NASA Astrophysics Data System (ADS)

    Adams, T. F.; Castor, J. I.; Davis, C. G.

    1980-05-01

    The dynamically zoned pulsation code developed by Castor, Davis, and Davison was used to recalculate the Goddard model and to calculate three other Cepheid models with the same period (9.8 days). This family of models shows how the bumps and other features of the light and velocity curves change as the mass is varied at constant period. The use of a code that is capable of producing reliable light curves demonstrates that the light and velocity curves for 9.8 day Cepheid models with standard homogeneous compositions do not show bumps like those that are observed unless the mass is significantly lower than the 'evolutionary mass.' The light and velocity curves for the Goddard model presented here are similar to those computed independently by Fischel, Sparks, and Karp. They should be useful as standards for future investigators.

  19. Light curves for bump Cepheids computed with a dynamically zoned pulsation code

    NASA Technical Reports Server (NTRS)

    Adams, T. F.; Castor, J. I.; Davis, C. G.

    1980-01-01

    The dynamically zoned pulsation code developed by Castor, Davis, and Davison was used to recalculate the Goddard model and to calculate three other Cepheid models with the same period (9.8 days). This family of models shows how the bumps and other features of the light and velocity curves change as the mass is varied at constant period. The use of a code that is capable of producing reliable light curves demonstrates that the light and velocity curves for 9.8 day Cepheid models with standard homogeneous compositions do not show bumps like those that are observed unless the mass is significantly lower than the 'evolutionary mass.' The light and velocity curves for the Goddard model presented here are similar to those computed independently by Fischel, Sparks, and Karp. They should be useful as standards for future investigators.

  20. Coding Theory and Projective Spaces

    NASA Astrophysics Data System (ADS)

    Silberstein, Natalia

    2008-05-01

    The projective space of order n over a finite field F_q is the set of all subspaces of the vector space F_q^{n}. In this work, we consider error-correcting codes in the projective space, focusing mainly on constant dimension codes. We start with the different representations of subspaces in the projective space. These representations involve matrices in reduced row echelon form, associated binary vectors, and Ferrers diagrams. Based on these representations, we provide a new formula for the computation of the distance between any two subspaces in the projective space. We examine lifted maximum rank distance (MRD) codes, which are nearly optimal constant dimension codes. We prove that a lifted MRD code can be represented in such a way that it forms a block design known as a transversal design. The incidence matrix of the transversal design derived from a lifted MRD code can be viewed as a parity-check matrix of a linear code in the Hamming space. We derive the properties of these codes, which can also be viewed as LDPC codes. We present new bounds and constructions for constant dimension codes. First, we present a multilevel construction for constant dimension codes, which can be viewed as a generalization of the lifted MRD code construction. This construction is based on a new type of rank-metric codes, called Ferrers diagram rank-metric codes. Then we derive upper bounds on the size of constant dimension codes which contain the lifted MRD code, and provide a construction for two families of codes that attain these upper bounds. We generalize the well-known concept of a punctured code for a code in the projective space to obtain large codes which are not of constant dimension. We present efficient enumerative encoding and decoding techniques for the Grassmannian. Finally, we describe a search method for constant dimension lexicodes.

  1. Automated searching for quantum subsystem codes

    SciTech Connect

    Crosswhite, Gregory M.; Bacon, Dave

    2011-02-15

    Quantum error correction allows for faulty quantum systems to behave in an effectively error-free manner. One important class of techniques for quantum error correction is the class of quantum subsystem codes, which are relevant both to active quantum error-correcting schemes as well as to the design of self-correcting quantum memories. Previous approaches for investigating these codes have focused on applying theoretical analysis to look for interesting codes and to investigate their properties. In this paper we present an alternative approach that uses computational analysis to accomplish the same goals. Specifically, we present an algorithm that computes the optimal quantum subsystem code that can be implemented given an arbitrary set of measurement operators that are tensor products of Pauli operators. We then demonstrate the utility of this algorithm by performing a systematic investigation of the quantum subsystem codes that exist in the setting where the interactions are limited to two-body interactions between neighbors on lattices derived from the convex uniform tilings of the plane.

  2. Clinical Reasoning of Physical Therapists regarding In-hospital Walking Independence of Patients with Hemiplegia

    PubMed Central

    Takahashi, Junpei; Takami, Akiyoshi; Wakayama, Saichi

    2014-01-01

    [Purpose] Physical therapists must often determine whether hemiparetic patients can walk independently. However, there are no established criteria, so decisions are often left to individual physical therapists. The purpose of this study was to explore how physical therapists determine whether a patient with hemiplegia can walk independently in a ward. [Methods] The subjects were 15 physical therapists with experience in stroke rehabilitation. We conducted semi-structured interviews about their criteria for hemiparetic patients' walking status on the ward. The interviews were transcribed in full, and the texts were analyzed by coding and grouping. [Results] The interviews showed that PTs judged patients' in-hospital walking independence by observing behavior during walking or treatment. The majority of PTs focused on the patients' state during walking, higher brain function, and their ability to balance. In addition, they often asked ward staff about patients' daily life and self-determination. [Conclusions] We identified the items examined by physical therapists when determining the in-hospital walking independence of stroke patients. Further investigation is required to examine which of these items are truly necessary. PMID:24926149

  3. Binary coding for hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Chang, Chein-I.; Chang, Chein-Chi; Lin, Chinsu

    2004-10-01

    Binary coding is one of the simplest ways to characterize spectral features. One commonly used method is a binary coding-based image software system, called Spectral Analysis Manager (SPAM), for remotely sensed imagery, developed by Mazer et al. For a given spectral signature, the SPAM calculates its spectral mean and inter-band spectral difference and uses them as thresholds to generate a binary code word for this particular spectral signature. Such a coding scheme is generally effective and also very simple to implement. This paper revisits the SPAM and further develops three new SPAM-based binary coding methods, called equal probability partition (EPP) binary coding, halfway partition (HP) binary coding, and median partition (MP) binary coding. These three binary coding methods, along with the SPAM, will be evaluated for spectral discrimination and identification. In doing so, a new criterion, called a posteriori discrimination probability (APDP), is also introduced as a performance measure.
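    A sketch of the SPAM-style scheme described above (simplified: one bit per band from the spectral-mean threshold and one from the inter-band difference; the exact thresholding rules here are illustrative, not Mazer et al.'s):

```python
import numpy as np

def spam_code(spectrum, diff_threshold=0.0):
    """Binary code word for one spectral signature: per band, one bit
    thresholded at the spectral mean and one at the inter-band
    difference. Threshold choices are illustrative."""
    s = np.asarray(spectrum, dtype=float)
    mean_bits = (s >= s.mean()).astype(int)
    diff_bits = (np.diff(s, append=s[-1]) >= diff_threshold).astype(int)
    return np.concatenate([mean_bits, diff_bits])

def hamming(c1, c2):
    """Code-word distance used to discriminate two signatures."""
    return int(np.sum(c1 != c2))
```

    Discrimination then reduces to comparing Hamming distances between code words, which is what makes binary coding so cheap relative to full spectral matching.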

  4. Sinusoidal transform coding

    NASA Technical Reports Server (NTRS)

    Mcaulay, Robert J.; Quatieri, Thomas F.

    1988-01-01

    It has been shown that an analysis/synthesis system based on a sinusoidal representation of speech leads to synthetic speech that is essentially perceptually indistinguishable from the original. Strategies for coding the amplitudes, frequencies and phases of the sine waves have been developed that have led to a multirate coder operating at rates from 2400 to 9600 bps. The encoded speech is highly intelligible at all rates with a uniformly improving quality as the data rate is increased. A real-time fixed-point implementation has been developed using two ADSP2100 DSP chips. The methods used for coding and quantizing the sine-wave parameters for operation at the various frame rates are described.

  5. Confocal coded aperture imaging

    DOEpatents

    Tobin, Jr., Kenneth William; Thomas, Jr., Clarence E.

    2001-01-01

    A method for imaging a target volume comprises the steps of: radiating a small bandwidth of energy toward the target volume; focusing the small bandwidth of energy into a beam; moving the target volume through a plurality of positions within the focused beam; collecting a beam of energy scattered from the target volume with a non-diffractive confocal coded aperture; generating a shadow image of said aperture from every point source of radiation in the target volume; and, reconstructing the shadow image into a 3-dimensional image of the every point source by mathematically correlating the shadow image with a digital or analog version of the coded aperture. The method can comprise the step of collecting the beam of energy scattered from the target volume with a Fresnel zone plate.

  6. Two-dimensional MHD generator model. [GEN code

    SciTech Connect

    Geyer, H. K.; Ahluwalia, R. K.; Doss, E. D.

    1980-09-01

    A steady state, two-dimensional MHD generator code, GEN, is presented. The code solves the equations of conservation of mass, momentum, and energy, using a Von Mises transformation and a local linearization of the equations. By splitting the source terms into a part proportional to the axial pressure gradient and a part independent of the gradient, the pressure distribution along the channel is easily obtained to satisfy various criteria. Thus, the code can run effectively in both design modes, where the channel geometry is determined, and analysis modes, where the geometry is previously known. The code also employs a mixing length concept for turbulent flows, Cebeci and Chang's wall roughness model, and an extension of that model to the effective thermal diffusivities. Results on code validation, as well as comparisons of skin friction and Stanton number calculations with experimental results, are presented.

  7. Error resiliency of distributed video coding in wireless video communication

    NASA Astrophysics Data System (ADS)

    Ye, Shuiming; Ouaret, Mourad; Dufaux, Frederic; Ansorge, Michael; Ebrahimi, Touradj

    2008-08-01

    Distributed Video Coding (DVC) is a new paradigm in video coding, based on the Slepian-Wolf and Wyner-Ziv theorems. DVC offers a number of potential advantages: flexible partitioning of the complexity between the encoder and decoder, robustness to channel errors due to intrinsic joint source-channel coding, codec independent scalability, and multi-view coding without communications between the cameras. In this paper, we evaluate the performance of DVC in an error-prone wireless communication environment. We also present a hybrid spatial and temporal error concealment approach for DVC. Finally, we perform a comparison with a state-of-the-art AVC/H.264 video coding scheme in the presence of transmission errors.

  8. An international survey of building energy codes and their implementation

    DOE PAGES

    Evans, Meredydd; Roshchanka, Volha; Graham, Peter

    2017-01-03

    Buildings are key to low-carbon development everywhere, and many countries have introduced building energy codes to improve energy efficiency in buildings. Yet, building energy codes can only deliver results when the codes are implemented. For this reason, studies of building energy codes need to consider implementation of building energy codes in a consistent and comprehensive way. This research identifies elements and practices in implementing building energy codes, covering codes in 22 countries that account for 70% of global energy use in buildings. These elements and practices include: comprehensive coverage of buildings by type, age, size, and geographic location; an implementation framework that involves a certified agency to inspect construction at critical stages; and building materials that are independently tested, rated, and labeled. Training and supporting tools are another element of successful code implementation. Some countries have also introduced compliance evaluation studies, which suggested that tightening energy requirements would only be meaningful when also addressing gaps in implementation (Pitt&Sherry, 2014; U.S. DOE, 2016b). This article provides examples of practices that countries have adopted to assist with implementation of building energy codes.

  9. HYCOM Code Development

    DTIC Science & Technology

    2003-02-10

    HYCOM code development. Alan J. Wallcraft, Naval Research Laboratory. Presented at the Layered Ocean Model Users' Workshop (LOM 2003), Miami, FL, February 10, 2003. Highlights: Kraus-Turner mixed layer; Energy-Loan (passive) ice model; high-frequency atmospheric forcing; new I/O scheme (.a and .b files); scalability via

  10. Reeds computer code

    NASA Technical Reports Server (NTRS)

    Bjork, C.

    1981-01-01

    The REEDS (rocket exhaust effluent diffusion single layer) computer code is used for the estimation of certain rocket exhaust effluent concentrations and dosages and their distributions near the Earth's surface following a rocket launch event. Output from REEDS is used in producing near real time air quality and environmental assessments of the effects of certain potentially harmful effluents, namely HCl, Al2O3, CO, and NO.

  11. Trajectory Code Studies, 1987

    SciTech Connect

    Poukey, J.W.

    1988-01-01

    The trajectory code TRAJ has been used extensively to study nonimmersed foilless electron diodes. The basic goal of the research is to design low-emittance injectors for electron linacs and propagation experiments. Systems studied during 1987 include Delphi, Recirc, and Troll. We also discuss a partly successful attempt to extend the same techniques to high currents (tens of kA). 7 refs., 30 figs.

  12. The PHARO Code.

    DTIC Science & Technology

    1981-11-24

    Keywords: visible radiation; infrared radiation; sensors; line and band transitions; isophots; high-altitude nuclear data. Radiation (watts/sr) in arbitrary wavelength intervals is determined. The results are a series of "isophot" plots for arbitrarily placed cameras or sensors (Section II). The output of the PHARO code consists of contour plots of radiative intensity (watts/cm ster), or "isophot" plots, for arbitrarily placed sensors.

  13. The Phantom SPH code

    NASA Astrophysics Data System (ADS)

    Price, Daniel; Wurster, James; Nixon, Chris

    2016-05-01

    I will present the capabilities of the Phantom SPH code for global simulations of dust and gas in protoplanetary discs. I will present our new algorithms for simulating both small and large grains in discs, as well as our progress towards simulating evolving grain populations and coupling with radiation. Finally, I will discuss our recent applications to HL Tau and the physics of dust gap opening.

  14. Status of MARS Code

    SciTech Connect

    N.V. Mokhov

    2003-04-09

    Status and recent developments of the MARS 14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator and detector components in the energy range from a fraction of an electronvolt up to 100 TeV are described. These include physics models in both the strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, and histogramming, as well as the MAD-MARS Beam Line Builder and a graphical user interface.

  15. The Comparative Performance of Conditional Independence Indices

    ERIC Educational Resources Information Center

    Kim, Doyoung; De Ayala, R. J.; Ferdous, Abdullah A.; Nering, Michael L.

    2011-01-01

    To realize the benefits of item response theory (IRT), one must have model-data fit. One facet of a model-data fit investigation involves assessing the tenability of the conditional item independence (CII) assumption. In this Monte Carlo study, the comparative performance of 10 indices for identifying conditional item dependence is assessed. The…
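    The abstract does not list the 10 indices compared, but one widely used conditional-independence index of this kind is Yen's Q3: the correlation of IRT model residuals for an item pair, which should be near zero when conditional item independence holds. A minimal sketch under a Rasch model (the data and parameter names below are hypothetical, for illustration only):

```python
import math

def rasch_prob(theta, b):
    """Rasch model probability of a correct response."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def q3(responses_i, responses_j, thetas, b_i, b_j):
    """Yen's Q3: Pearson correlation of two items' residuals
    after removing the model-expected score for each person."""
    res_i = [u - rasch_prob(t, b_i) for u, t in zip(responses_i, thetas)]
    res_j = [u - rasch_prob(t, b_j) for u, t in zip(responses_j, thetas)]
    n = len(res_i)
    mi = sum(res_i) / n
    mj = sum(res_j) / n
    cov = sum((a - mi) * (c - mj) for a, c in zip(res_i, res_j))
    var_i = sum((a - mi) ** 2 for a in res_i)
    var_j = sum((c - mj) ** 2 for c in res_j)
    return cov / math.sqrt(var_i * var_j)
```

    Large positive Q3 values flag item pairs whose responses remain correlated after conditioning on ability, i.e., local dependence.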

  16. Orthopedics coding and funding.

    PubMed

    Baron, S; Duclos, C; Thoreux, P

    2014-02-01

    The French tarification à l'activité (T2A) prospective payment system is a financial system in which a health-care institution's resources are based on performed activity. Activity is described via the PMSI medical information system (programme de médicalisation du système d'information). The PMSI classifies hospital cases by clinical and economic categories known as diagnosis-related groups (DRG), each with an associated price tag. Coding a hospital case involves giving as realistic a description as possible so as to categorize it in the right DRG and thus ensure appropriate payment. For this, it is essential to understand what determines the pricing of inpatient stay: namely, the code for the surgical procedure, the patient's principal diagnosis (reason for admission), codes for comorbidities (everything that adds to management burden), and the management of the length of inpatient stay. The PMSI is used to analyze the institution's activity and dynamism: change on previous year, relation to target, and comparison with competing institutions based on indicators such as the mean length of stay performance indicator (MLS PI). The T2A system improves overall care efficiency. Quality of care, however, is not presently taken account of in the payment made to the institution, as there are no indicators for this; work needs to be done on this topic.

  17. MELCOR computer code manuals

    SciTech Connect

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR`s phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  18. Bar coded retroreflective target

    DOEpatents

    Vann, Charles S.

    2000-01-01

    This small, inexpensive, non-contact laser sensor can detect the location of a retroreflective target in a relatively large volume and in up to six degrees of freedom. The tracker's laser beam is formed into a plane of light which is swept across the space of interest. When the beam illuminates the retroreflector, some of the light returns to the tracker. The intensity, angle, and time of the return beam are measured to calculate the three-dimensional location of the target. With three retroreflectors on the target, the locations of three points on the target are measured, enabling the calculation of all six degrees of freedom of the target's position. Until now, devices for three-dimensional tracking of objects in a large volume have been heavy, large, and very expensive. Because of the simplicity and unique characteristics of this tracker, it is capable of three-dimensional tracking of one to several objects in a large volume, yet it is compact, light-weight, and relatively inexpensive. Alternatively, a tracker produces a diverging laser beam which is directed towards a fixed position, and senses when a retroreflective target enters the fixed field of view. An optically bar-coded target can be read by the tracker to provide information about the target. The target can be formed of a ball lens with a bar code on one end. As the target moves through the field, the ball lens causes the laser beam to scan across the bar code.

  19. Enforcing the International Code of Marketing of Breast-milk Substitutes for Better Promotion of Exclusive Breastfeeding: Can Lessons Be Learned?

    PubMed

    Barennes, Hubert; Slesak, Guenther; Goyet, Sophie; Aaron, Percy; Srour, Leila M

    2016-02-01

    Exclusive breastfeeding, one of the best natural resources, needs protection and promotion. The International Code of Marketing of Breast-milk Substitutes (the Code), which aims to prevent the undermining of breastfeeding by formula advertising, faces implementation challenges. We reviewed frequently overlooked challenges and obstacles that the Code is facing worldwide, but particularly in Southeast Asia. Drawing lessons from various countries where we work, and following the example of successful public health interventions, we discussed legislation, enforcement, and experiences that are needed to successfully implement the Code. Successful holistic approaches that have strengthened the Code need to be scaled up. Community-based actions and peer-to-peer promotions have proved successful. Legislation without stringent enforcement and sufficient penalties is ineffective. The public needs education about the benefits and ways and means to support breastfeeding. It is crucial to combine strong political commitment and leadership with strict national regulations, definitions, and enforcement. National breastfeeding committees, with the authority to improve regulations, investigate violations, and enforce the laws, must be established. Systematic monitoring and reporting are needed to identify companies, individuals, intermediaries, and practices that infringe on the Code. Penalizing violators is crucial. Managers of multinational companies must be held accountable for international violations, and international legislative enforcement needs to be established. Further measures should include improved regulations to protect the breastfeeding mother: large-scale education campaigns; strong penalties for Code violators; exclusion of the formula industry from nutrition, education, and policy roles; supportive legal networks; and independent research of interventions supporting breastfeeding.

  20. High-Fidelity Coding with Correlated Neurons

    PubMed Central

    da Silveira, Rava Azeredo; Berry, Michael J.

    2014-01-01

    Positive correlations in the activity of neurons are widely observed in the brain. Previous studies have shown these correlations to be detrimental to the fidelity of population codes, or at best marginally favorable compared to independent codes. Here, we show that positive correlations can enhance coding performance by astronomical factors. Specifically, the probability of discrimination error can be suppressed by many orders of magnitude. Likewise, the number of stimuli encoded—the capacity—can be enhanced more than tenfold. These effects do not necessitate unrealistic correlation values, and can occur for populations with a few tens of neurons. We further show that both effects benefit from heterogeneity commonly seen in population activity. Error suppression and capacity enhancement rest upon a pattern of correlation. Tuning of one or several effective parameters can yield a limit of perfect coding: the corresponding pattern of positive correlation leads to a ‘lock-in’ of response probabilities that eliminates variability in the subspace relevant for stimulus discrimination. We discuss the nature of this pattern and we suggest experimental tests to identify it. PMID:25412463

  1. Preliminary Assessment of Turbomachinery Codes

    NASA Technical Reports Server (NTRS)

    Mazumder, Quamrul H.

    2007-01-01

    This report assesses different CFD codes developed and currently used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior. The following codes are considered: APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code is described separately in the following sections, along with its current modeling capabilities, level of validation, pre/post-processing, and future development and validation requirements. This report addresses only previously published work and validations of the codes; the codes have since been further developed to extend their capabilities.

  2. Construction of new quantum MDS codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Taneja, Divya; Gupta, Manish; Narula, Rajesh; Bhullar, Jaskaran

    Obtaining quantum maximum distance separable (MDS) codes from dual-containing classical constacyclic codes via the Hermitian construction has paved a path toward the challenges related to such constructions. Using the same technique, some new parameters of quantum MDS codes are constructed here. One set of parameters obtained in this paper achieves a much larger distance than earlier work. The remaining quantum MDS codes constructed have large minimum distance and had not been explored previously.
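    For context, the bound these constructions aim to meet is the quantum Singleton bound:

```latex
% Quantum Singleton bound for an [[n,k,d]]_q code:
k \le n - 2(d - 1)
% A code attaining equality is called quantum MDS.
```

    The dual-containing requirement is what allows the Hermitian construction to turn a classical MDS constacyclic code into a quantum code meeting this bound.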

  3. The TESS (Tandem Experiment Simulation Studies) computer code user's manual

    SciTech Connect

    Procassini, R.J. . Dept. of Nuclear Engineering); Cohen, B.I. )

    1990-06-01

    TESS (Tandem Experiment Simulation Studies) is a one-dimensional, bounded particle-in-cell (PIC) simulation code designed to investigate the confinement and transport of plasma in a magnetic mirror device, including tandem mirror configurations. Mirror plasmas may be modeled in a system which includes an applied magnetic field and/or a self-consistent or applied electrostatic potential. The PIC code TESS is similar to the PIC code DIPSI (Direct Implicit Plasma Surface Interactions) which is designed to study plasma transport to and interaction with a solid surface. The codes TESS and DIPSI are direct descendants of the PIC code ES1 that was created by A. B. Langdon. This document provides the user with a brief description of the methods used in the code and a tutorial on the use of the code. 10 refs., 2 tabs.

  4. Quantum error-correcting codes over mixed alphabets

    NASA Astrophysics Data System (ADS)

    Wang, Zhuo; Yu, Sixia; Fan, Heng; Oh, C. H.

    2013-08-01

    We study quantum error-correcting codes over mixed alphabets to deal with a more complicated and practical situation in which the physical systems used for encoding may have different numbers of energy levels. In particular, we investigate their constructions and propose a quantum Singleton bound. Two kinds of code constructions are presented: a projection-based construction for the general case, and a graphical construction, based on a graph-theoretical object called a composite coding clique, for the case of reducible alphabets. We find some optimal one-error-correcting or -detecting codes over two alphabets. Our method of composite coding cliques also sheds light on constructing standard quantum error-correcting codes, and other families of optimal codes are found.

  5. Convolutional coding techniques for data protection

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1975-01-01

    Results of research on the use of convolutional codes in data communications are presented. Convolutional coding fundamentals are discussed along with modulation and coding interaction. Concatenated coding systems and data compression with convolutional codes are described.
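    As a concrete illustration of the fundamentals discussed (not code from the report itself), a rate-1/2 convolutional encoder with constraint length 3 and the classic (7, 5) octal generators can be sketched as:

```python
def conv_encode(bits, g1=0b111, g2=0b101):
    """Rate-1/2 convolutional encoder, constraint length 3,
    generator polynomials (7, 5) in octal: 1+D+D^2 and 1+D^2."""
    state = 0  # two memory bits of the shift register
    out = []
    for b in bits:
        reg = (b << 2) | state                     # newest bit + 2 previous bits
        out.append(bin(reg & g1).count("1") % 2)   # parity over generator-1 taps
        out.append(bin(reg & g2).count("1") % 2)   # parity over generator-2 taps
        state = reg >> 1                           # advance the shift register
    return out
```

    Each input bit produces two output bits; since the code is linear, the all-zero input maps to the all-zero output sequence.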

  6. An Eye-Tracking Study of How Color Coding Affects Multimedia Learning

    ERIC Educational Resources Information Center

    Ozcelik, Erol; Karakus, Turkan; Kursun, Engin; Cagiltay, Kursat

    2009-01-01

    Color coding has been proposed to promote more effective learning. However, insufficient evidence currently exists to show how color coding leads to better learning. The goal of this study was to investigate the underlying cause of the color coding effect by utilizing eye movement data. Fifty-two participants studied either a color-coded or…

  7. Cracking the College Code.

    ERIC Educational Resources Information Center

    Radebaugh, Pauline

    1992-01-01

    The Columbus (Ohio) Public Schools, in cooperation with the Columbus Foundation and the Columbus Chamber of Commerce, developed a nonprofit, independently administered program (I Know I Can) to (1) encourage all students, especially the disadvantaged, to attend college; and (2) help them apply for college admission and financial aid. President…

  8. Combinatorial neural codes from a mathematical coding theory perspective.

    PubMed

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.
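    The error-correction capability examined here reduces, for independent bit flips, to nearest-codeword decoding in Hamming distance. A generic sketch (not the paper's receptive-field-code machinery):

```python
def hamming(a, b):
    """Hamming distance between two equal-length binary words."""
    return sum(x != y for x, y in zip(a, b))

def decode(word, codebook):
    """Nearest-codeword (maximum-likelihood) decoding under
    independent bit-flip noise: return the closest codeword."""
    return min(codebook, key=lambda c: hamming(word, c))
```

    A code corrects t errors exactly when its minimum pairwise Hamming distance exceeds 2t; the paper's observation is that redundancy in RF codes does not translate into large minimum distance.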

  9. New quantum MDS-convolutional codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Li, Fengwei; Yue, Qin

    2015-12-01

    In this paper, we utilize a family of Hermitian dual-containing constacyclic codes to construct classical and quantum MDS convolutional codes. Our classical and quantum convolutional codes are optimal in the sense that they attain the classical (quantum) generalized Singleton bound.

  10. A class of constacyclic BCH codes and new quantum codes

    NASA Astrophysics Data System (ADS)

    liu, Yang; Li, Ruihu; Lv, Liangdong; Ma, Yuena

    2017-03-01

    Constacyclic BCH codes have been widely studied in the literature and have been used to construct quantum codes in recent years. However, for the class of quantum codes of length n=q^{2m}+1 over F_{q^2} with q an odd prime power, only those of distance δ ≤ 2q^2 have been obtained in the literature. In this paper, by a detailed analysis of the properties of q^2-ary cyclotomic cosets, the maximum designed distance δ_{max} of a class of Hermitian dual-containing constacyclic BCH codes of length n=q^{2m}+1 is determined; this class of constacyclic codes has characteristics analogous to those of primitive BCH codes over F_{q^2}. We then obtain a sequence of dual-containing constacyclic codes of designed distances 2q^2<δ ≤ δ_{max}. Consequently, new quantum codes with distance d > 2q^2 can be constructed from these dual-containing codes via the Hermitian construction. These newly obtained quantum codes have better code rates than those constructed from primitive BCH codes.

  11. Development of the Code RITRACKS

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Cucinotta, Francis A.

    2013-01-01

    A document discusses the code RITRACKS (Relativistic Ion Tracks), which was developed to simulate heavy ion track structure at the microscopic and nanoscopic scales. It is a Monte-Carlo code that simulates the production of radiolytic species in water, event-by-event, and which may be used to simulate tracks and also to calculate dose in targets and voxels of different sizes. The dose deposited by the radiation can be calculated in nanovolumes (voxels). RITRACKS allows simulation of radiation tracks without the need of extensive knowledge of computer programming or Monte-Carlo simulations. It is installed as a regular application on Windows systems. The main input parameters entered by the user are the type and energy of the ion, the length and size of the irradiated volume, the number of ions impacting the volume, and the number of histories. The simulation can be started after the input parameters are entered in the GUI. The number of each kind of interactions for each track is shown in the result details window. The tracks can be visualized in 3D after the simulation is complete. It is also possible to see the time evolution of the tracks and zoom on specific parts of the tracks. The software RITRACKS can be very useful for radiation scientists to investigate various problems in the fields of radiation physics, radiation chemistry, and radiation biology. For example, it can be used to simulate electron ejection experiments (radiation physics).

  12. Neural Coding for Effective Rehabilitation

    PubMed Central

    2014-01-01

    Successful neurological rehabilitation depends on accurate diagnosis, effective treatment, and quantitative evaluation. Neural coding, a technology for interpretation of functional and structural information of the nervous system, has contributed to the advancements in neuroimaging, brain-machine interface (BMI), and design of training devices for rehabilitation purposes. In this review, we summarized the latest breakthroughs in neuroimaging from microscale to macroscale levels with potential diagnostic applications for rehabilitation. We also reviewed the achievements in electrocorticography (ECoG) coding with both animal models and human beings for BMI design, electromyography (EMG) interpretation for interaction with external robotic systems, and robot-assisted quantitative evaluation on the progress of rehabilitation programs. Future rehabilitation would be more home-based, automatic, and self-served by patients. Further investigations and breakthroughs are mainly needed in aspects of improving the computational efficiency in neuroimaging and multichannel ECoG by selection of localized neuroinformatics, validation of the effectiveness in BMI guided rehabilitation programs, and simplification of the system operation in training devices. PMID:25258708

  13. Decoding the LIM development code.

    PubMed

    Gill, Gordon N

    2003-01-01

    During development a vast number of distinct cell types arise from dividing progenitor cells. Concentration gradients of ligands that act via cell surface receptors signal transcriptional regulators that repress and activate particular genes. LIM homeodomain proteins are an important class of transcriptional regulators that direct cell fate. Although in C. elegans only a single LIM homeodomain protein is expressed in a particular cell type, in vertebrates combinations of LIM homeodomain proteins are expressed in cells that determine cell fates. We have investigated the molecular basis of the LIM domain "code" that determines cell fates such as wing formation in Drosophila and motor neuron formation in chicks. The basic code is a homotetramer of 2 LIM homeodomain proteins bridged by the adaptor protein, nuclear LIM interactor (NLI). A more complex molecular language, consisting of a hexamer complex involving NLI and 2 LIM homeodomain proteins, Lhx3 and Isl1, determines ventral motor neuron formation. The same molecular "words" adopt different meanings depending on the context of expression of other molecular "words."

  14. Neural Elements for Predictive Coding

    PubMed Central

    Shipp, Stewart

    2016-01-01

    Predictive coding theories of sensory brain function interpret the hierarchical construction of the cerebral cortex as a Bayesian, generative model capable of predicting the sensory data consistent with any given percept. Predictions are fed backward in the hierarchy and reciprocated by prediction error in the forward direction, acting to modify the representation of the outside world at increasing levels of abstraction, and so to optimize the nature of perception over a series of iterations. This accounts for many ‘illusory’ instances of perception where what is seen (heard, etc.) is unduly influenced by what is expected, based on past experience. This simple conception, the hierarchical exchange of prediction and prediction error, confronts a rich cortical microcircuitry that is yet to be fully documented. This article presents the view that, in the current state of theory and practice, it is profitable to begin a two-way exchange: that predictive coding theory can support an understanding of cortical microcircuit function, and prompt particular aspects of future investigation, whilst existing knowledge of microcircuitry can, in return, influence theoretical development. As an example, a neural inference arising from the earliest formulations of predictive coding is that the source populations of forward and backward pathways should be completely separate, given their functional distinction; this aspect of circuitry – that neurons with extrinsically bifurcating axons do not project in both directions – has only recently been confirmed. Here, the computational architecture prescribed by a generalized (free-energy) formulation of predictive coding is combined with the classic ‘canonical microcircuit’ and the laminar architecture of hierarchical extrinsic connectivity to produce a template schematic, that is further examined in the light of (a) updates in the microcircuitry of primate visual cortex, and (b) rapid technical advances made possible by

  15. A Comparative Study on Seismic Analysis of Bangladesh National Building Code (BNBC) with Other Building Codes

    NASA Astrophysics Data System (ADS)

    Bari, Md. S.; Das, T.

    2013-09-01

    The tectonic framework of Bangladesh and adjoining areas indicates that Bangladesh lies well within an active seismic zone. The after-effects of an earthquake are more severe in an underdeveloped and densely populated country like ours than in developed countries. The Bangladesh National Building Code (BNBC) was first established in 1993 to provide guidelines for the design and construction of new structures subject to earthquake ground motions, in order to minimize the risk to life for all structures. A revision of BNBC 1993 is under way to bring it up to date with other international building codes. This paper compares the various provisions for seismic analysis given in the building codes of different countries. This comparison gives an idea of where our country stands with regard to safety against earthquakes. Primarily, various seismic parameters in BNBC 2010 (draft) have been studied and compared with those of BNBC 1993. Both the 1993 and 2010 editions of BNBC have then been compared graphically with building codes of other countries, such as the National Building Code of India 2005 (NBC-India 2005) and American Society of Civil Engineers 7-05 (ASCE 7-05). The base shear/weight ratios have been plotted against the height of the building. The investigation in this paper reveals that BNBC 1993 has the least base shear among all the codes. Factored base shear values in BNBC 2010 (draft) have increased significantly over those of BNBC 1993 for low-rise buildings (≤20 m). Despite the revision, BNBC 2010 (draft) still gives lower base shear values than the Indian and American codes. The increased factor of safety against earthquakes imposed by the proposed BNBC 2010 code, through its higher base shear values, is therefore appreciable.
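    The base shear/weight ratios compared in the paper come from each code's equivalent static force procedure, which in UBC-style codes such as BNBC 1993 takes the general form below (symbols as commonly defined in such codes; the exact coefficients are stated here from general knowledge of UBC-style provisions, not taken from the paper):

```latex
% Equivalent static design base shear (UBC-style):
V = \frac{Z\, I\, C}{R}\, W, \qquad C = \frac{1.25\, S}{T^{2/3}}
```

    Here Z is the seismic zone coefficient, I the importance factor, C the dynamic response coefficient, R the response modification factor, S the site (soil) coefficient, T the fundamental period, and W the seismic weight; the quantity plotted against building height is V/W.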

  16. Ideal Binocular Disparity Detectors Learned Using Independent Subspace Analysis on Binocular Natural Image Pairs

    PubMed Central

    Hunter, David W.; Hibbard, Paul B.

    2016-01-01

    An influential theory of mammalian vision, known as the efficient coding hypothesis, holds that early stages in the visual cortex attempt to form an efficient coding of ecologically valid stimuli. Although numerous authors have successfully modelled some aspects of early vision mathematically, closer inspection has found substantial discrepancies between the predictions of some of these models and observations of neurons in the visual cortex. In particular, analysis of linear-non-linear models of simple cells using Independent Component Analysis has found a strong bias towards features on the horopter. In order to investigate the link between the information content of binocular images, mathematical models of complex cells and physiological recordings, we applied Independent Subspace Analysis to binocular image patches in order to learn a set of complex-cell-like models. We found that these complex-cell-like models exhibited a wide range of binocular disparity-discriminability, although only a minority exhibited high binocular discrimination scores. However, in common with the linear-non-linear model case we found that feature detection was limited to the horopter, suggesting that current mathematical models are limited in their ability to explain the functionality of the visual cortex. PMID:26982184

  17. Improvements of embedded zerotree wavelet (EZW) coding

    NASA Astrophysics Data System (ADS)

    Li, Jin; Cheng, Po-Yuen; Kuo, C.-C. Jay

    1995-04-01

    In this research, we investigate several improvements of embedded zerotree wavelet (EZW) coding. Several topics addressed include: the choice of wavelet transforms and boundary conditions, the use of arithmetic coder and arithmetic context and the design of encoding order for effective embedding. The superior performance of our improvements is demonstrated with extensive experimental results.
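    The embedding the authors tune comes from EZW's successive-approximation quantization: coefficients are tested against a halving threshold, one bit plane per pass. A stripped-down sketch of just the significance passes (omitting the zerotree symbols and arithmetic coder that the paper actually improves):

```python
def significance_passes(coeffs, n_passes=3):
    """Sketch of EZW's embedded quantization: successive significance
    passes of integer wavelet coefficients against a halving threshold.
    Real EZW also emits zerotree symbols exploiting cross-scale structure."""
    t = 2 ** (max(abs(c) for c in coeffs).bit_length() - 1)  # initial threshold
    maps = []
    for _ in range(n_passes):
        maps.append([abs(c) >= t for c in coeffs])  # significance map, this bit plane
        t //= 2
    return maps
```

    Because each pass refines the previous one, truncating the bit stream at any point yields the best reconstruction available at that rate, which is what makes the coding "embedded."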

  18. Summary of 1990 Code Conference

    SciTech Connect

    Cooper, R.K.; Chan, Kwok-Chi D.

    1990-01-01

    The Conference on Codes and the Linear Accelerator Community was held in Los Alamos in January 1990, and had approximately 100 participants. This conference was the second in a series which has as its goal the exchange of information about codes and code practices among those writing and actually using these codes for the design and analysis of linear accelerators and their components. The first conference was held in San Diego in January 1988, and concentrated on beam dynamics codes and Maxwell solvers. This most recent conference concentrated on 3-D codes and techniques to handle the large amounts of data required for three-dimensional problems. In addition to descriptions of codes, their algorithms and implementations, there were a number of papers describing the use of many of the codes. Proceedings of both these conferences are available. 3 refs., 2 tabs.

  19. Chemical Laser Computer Code Survey,

    DTIC Science & Technology

    1980-12-01

    DOCUMENTATION: Resonator Geometry Synthesis Code requirement (V. L. Gamiz); incorporate General Resonator into Ray Trace Code (W. H. Southwell); Synthesis Code development (L. R. Stidhm); optimization algorithms and equations (W. …). Categories: optics, kinetics, gasdynamics.

  20. Energy Codes and Standards: Facilities

    SciTech Connect

    Bartlett, Rosemarie; Halverson, Mark A.; Shankle, Diana L.

    2007-01-01

    Energy codes and standards play a vital role in the marketplace by setting minimum requirements for energy-efficient design and construction. They outline uniform requirements for new buildings as well as additions and renovations. This article covers basic knowledge of codes and standards; development processes of each; adoption, implementation, and enforcement of energy codes and standards; and voluntary energy efficiency programs.

  1. Coding Issues in Grounded Theory

    ERIC Educational Resources Information Center

    Moghaddam, Alireza

    2006-01-01

    This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory generates from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…

  2. IRIG Serial Time Code Formats

    DTIC Science & Technology

    2016-08-01

    Telecommunications and Timing Group, IRIG Standard 200-16: IRIG Serial Time Code Formats (RCC 200-16, August 2016). Distribution A: approved for public release.

  3. Coding Major Fields of Study.

    ERIC Educational Resources Information Center

    Bobbitt, L. G.; Carroll, C. D.

    The National Center for Education Statistics conducts surveys which require the coding of the respondent's major field of study. This paper presents a new system for the coding of major field of study. It operates on-line in a Computer-Assisted Telephone Interview (CATI) environment and allows conversational checks to verify coding directly from…

  4. Improved code-tracking loop

    NASA Technical Reports Server (NTRS)

    Laflame, D. T.

    1980-01-01

    A delay-locked loop tracks pseudonoise codes without introducing dc timing errors, because it is not sensitive to gain imbalance between signal-processing arms. "Early" and "late" reference codes pass in combined form through both arms, and each arm acts on both codes. The circuit accommodates input signals 1 dB weaker, with tracking ability equal to that of tau-dither loops.
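    The tracking principle common to delay-locked and tau-dither loops is an early-minus-late correlation discriminator: zero at code alignment, with a sign that indicates the direction of misalignment. A chip-spaced sketch using a length-7 m-sequence (illustrative only, not the circuit described above):

```python
def correlate(a, b):
    """Full-period correlation of two equal-length +/-1 sequences."""
    return sum(x * y for x, y in zip(a, b))

def shift(seq, k):
    """Circular shift of a sequence by k chips."""
    k %= len(seq)
    return seq[k:] + seq[:k]

def el_discriminator(rx, code, offset, d=1):
    """Early-minus-late discriminator: difference of correlations with
    replicas advanced and delayed by d chips around the current offset."""
    early = correlate(rx, shift(code, offset + d))
    late = correlate(rx, shift(code, offset - d))
    return early - late
```

    For an m-sequence the off-peak autocorrelation is flat (-1 everywhere), so at perfect alignment the early and late correlations cancel exactly; any offset unbalances them and the sign of the output steers the tracking loop back toward lock.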

  5. Validation of the BEPLATE code

    SciTech Connect

    Giles, G.E.; Bullock, J.S.

    1997-11-01

    The electroforming simulation code BEPLATE (Boundary Element-PLATE) has been developed and validated for specific applications at Oak Ridge. New areas of application are opening up and more validations are being performed. This paper reports the validation experience of the BEPLATE code on two types of electroforms and describes some recent applications of the code.

  6. Authorship Attribution of Source Code

    ERIC Educational Resources Information Center

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…

  7. Strong independent association between obesity and essential hypertension.

    PubMed

    Movahed, M R; Lee, J Z; Lim, W Y; Hashemzadeh, M; Hashemzadeh, M

    2016-06-01

    Obesity and hypertension (HTN) are major risk factors for cardiovascular disease. The association between obesity and HTN has not been studied in large populations following adjustment for comorbidities. The goal of this study was to evaluate any association between obesity and HTN after adjusting for baseline characteristics. We used ICD-9 codes for obesity and HTN from the Nationwide Inpatient Sample (NIS) databases. Two randomly selected years, 1992 and 2002, were chosen from the databases as two independent samples. We used uni- and multivariable analysis to study any correlation between obesity and HTN. The 1992 database contained a total of 6,195,744 patients. HTN was present in 37.2% of patients with obesity versus 12% of the control group (OR: 4.36, CI 4.30-4.42, P < 0.001). The 2002 database contained a total of 7,153,982 patients. HTN was present in 50.7% of patients with obesity versus 25.6% of the control group (OR: 2.98, CI 2.96-3.00, P < 0.001). Using multivariable analysis adjusting for gender, hyperlipidaemia, age, smoking, type 2 diabetes and chronic renal failure, obesity remained correlated with HTN in both years (1992: OR 2.69, CI 2.67-2.72, P < 0.001; 2002: OR 2.98, CI 2.96-3.00, P < 0.001). The presence of obesity was found to be strongly and independently associated with HTN. The cause of this correlation is not known, warranting further investigation.
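    The unadjusted odds ratios follow directly from the quoted prevalences; a quick back-of-envelope check (using the rounded percentages from the abstract, so the results differ slightly from the published ORs):

```python
def odds_ratio(p_exposed, p_control):
    """Odds ratio comparing two prevalence proportions."""
    return (p_exposed / (1 - p_exposed)) / (p_control / (1 - p_control))

# 1992 sample: HTN in 37.2% of obese patients vs. 12% of controls
or_1992 = odds_ratio(0.372, 0.12)   # ~4.34, near the reported OR of 4.36
# 2002 sample: 50.7% vs. 25.6%
or_2002 = odds_ratio(0.507, 0.256)  # ~2.99, near the reported OR of 2.98
```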

  8. Ptolemy Coding Style

    DTIC Science & Technology

    2014-09-05

    …lisp module for GNU Emacs that has appropriate indenting rules. This file works well with Emacs under both Unix and Windows. • testsuite/ptspell is a…Unix. It is much more liberal than the commonly used "GPL" or "GNU Public License," which encumbers the software and derivative works with the…

  9. Structured error recovery for code-word-stabilized quantum codes

    SciTech Connect

    Li Yunfan; Dumer, Ilya; Grassl, Markus; Pryadko, Leonid P.

    2010-05-15

    Code-word-stabilized (CWS) codes are, in general, nonadditive quantum codes that can correct errors by an exhaustive search of different error patterns, similar to the way that we decode classical nonlinear codes. For an n-qubit quantum code correcting errors on up to t qubits, this brute-force approach consecutively tests different errors of weight t or less and employs a separate n-qubit measurement in each test. In this article, we suggest an error grouping technique that allows one to simultaneously test large groups of errors in a single measurement. This structured error recovery technique exponentially reduces the number of measurements by about 3^t times. While it still leaves exponentially many measurements for a generic CWS code, the technique is equivalent to syndrome-based recovery for the special case of additive CWS codes.
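    The scale of the brute-force search is easy to quantify: on n qubits there are sum over j ≤ t of C(n,j)·3^j nonidentity Pauli error patterns of weight at most t (each of the j affected qubits suffers X, Y, or Z), and the grouping technique cuts the measurement count by roughly a factor of 3^t. A small illustrative count, with hypothetical parameters not taken from the paper:

```python
from math import comb

def pauli_patterns(n, t):
    """Count nonidentity Pauli errors (X, Y, or Z per affected qubit)
    of weight 1 through t on n qubits."""
    return sum(comb(n, j) * 3**j for j in range(1, t + 1))

brute_force = pauli_patterns(10, 2)  # one measurement per pattern: 30 + 405 = 435
grouped = brute_force // 3**2        # grouping by ~3^t leaves about 48
```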

  10. Structured error recovery for code-word-stabilized quantum codes

    NASA Astrophysics Data System (ADS)

    Li, Yunfan; Dumer, Ilya; Grassl, Markus; Pryadko, Leonid P.

    2010-05-01

    Code-word-stabilized (CWS) codes are, in general, nonadditive quantum codes that can correct errors by an exhaustive search of different error patterns, similar to the way that we decode classical nonlinear codes. For an n-qubit quantum code correcting errors on up to t qubits, this brute-force approach consecutively tests different errors of weight t or less and employs a separate n-qubit measurement in each test. In this article, we suggest an error grouping technique that allows one to simultaneously test large groups of errors in a single measurement. This structured error recovery technique exponentially reduces the number of measurements by about 3t times. While it still leaves exponentially many measurements for a generic CWS code, the technique is equivalent to syndrome-based recovery for the special case of additive CWS codes.

  11. Low Density Parity Check Codes: Bandwidth Efficient Channel Coding

    NASA Technical Reports Server (NTRS)

    Fong, Wai; Lin, Shu; Maki, Gary; Yeh, Pen-Shu

    2003-01-01

    Low Density Parity Check (LDPC) Codes provide near-Shannon Capacity performance for NASA Missions. These codes have high coding rates R=0.82 and 0.875 with moderate code lengths, n=4096 and 8176. Their decoders have inherently parallel structures which allow for high-speed implementation. Two codes based on Euclidean Geometry (EG) were selected for flight ASIC implementation. These codes are cyclic and quasi-cyclic in nature and therefore have a simple encoder structure. This results in power and size benefits. These codes also have a large minimum distance, as much as d_min = 65, giving them powerful error-correcting capabilities and very low BER error floors. This paper will present development of the LDPC flight encoder and decoder, its applications and status.

  12. New quantum codes constructed from quaternary BCH codes

    NASA Astrophysics Data System (ADS)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-10-01

    In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each different code length. Thus, families of new QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes with their corresponding primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  13. Quantum Codes From Cyclic Codes Over The Ring R2

    NASA Astrophysics Data System (ADS)

    Altinel, Alev; Güzeltepe, Murat

    2016-10-01

    Let R2 denote the ring F2 + μF2 + υF2 + μυF2 + wF2 + μwF2 + υwF2 + μυwF2. In this study, we construct quantum codes from cyclic codes over the ring R2, for arbitrary length n, with the restrictions μ² = 0, υ² = 0, w² = 0, μυ = υμ, μw = wμ, υw = wυ and μ(υw) = (μυ)w. Also, we give a necessary and sufficient condition for cyclic codes over R2 that contain their dual. As a final point, we obtain the parameters of quantum error-correcting codes from cyclic codes over R2 and we give an example of quantum error-correcting codes from cyclic codes over R2.

  14. Noiseless coding for the magnetometer

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Lee, Jun-Ji

    1987-01-01

    Future unmanned space missions will continue to seek a full understanding of magnetic fields throughout the solar system. Severely constrained data rates during certain portions of these missions could limit the possible science return. This publication investigates the application of universal noiseless coding techniques to more efficiently represent magnetometer data without any loss in data integrity. Performance results indicated that compression factors of 2:1 to 6:1 can be expected. Feasibility for general deep space application was demonstrated by implementing a microprocessor breadboard coder/decoder using the Intel 8086 processor. The Comet Rendezvous Asteroid Flyby mission will incorporate these techniques in a buffer feedback, rate-controlled configuration. The characteristics of this system are discussed.
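    A building block of Rice-style universal noiseless coders is the simple split-sample (Rice) code, in which small values map to short codewords. A minimal sketch assuming a string representation and k ≥ 1, not the flight coder/decoder itself:

```python
def rice_encode(value, k):
    """Encode a nonnegative integer (k >= 1): unary-coded quotient,
    then k low-order remainder bits."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b")

def rice_decode(bits, k):
    """Invert rice_encode for a single codeword string."""
    q = bits.index("0")              # length of the unary prefix
    r = int(bits[q + 1:q + 1 + k], 2)
    return (q << k) | r
```

    Small prediction residuals dominate well-behaved magnetometer data, which is where compression factors like 2:1 to 6:1 come from; a universal coder additionally adapts the parameter k to the data statistics.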

  15. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

    Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  16. Genetic code for sine

    NASA Astrophysics Data System (ADS)

    Abdullah, Alyasa Gan; Wah, Yap Bee

    2015-02-01

    The computation of the approximate values of the trigonometric sines was discovered by Bhaskara I (c. 600-c. 680), a seventh century Indian mathematician, and is known as Bhaskara I's sine approximation formula. The formula is given in his treatise titled Mahabhaskariya. In the 14th century, Madhava of Sangamagrama, a Kerala mathematician-astronomer, constructed the table of trigonometric sines of various angles. Madhava's table gives the measure of angles in arcminutes, arcseconds and sixtieths of an arcsecond. The search for more accurate formulas led to the discovery of the power series expansion by Madhava of Sangamagrama (c. 1350-c. 1425), the founder of the Kerala school of astronomy and mathematics. In 1715, the Taylor series was introduced by Brook Taylor, an English mathematician. If the Taylor series is centered at zero, it is called a Maclaurin series, named after the Scottish mathematician Colin Maclaurin. Some of the important Maclaurin series expansions include trigonometric functions. This paper introduces the genetic code of the sine of an angle without using power series expansion. The genetic code using square root approach reveals the pattern in the signs (plus, minus) and sequence of numbers in the sine of an angle. The square root approach complements the Pythagoras method, provides a better understanding of calculating an angle and will be useful for teaching the concepts of angles in trigonometry.
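    Bhaskara I's approximation cited above has a compact closed form for angles in degrees (the paper's own square-root "genetic code" method is separate and not reproduced here); checking it against the power-series value:

```python
import math

def bhaskara_sine(x):
    """Bhaskara I's 7th-century rational approximation to sin(x),
    x in degrees, 0 <= x <= 180."""
    p = x * (180 - x)
    return 4 * p / (40500 - p)

# exact at 0, 30, 90, 150, and 180 degrees;
# within about 0.002 of sin(x) everywhere in between
```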

  17. FAST GYROSYNCHROTRON CODES

    SciTech Connect

    Fleishman, Gregory D.; Kuznetsov, Alexey A.

    2010-10-01

    Radiation produced by charged particles gyrating in a magnetic field is highly significant in the astrophysics context. Persistently increasing resolution of astrophysical observations calls for corresponding three-dimensional modeling of the radiation. However, available exact equations are prohibitively slow in computing a comprehensive table of high-resolution models required for many practical applications. To remedy this situation, we develop approximate gyrosynchrotron (GS) codes capable of quickly calculating the GS emission (in non-quantum regime) from both isotropic and anisotropic electron distributions in non-relativistic, mildly relativistic, and ultrarelativistic energy domains applicable throughout a broad range of source parameters including dense or tenuous plasmas and weak or strong magnetic fields. The computation time is reduced by several orders of magnitude compared with the exact GS algorithm. The new algorithm performance can gradually be adjusted to the user's needs depending on whether precision or computation speed is to be optimized for a given model. The codes are made available for users as a supplement to this paper.

  18. Reliability of ICD-10 external cause of death codes in the National Coroners Information System.

    PubMed

    Bugeja, Lyndal; Clapperton, Angela J; Killian, Jessica J; Stephan, Karen L; Ozanne-Smith, Joan

    2010-01-01

    Availability of ICD-10 cause of death codes in the National Coroners Information System (NCIS) strengthens its value as a public health surveillance tool. This study quantified the completeness of external cause ICD-10 codes in the NCIS for Victorian deaths (as assigned by the Australian Bureau of Statistics (ABS) in the yearly Cause of Death data). It also examined the concordance between external cause ICD-10 codes contained in the NCIS and a re-code of the same deaths conducted by an independent coder. Of 7,400 NCIS external cause deaths included in this study, 961 (13.0%) did not contain an ABS assigned ICD-10 code and 225 (3.0%) contained only a natural cause code. Where an ABS assigned external cause ICD-10 code was present (n=6,214), 4,397 (70.8%) matched exactly with the independently assigned ICD-10 code. Coding disparity primarily related to differences in assignment of intent and specificity. However, in a small number of deaths (n=49, 0.8%) there was coding disparity for both intent and external cause category. NCIS users should be aware of the limitations of relying only on ICD-10 codes contained within the NCIS for deaths prior to 2007 and consider using these in combination with the other NCIS data fields and code sets to ensure optimum case identification.
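    The completeness and concordance figures are simple proportions of the reported counts; reproducing the percentages quoted in the abstract:

```python
total_deaths = 7400    # NCIS external-cause deaths included in the study
no_abs_code = 961      # no ABS-assigned ICD-10 code
with_external = 6214   # deaths carrying an ABS external-cause ICD-10 code
exact_match = 4397     # agreed exactly with the independent re-code

pct_missing = round(100 * no_abs_code / total_deaths, 1)   # 13.0%
pct_concord = round(100 * exact_match / with_external, 1)  # 70.8%
```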

  19. The APS SASE FEL : modeling and code comparison.

    SciTech Connect

    Biedron, S. G.

    1999-04-20

    A self-amplified spontaneous emission (SASE) free-electron laser (FEL) is under construction at the Advanced Photon Source (APS). Five FEL simulation codes were used in the design phase: GENESIS, GINGER, MEDUSA, RON, and TDA3D. Initial comparisons between each of these independent formulations show good agreement for the parameters of the APS SASE FEL.

  20. The multigrid method for semi-implicit hydrodynamics codes

    SciTech Connect

    Brandt, A.; Dendy, J.E. Jr.; Ruppel, H.

    1980-03-01

    The multigrid method is applied to the pressure iteration in both Eulerian and Lagrangian codes, and computational examples of its efficiency are presented. In addition a general technique for speeding up the calculation of very low Mach number flows is presented. The latter feature is independent of the multigrid algorithm.
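    The flavor of such a multigrid pressure iteration can be sketched as a two-grid cycle for the 1-D Poisson problem; this is an illustrative stand-in (weighted Jacobi smoothing, full-weighting restriction, linear interpolation, direct coarse solve), not the production algorithm from the paper:

```python
def residual(u, f, h):
    # r = f - A u for the 1-D Laplacian with zero Dirichlet boundaries
    n = len(u)
    return [f[i] - (2 * u[i]
                    - (u[i - 1] if i > 0 else 0.0)
                    - (u[i + 1] if i < n - 1 else 0.0)) / h**2
            for i in range(n)]

def jacobi(u, f, h, sweeps, omega=2 / 3):
    # weighted Jacobi smoothing: u <- u + omega * D^-1 * r, with D = 2/h^2
    for _ in range(sweeps):
        r = residual(u, f, h)
        u = [u[i] + omega * r[i] * h**2 / 2 for i in range(len(u))]
    return u

def restrict(r):
    # full weighting onto the coarse grid (fine n = 2m+1 -> coarse m points)
    return [0.25 * r[2 * j] + 0.5 * r[2 * j + 1] + 0.25 * r[2 * j + 2]
            for j in range((len(r) - 1) // 2)]

def prolong(ec, n):
    # linear interpolation of the coarse correction back to the fine grid
    m, e = len(ec), [0.0] * n
    for j in range(m):
        e[2 * j + 1] = ec[j]
    for i in range(0, n, 2):
        left = ec[i // 2 - 1] if i >= 2 else 0.0
        right = ec[i // 2] if i // 2 < m else 0.0
        e[i] = 0.5 * (left + right)
    return e

def solve_direct(f, h):
    # Thomas algorithm for the coarse-grid tridiagonal system
    n, diag, off = len(f), 2 / h**2, -1 / h**2
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = off / diag, f[0] / diag
    for i in range(1, n):
        denom = diag - off * cp[i - 1]
        cp[i], dp[i] = off / denom, (f[i] - off * dp[i - 1]) / denom
    u = [0.0] * n
    u[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        u[i] = dp[i] - cp[i] * u[i + 1]
    return u

def two_grid_cycle(u, f, h):
    u = jacobi(u, f, h, 3)                                    # pre-smooth
    ec = solve_direct(restrict(residual(u, f, h)), 2 * h)     # coarse correction
    u = [ui + ei for ui, ei in zip(u, prolong(ec, len(u)))]
    return jacobi(u, f, h, 3)                                 # post-smooth
```

    Each cycle damps all error frequencies at once, so the residual shrinks by a roughly constant factor per cycle independent of grid size; that mesh-independent convergence is what makes the method attractive inside a semi-implicit pressure iteration.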

  1. Multigrid method for semi-implicit hydrodynamics codes

    SciTech Connect

    Brandt, A.; Dendy, J.E. Jr.; Ruppel, H.

    1980-03-01

    The multigrid method is applied to the pressure iteration in both Eulerian and Lagrangian codes, and computational examples of its efficiency are presented. In addition a general technique for speeding up the calculation of very low Mach number flows is presented. The latter feature is independent of the multigrid algorithm.

  2. Determinate-state convolutional codes

    NASA Technical Reports Server (NTRS)

    Collins, O.; Hizlan, M.

    1991-01-01

    A determinate-state convolutional code is formed from a conventional convolutional code by pruning away some of the possible state transitions in the decoding trellis. The type of staged power transfer used in determinate-state convolutional codes proves to be an extremely efficient way of enhancing the performance of a concatenated coding system. The decoder complexity is analyzed along with the free distances of these new codes, and extensive simulation results are provided on their performance at the low signal-to-noise ratios where a real communication system would operate. Concise, practical examples are provided.

  3. Independent Learning Models: A Comparison.

    ERIC Educational Resources Information Center

    Wickett, R. E. Y.

    Five models of independent learning are suitable for use in adult education programs. The common factor is a facilitator who works in some way with the student in the learning process. They display different characteristics, including the extent of independence in relation to content and/or process. Nondirective tutorial instruction and learning…

  4. Distributed Joint Source-Channel Coding in Wireless Sensor Networks

    PubMed Central

    Zhu, Xuqi; Liu, Yu; Zhang, Lin

    2009-01-01

    Given the energy limits of sensors and the wireless channel conditions in wireless sensor networks, there is an urgent need for a low-complexity coding method with a high compression ratio and noise-resistant features. This paper reviews the progress made in distributed joint source-channel coding, which can address this issue. The main existing deployments, from theory to practice, of distributed joint source-channel coding over independent channels, multiple-access channels and broadcast channels are introduced, respectively. We also present a practical scheme for compressing multiple correlated sources over independent channels. The simulation results demonstrate the desired efficiency. PMID:22408560

  5. Circular codes, symmetries and transformations.

    PubMed

    Fimmel, Elena; Giannerini, Simone; Gonzalez, Diego Luis; Strüngmann, Lutz

    2015-06-01

    Circular codes, putative remnants of primeval comma-free codes, have gained considerable attention in recent years. In fact, they represent a second kind of genetic code potentially involved in detecting and maintaining the normal reading frame in protein coding sequences. The discovery of a universal code across species has suggested many theoretical and experimental questions. However, there is a key aspect relating circular codes to symmetries and transformations that remains to a large extent unexplored. In this article we aim to address this issue by studying the symmetries and transformations that connect different circular codes. The main result is that the class of 216 C3 maximal self-complementary codes can be partitioned into 27 equivalence classes defined by a particular set of transformations. We show that such transformations can be put in a group-theoretic framework with an intuitive geometric interpretation. More general mathematical results about symmetry transformations, valid for any kind of circular code, are also presented. Our results pave the way to the study of the biological consequences of the mathematical structure behind circular codes and contribute to shedding light on the evolutionary steps that led to the observed symmetries of present codes.

  6. Making your code citable with the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; DuPrie, Kimberly; Schmidt, Judy; Berriman, G. Bruce; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.

    2016-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. With nearly 1,200 codes, it is the largest indexed resource for astronomy codes in existence. Established in 1999, it offers software authors a path to citation of their research codes even without publication of a paper describing the software, and offers scientists a way to find codes used in refereed publications, thus improving the transparency of the research. It also provides a method to quantify the impact of source codes in a fashion similar to the science metrics of journal articles. Citations using ASCL IDs are accepted by major astronomy journals and if formatted properly are tracked by ADS and other indexing services. The number of citations to ASCL entries increased sharply from 110 citations in January 2014 to 456 citations in September 2015. The percentage of code entries in ASCL that were cited at least once rose from 7.5% in January 2014 to 17.4% in September 2015. The ASCL's mid-2014 infrastructure upgrade added an easy entry submission form, more flexible browsing, search capabilities, and an RSS feeder for updates. A Changes/Additions form added this past fall lets authors submit links for papers that use their codes for addition to the ASCL entry even if those papers don't formally cite the codes, thus increasing the transparency of that research and capturing the value of their software to the community.

  7. Practices in Code Discoverability: Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; Teuben, P.; Nemiroff, R. J.; Shamir, L.

    2012-09-01

    Here we describe the Astrophysics Source Code Library (ASCL), which takes an active approach to sharing astrophysics source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL now has over 340 codes in it and continues to grow. In 2011, the ASCL added an average of 19 codes per month. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This paper provides the history and description of the ASCL. It lists the requirements for including codes, examines the advantages of the ASCL, and outlines some of its future plans.

  8. Fire investigation

    NASA Astrophysics Data System (ADS)

    Gomberg, A.

    Considerable progress was made on several fronts of fire investigation in the United States in recent years. Progress was made in increasing the quantity of fire investigation and reporting, through efforts to develop the National Fire Incident Reporting System. Improving the overall quality of fire investigation is the objective of efforts such as the Fire Investigation Handbook, which was developed and published by the National Bureau of Standards, and the upgrading and expanding of the "dictionary" of fire investigation and reporting, the NFPA 901, Uniform Coding for Fire Protection, system. The science of fire investigation was furthered also by new approaches to post-fire interviews being developed at the University of Washington, and by in-depth research into factors involved in several large loss fires, including the MGM Grand Hotel in Las Vegas. Finally, the use of special study fire investigations (in-depth investigations concentrating on specific fire problems) is producing new glimpses into the nature of the national fire problem. The status of efforts in each of these areas is briefly described.

  9. The metaethics of nursing codes of ethics and conduct.

    PubMed

    Snelling, Paul C

    2016-10-01

    Nursing codes of ethics and conduct are features of professional practice across the world, and in the UK, the regulator has recently consulted on and published a new code. Initially part of a professionalising agenda, nursing codes have recently come to represent a managerialist and disciplinary agenda, and nursing can no longer be regarded as a self-regulating profession. This paper argues that codes of ethics and codes of conduct are significantly different in form and function, similar to the difference between ethics and law in everyday life. Some codes successfully integrate these two functions within the same document, while others, principally the UK Code, conflate them, resulting in an ambiguous document unable to fulfil its functions effectively. The paper analyses the differences between ethical-codes and conduct-codes by discussing titles, authorship, level, scope for disagreement, consequences of transgression, language and finally and possibly most importantly agent-centeredness. It is argued that conduct-codes cannot require nurses to be compassionate because compassion involves an emotional response. The concept of kindness provides a plausible alternative for conduct-codes as it is possible to understand it solely in terms of acts. But if kindness is required in conduct-codes, investigation and possible censure follows from its absence. Using examples it is argued that there are at least five possible accounts of the absence of kindness. As well as being potentially problematic for disciplinary panels, difficulty in understanding the features of blameworthy absence of kindness may challenge UK nurses who, following a recently introduced revalidation procedure, are required to reflect on their practice in relation to The Code. It is concluded that closer attention to metaethical concerns by code writers will better support the functions of their issuing organisations.

  10. The best bits in an iris code.

    PubMed

    Hollingsworth, Karen P; Bowyer, Kevin W; Flynn, Patrick J

    2009-06-01

    Iris biometric systems apply filters to iris images to extract information about iris texture. Daugman's approach maps the filter output to a binary iris code. The fractional Hamming distance between two iris codes is computed and decisions about the identity of a person are based on the computed distance. The fractional Hamming distance weights all bits in an iris code equally. However, not all the bits in an iris code are equally useful. Our research is the first to present experiments documenting that some bits are more consistent than others. Different regions of the iris are compared to evaluate their relative consistency, and contrary to some previous research, we find that the middle bands of the iris are more consistent than the inner bands. The inconsistent-bit phenomenon is evident across genders and different filter types. Possible causes of inconsistencies, such as segmentation, alignment issues, and different filters are investigated. The inconsistencies are largely due to the coarse quantization of the phase response. Masking iris code bits corresponding to complex filter responses near the axes of the complex plane improves the separation between the match and nonmatch Hamming distance distributions.
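    The masked fractional Hamming distance described above reduces to a few lines; a toy sketch over short Python bit lists (real systems operate on 2048-bit codes with bitwise operations):

```python
def fractional_hamming(code_a, code_b, mask_a, mask_b):
    """Fraction of disagreeing bits among bits both masks mark as valid."""
    valid = [ma and mb for ma, mb in zip(mask_a, mask_b)]
    disagree = sum(v and (a != b) for a, b, v in zip(code_a, code_b, valid))
    return disagree / sum(valid)

a, b = [1, 0, 1, 1, 0, 0], [1, 1, 1, 0, 0, 1]
all_valid = [True] * 6
masked = [True, True, True, True, True, False]   # drop one inconsistent bit
d_plain = fractional_hamming(a, b, all_valid, all_valid)  # 3/6 = 0.5
d_masked = fractional_hamming(a, b, all_valid, masked)    # 2/5 = 0.4
```

    Masking a fragile bit lowers genuine-pair distances without inflating impostor scores, which is how excluding inconsistent bits sharpens the match/nonmatch separation reported in the paper.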

  11. An experimental investigation of clocking effects on turbine aerodynamics using a modern 3-D one and one-half stage high pressure turbine for code verification and flow model development

    NASA Astrophysics Data System (ADS)

    Haldeman, Charles Waldo, IV

    2003-10-01

    This research uses a modern 1 and 1/2 stage high-pressure (HP) turbine operating at the proper design corrected speed, pressure ratio, and gas to metal temperature ratio to generate a detailed data set containing aerodynamic, heat-transfer and aero-performance information. The data was generated using the Ohio State University Gas Turbine Laboratory Turbine Test Facility (TTF), which is a short-duration shock tunnel facility. The research program utilizes an uncooled turbine stage for which all three airfoils are heavily instrumented at multiple spans and on the HPV and LPV endwalls and HPB platform and tips. Heat-flux and pressure data are obtained using the traditional shock-tube and blowdown facility operational modes. Detailed examination shows that the aerodynamic (pressure) data obtained in the blowdown mode is the same as obtained in the shock-tube mode when the corrected conditions are matched. Various experimental conditions and configurations were investigated, including LPV clocking positions, off-design corrected speed conditions, pressure ratio changes, and Reynolds number changes. The main research for this dissertation is concentrated on the LPV clocking experiments, where the LPV was clocked relative to the HPV at several different passage locations and at different Reynolds numbers. Various methods were used to evaluate the effect of clocking on both the aeroperformance (efficiency) and aerodynamics (pressure loading) on the LPV, including time-resolved measurements, time-averaged measurements and stage performance measurements. A general improvement in overall efficiency of approximately 2% is demonstrated and could be observed using a variety of independent methods. Maximum efficiency is obtained when the time-average pressures are highest on the LPV, and the time-resolved data both in the time domain and frequency domain show the least amount of variation. The gain in aeroperformance is obtained by integrating over the entire airfoil as the three

  12. Generating Code Review Documentation for Auto-Generated Mission-Critical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2009-01-01

    Model-based design and automated code generation are increasingly used at NASA to produce actual flight code, particularly in the Guidance, Navigation, and Control domain. However, since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently auto-generated code still needs to be fully tested and certified. We have thus developed AUTOCERT, a generator-independent plug-in that supports the certification of auto-generated code. AUTOCERT takes a set of mission safety requirements, and formally verifies that the autogenerated code satisfies these requirements. It generates a natural language report that explains why and how the code complies with the specified requirements. The report is hyper-linked to both the program and the verification conditions and thus provides a high-level structured argument containing tracing information for use in code reviews.

  13. IGB grid: User's manual (A turbomachinery grid generation code)

    NASA Technical Reports Server (NTRS)

    Beach, T. A.; Hoffman, G.

    1992-01-01

    A grid generation code called IGB is presented for use in computational investigations of turbomachinery flowfields. It contains a combination of algebraic and elliptic techniques coded for use on an interactive graphics workstation. The instructions for use and a test case are included.

  14. Perceptually lossless coding of digital monochrome ultrasound images

    NASA Astrophysics Data System (ADS)

    Wu, David; Tan, Damian M.; Griffiths, Tania; Wu, Hong Ren

    2005-07-01

    A preliminary investigation of encoding monochrome ultrasound images with a novel perceptually lossless coder is presented. Based on the JPEG 2000 coding framework, the proposed coder employs a vision model to identify and remove visually insignificant/irrelevant information. Current simulation results have shown coding performance gains over the JPEG compliant LOCO lossless and JPEG 2000 lossless coders without any perceivable distortion.

  15. The Role of Code-Switching in Bilingual Creativity

    ERIC Educational Resources Information Center

    Kharkhurin, Anatoliy V.; Wei, Li

    2015-01-01

    This study further explores the theme of bilingual creativity with the present focus on code-switching. Specifically, it investigates whether code-switching practice has an impact on creativity. In line with the previous research, selective attention was proposed as a potential cognitive mechanism, which on the one hand would benefit from…

  16. Code-Mixing as a Bilingual Instructional Strategy

    ERIC Educational Resources Information Center

    Jiang, Yih-Lin Belinda; García, Georgia Earnest; Willis, Arlette Ingram

    2014-01-01

    This study investigated code-mixing practices, specifically the use of L2 (English) in an L1 (Chinese) class in a U.S. bilingual program. Our findings indicate that the code-mixing practices made and prompted by the teacher served five pedagogical functions: (a) to enhance students' bilingualism and bilingual learning, (b) to review and…

  17. 32 CFR 636.11 - Installation traffic codes.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 4 2014-07-01 2013-07-01 true Installation traffic codes. 636.11 Section 636.11... CRIMINAL INVESTIGATIONS MOTOR VEHICLE TRAFFIC SUPERVISION (SPECIFIC INSTALLATIONS) Fort Stewart, Georgia § 636.11 Installation traffic codes. In addition to the requirements in § 634.25(d) of this...

  18. 32 CFR 636.11 - Installation traffic codes.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 4 2013-07-01 2013-07-01 false Installation traffic codes. 636.11 Section 636... ENFORCEMENT AND CRIMINAL INVESTIGATIONS MOTOR VEHICLE TRAFFIC SUPERVISION (SPECIFIC INSTALLATIONS) Fort Stewart, Georgia § 636.11 Installation traffic codes. In addition to the requirements in § 634.25(d)...

  19. 32 CFR 636.11 - Installation traffic codes

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 4 2012-07-01 2011-07-01 true Installation traffic codes 636.11 Section 636.11... CRIMINAL INVESTIGATIONS MOTOR VEHICLE TRAFFIC SUPERVISION (SPECIFIC INSTALLATIONS) Fort Stewart, Georgia § 636.11 Installation traffic codes In addition to the requirements in § 634.25(d) of this...

  20. 32 CFR 636.11 - Installation traffic codes

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 4 2011-07-01 2011-07-01 false Installation traffic codes 636.11 Section 636.11... CRIMINAL INVESTIGATIONS MOTOR VEHICLE TRAFFIC SUPERVISION (SPECIFIC INSTALLATIONS) Fort Stewart, Georgia § 636.11 Installation traffic codes In addition to the requirements in § 634.25(d) of this...

  1. Electromagnetic particle simulation codes

    NASA Technical Reports Server (NTRS)

    Pritchett, P. L.

    1985-01-01

    Electromagnetic particle simulations solve the full set of Maxwell's equations. They thus include the effects of self-consistent electric and magnetic fields, magnetic induction, and electromagnetic radiation. The algorithms for an electromagnetic code which works directly with the electric and magnetic fields are described. The fields and current are separated into transverse and longitudinal components. The transverse E and B fields are integrated in time using a leapfrog scheme applied to the Fourier components. The particle pushing is performed via the relativistic Lorentz force equation for the particle momentum. As an example, simulation results are presented for the electron cyclotron maser instability which illustrate the importance of relativistic effects on the wave-particle resonance condition and on wave dispersion.
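
The particle-pushing step described above can be sketched as a single explicit update of the relativistic Lorentz force equation. This is an illustrative simplification in normalized units (a plain forward-Euler step), not the leapfrog field integration the paper actually uses:

```python
import math

def lorentz_push(p, E, B, q=1.0, m=1.0, c=1.0, dt=1e-3):
    """One explicit step of dp/dt = q*(E + v x B) for the relativistic
    momentum p = gamma*m*v. Illustrative only: the paper integrates the
    transverse fields with a leapfrog scheme in Fourier space, not shown here."""
    gamma = math.sqrt(1.0 + sum(pi * pi for pi in p) / (m * c) ** 2)
    v = [pi / (gamma * m) for pi in p]            # velocity from momentum
    vxB = [v[1] * B[2] - v[2] * B[1],             # v cross B
           v[2] * B[0] - v[0] * B[2],
           v[0] * B[1] - v[1] * B[0]]
    return [p[i] + dt * q * (E[i] + vxB[i]) for i in range(3)]
```

For a particle at rest in a pure electric field, one step simply adds q*E*dt to the momentum.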

  2. Telescope Adaptive Optics Code

    SciTech Connect

    Phillion, D.

    2005-07-28

    The Telescope AO Code has general adaptive optics capabilities plus specialized models for three telescopes with either adaptive optics or active optics systems. It has the capability to generate either single-layer or distributed Kolmogorov turbulence phase screens using the FFT. Missing low order spatial frequencies are added using the Karhunen-Loeve expansion. The phase structure curve is extremely close to the theoretical. Secondly, it has the capability to simulate an adaptive optics control system. The default parameters are those of the Keck II adaptive optics system. Thirdly, it has a general wave optics capability to model the science camera halo due to scintillation from atmospheric turbulence and the telescope optics. Although this capability was implemented for the Gemini telescopes, the only default parameter specific to the Gemini telescopes is the primary mirror diameter. Finally, it has a model for the LSST active optics alignment strategy. This last model is highly specific to the LSST.

  3. Code lock with microcircuit

    NASA Astrophysics Data System (ADS)

    Korobka, A.; May, I.

    1985-01-01

    A code lock with a microcircuit was invented which contains only a very few components. Two DD-triggers control the state of two identical transistors. When both transistors are turned on simultaneously the transistor VS1 is turned on so that the electromagnet YA1 pulls in the bolt and the door opens. This will happen only when a logic 1 appears at the inverted output of the first trigger and at the straight output of the second one. After the door is opened, a button on it resets the contactors to return both triggers to their original state. The electromagnet is designed to produce the necessary pull force and sufficient power when under rectified 127 V line voltage, with the neutral wire of the lock circuit always connected to the - terminal of the power supply.
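
The unlock condition stated in the abstract reduces to a single Boolean expression; this is a sketch of that logic only, not a simulation of the circuit:

```python
def door_opens(t1_state, t2_state):
    """The bolt is pulled only when the inverted output of trigger 1
    and the direct (straight) output of trigger 2 are both logic 1."""
    return (not t1_state) and t2_state
```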

  4. Peripheral coding of taste

    PubMed Central

    Liman, Emily R.; Zhang, Yali V.; Montell, Craig

    2014-01-01

    Five canonical tastes, bitter, sweet, umami (amino acid), salty and sour (acid) are detected by animals as diverse as fruit flies and humans, consistent with a near universal drive to consume fundamental nutrients and to avoid toxins or other harmful compounds. Surprisingly, despite this strong conservation of basic taste qualities between vertebrates and invertebrates, the receptors and signaling mechanisms that mediate taste in each are highly divergent. The identification over the last two decades of receptors and other molecules that mediate taste has led to stunning advances in our understanding of the basic mechanisms of transduction and coding of information by the gustatory systems of vertebrates and invertebrates. In this review, we discuss recent advances in taste research, mainly from the fly and mammalian systems, and we highlight principles that are common across species, despite stark differences in receptor types. PMID:24607224

  5. Accurate coding in sepsis: clinical significance and financial implications.

    PubMed

    Chin, Y T; Scattergood, N; Thornber, M; Thomas, S

    2016-09-01

    Sepsis is a major healthcare problem and leading cause of death worldwide. UK hospital mortality statistics and payments for patient episodes of care are calculated on clinical coding data. The accuracy of these data depends on the quality of coding. This study aimed to investigate whether patients with significant bacteraemia are coded for sepsis and to estimate the financial costs of miscoding. Of 54 patients over a one-month period with a significant bacteraemia, only 19% had been coded for sepsis. This is likely to lead to falsely high calculated hospital mortality. Furthermore, this resulted in an underpayment of £21,000 for one month alone.
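
As a rough illustration of the scale of the miscoding reported above (the per-episode figure is derived here for illustration and is not stated in the abstract):

```python
# Figures quoted in the abstract.
patients = 54            # significant bacteraemias in the month studied
coded_fraction = 0.19    # proportion actually coded for sepsis
underpayment = 21_000    # GBP underpaid for that month

miscoded = round(patients * (1 - coded_fraction))   # episodes missing the sepsis code
per_episode = underpayment / miscoded               # rough GBP lost per missed code
```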

  6. Surface acoustic wave coding for orthogonal frequency coded devices

    NASA Technical Reports Server (NTRS)

    Malocha, Donald (Inventor); Kozlovski, Nikolai (Inventor)

    2011-01-01

    Methods and systems for coding SAW OFC devices to mitigate code collisions in a wireless multi-tag system. Each device produces plural stepped frequencies as an OFC signal with a chip offset delay to increase code diversity. A method for assigning a different OFC to each device includes using a matrix based on the number of OFCs needed and the number of chips per code, populating each matrix cell with an OFC chip, and assigning the codes from the matrix to the devices. The asynchronous passive multi-tag system includes plural surface acoustic wave devices each producing a different OFC signal having the same number of chips and including a chip offset time delay, an algorithm for assigning OFCs to each device, and a transceiver to transmit an interrogation signal and receive OFC signals in response with minimal code collisions during transmission.
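
The matrix-based assignment can be sketched as follows. The cyclic-shift fill is a hypothetical scheme in the spirit of the patent (each code uses every frequency once, in a distinct order); the patent's actual population algorithm is not reproduced here:

```python
def assign_ofcs(n_codes, n_chips):
    """Populate an n_codes x n_chips matrix with chip (frequency) indices,
    one row per device, so each code uses every frequency exactly once in a
    distinct order. Illustrative stand-in for the patented assignment."""
    return [[(row + col) % n_chips for col in range(n_chips)]
            for row in range(n_codes)]
```

A device is then assigned row `r` of the matrix as its chip sequence.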

  7. Improved lossless intra coding for next generation video coding

    NASA Astrophysics Data System (ADS)

    Vanam, Rahul; He, Yuwen; Ye, Yan

    2016-09-01

    Recently, there have been efforts by the ITU-T VCEG and ISO/IEC MPEG to further improve the compression performance of the High Efficiency Video Coding (HEVC) standard for developing a potential next generation video coding standard. The exploratory codec software of this potential standard includes new coding tools for inter and intra coding. In this paper, we present a new intra prediction mode for lossless intra coding. Our new intra mode derives a prediction filter for each input pixel using its neighboring reconstructed pixels, and applies this filter to the nearest neighboring reconstructed pixels to generate a prediction pixel. The proposed intra mode is demonstrated to improve the performance of the exploratory software for lossless intra coding, yielding a maximum and average bitrate savings of 4.4% and 2.11%, respectively.
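
The idea of deriving a predictor from reconstructed neighbours can be sketched with a two-tap least-squares fit. The two-tap form and the training set are illustrative assumptions; the paper derives its filter per pixel from a local causal window, which is not reproduced here:

```python
def fit_predictor(samples):
    """Least-squares fit of a two-tap filter p ~ a*left + b*top from
    (left, top, pixel) training triples, solving the 2x2 normal equations."""
    Sll = Stt = Slt = Slp = Stp = 0.0
    for l, t, p in samples:
        Sll += l * l; Stt += t * t; Slt += l * t
        Slp += l * p; Stp += t * p
    det = Sll * Stt - Slt * Slt        # assumes non-degenerate training data
    a = (Slp * Stt - Stp * Slt) / det
    b = (Stp * Sll - Slp * Slt) / det
    return a, b
```

The fitted filter is then applied to the nearest reconstructed neighbours to form the prediction.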

  8. Mean-based neural coding of voices.

    PubMed

    Andics, Attila; McQueen, James M; Petersson, Karl Magnus

    2013-10-01

    The social significance of recognizing the person who talks to us is obvious, but the neural mechanisms that mediate talker identification are unclear. Regions along the bilateral superior temporal sulcus (STS) and the inferior frontal cortex (IFC) of the human brain are selective for voices, and they are sensitive to rapid voice changes. Although it has been proposed that voice recognition is supported by prototype-centered voice representations, the involvement of these category-selective cortical regions in the neural coding of such "mean voices" has not previously been demonstrated. Using fMRI in combination with a voice identity learning paradigm, we show that voice-selective regions are involved in the mean-based coding of voice identities. Voice typicality is encoded on a supra-individual level in the right STS along a stimulus-dependent, identity-independent (i.e., voice-acoustic) dimension, and on an intra-individual level in the right IFC along a stimulus-independent, identity-dependent (i.e., voice identity) dimension. Voice recognition therefore entails at least two anatomically separable stages, each characterized by neural mechanisms that reference the central tendencies of voice categories.

  9. Multiple Independent File Parallel I/O with HDF5

    SciTech Connect

    Miller, M. C.

    2016-07-13

    The HDF5 library has supported the I/O requirements of HPC codes at Lawrence Livermore National Labs (LLNL) since the late 90’s. In particular, HDF5 used in the Multiple Independent File (MIF) parallel I/O paradigm has supported LLNL codes’ scalable I/O requirements and has recently been gainfully used at scales as large as O(10^6) parallel tasks.
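
In the MIF paradigm, tasks are divided into groups, each group writing sequentially (baton-passing) to its own file while groups proceed concurrently. The round-robin grouping below is a hypothetical illustration of that bookkeeping, not LLNL's implementation:

```python
def mif_schedule(n_tasks, n_files):
    """Map each rank to (file group, baton turn) for MIF-style I/O:
    ranks in a group take turns writing to the group's file, while the
    n_files groups write concurrently."""
    schedule = {}
    for rank in range(n_tasks):
        group = rank % n_files    # which file this rank writes to
        turn = rank // n_files    # its position in that file's baton order
        schedule[rank] = (group, turn)
    return schedule
```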

  10. CAFE: A NEW RELATIVISTIC MHD CODE

    SciTech Connect

    Lora-Clavijo, F. D.; Cruz-Osorio, A.; Guzmán, F. S. E-mail: aosorio@astro.unam.mx

    2015-06-22

    We introduce CAFE, a new independent code designed to solve the equations of relativistic ideal magnetohydrodynamics (RMHD) in three dimensions. We present the standard tests for an RMHD code and for the relativistic hydrodynamics regime because we have not reported them before. The tests include the one-dimensional Riemann problems related to blast waves, head-on collisions of streams, and states with transverse velocities, with and without magnetic field, which is aligned or transverse, constant or discontinuous across the initial discontinuity. Among the two-dimensional (2D) and 3D tests without magnetic field, we include the 2D Riemann problem, a one-dimensional shock tube along a diagonal, the high-speed Emery wind tunnel, the Kelvin–Helmholtz (KH) instability, a set of jets, and a 3D spherical blast wave, whereas in the presence of a magnetic field we show the magnetic rotor, the cylindrical explosion, a case of Kelvin–Helmholtz instability, and a 3D magnetic field advection loop. The code uses high-resolution shock-capturing methods, and we present the error analysis for a combination that uses the Harten, Lax, van Leer, and Einfeldt (HLLE) flux formula combined with a linear, piecewise parabolic method and fifth-order weighted essentially nonoscillatory reconstructors. We use the flux-constrained transport and the divergence cleaning methods to control the divergence-free magnetic field constraint.
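
The HLLE flux formula mentioned above has a standard closed form; here is a per-component sketch (wave-speed estimates supplied by the caller), not CAFE's implementation:

```python
def hlle_flux(UL, UR, FL, FR, sL, sR):
    """Harten-Lax-van Leer-Einfeldt (HLLE) approximate Riemann flux for one
    conserved variable, given left/right states U, fluxes F, and signal
    speed estimates sL <= sR."""
    if sL >= 0.0:
        return FL        # all waves move right: upwind on the left state
    if sR <= 0.0:
        return FR        # all waves move left: upwind on the right state
    return (sR * FL - sL * FR + sL * sR * (UR - UL)) / (sR - sL)
```

Note the consistency property: for identical left and right states the formula reduces to the physical flux.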

  11. CAFE: A New Relativistic MHD Code

    NASA Astrophysics Data System (ADS)

    Lora-Clavijo, F. D.; Cruz-Osorio, A.; Guzmán, F. S.

    2015-06-01

    We introduce CAFE, a new independent code designed to solve the equations of relativistic ideal magnetohydrodynamics (RMHD) in three dimensions. We present the standard tests for an RMHD code and for the relativistic hydrodynamics regime because we have not reported them before. The tests include the one-dimensional Riemann problems related to blast waves, head-on collisions of streams, and states with transverse velocities, with and without magnetic field, which is aligned or transverse, constant or discontinuous across the initial discontinuity. Among the two-dimensional (2D) and 3D tests without magnetic field, we include the 2D Riemann problem, a one-dimensional shock tube along a diagonal, the high-speed Emery wind tunnel, the Kelvin-Helmholtz (KH) instability, a set of jets, and a 3D spherical blast wave, whereas in the presence of a magnetic field we show the magnetic rotor, the cylindrical explosion, a case of Kelvin-Helmholtz instability, and a 3D magnetic field advection loop. The code uses high-resolution shock-capturing methods, and we present the error analysis for a combination that uses the Harten, Lax, van Leer, and Einfeldt (HLLE) flux formula combined with a linear, piecewise parabolic method and fifth-order weighted essentially nonoscillatory reconstructors. We use the flux-constrained transport and the divergence cleaning methods to control the divergence-free magnetic field constraint.

  12. Sequence independent amplification of DNA

    DOEpatents

    Bohlander, Stefan K.

    1998-01-01

    The present invention is a rapid sequence-independent amplification procedure (SIA). Even minute amounts of DNA from various sources can be amplified independent of any sequence requirements of the DNA or any a priori knowledge of any sequence characteristics of the DNA to be amplified. This method allows, for example, the sequence independent amplification of microdissected chromosomal material and the reliable construction of high quality fluorescent in situ hybridization (FISH) probes from YACs or from other sources. These probes can be used to localize YACs on metaphase chromosomes but also--with high efficiency--in interphase nuclei.

  13. Sequence independent amplification of DNA

    DOEpatents

    Bohlander, S.K.

    1998-03-24

    The present invention is a rapid sequence-independent amplification procedure (SIA). Even minute amounts of DNA from various sources can be amplified independent of any sequence requirements of the DNA or any a priori knowledge of any sequence characteristics of the DNA to be amplified. This method allows, for example, the sequence independent amplification of microdissected chromosomal material and the reliable construction of high quality fluorescent in situ hybridization (FISH) probes from YACs or from other sources. These probes can be used to localize YACs on metaphase chromosomes but also--with high efficiency--in interphase nuclei. 25 figs.

  14. Adaptive EZW coding using a rate-distortion criterion

    NASA Astrophysics Data System (ADS)

    Yin, Che-Yi

    2001-07-01

    This work presents a new method that improves on the EZW image coding algorithm. The standard EZW image coder uses a uniform quantizer with a threshold (deadzone) that is identical in all subbands. The quantization step sizes are not optimized under the rate-distortion sense. We modify the EZW by applying the Lagrange multiplier to search for the best step size for each subband and allocate the bit rate for each subband accordingly. Then we implement the adaptive EZW codec to code the wavelet coefficients. Two coding environments, independent and dependent, are considered for the optimization process. The proposed image coder retains all the good features of the EZW, namely, embedded coding, progressive transmission, order of the important bits, and enhances it through the rate-distortion optimization with respect to the step sizes.
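
The per-subband step-size search can be sketched as minimizing the Lagrangian cost D + lambda*R over a set of candidate steps. The rate term here is approximated by the empirical entropy of the quantized indices, an assumption made for illustration; the paper's actual rate measure is the codec's bit count:

```python
import math
from collections import Counter

def best_step(coeffs, lam, candidates):
    """Pick the quantizer step for one subband minimizing D + lambda*R.
    D is the mean squared quantization error; R is approximated by the
    empirical entropy (bits/symbol) of the quantized indices."""
    def cost(step):
        q = [round(c / step) for c in coeffs]
        dist = sum((c - qi * step) ** 2 for c, qi in zip(coeffs, q)) / len(coeffs)
        n = len(q)
        rate = -sum(f / n * math.log2(f / n) for f in Counter(q).values())
        return dist + lam * rate
    return min(candidates, key=cost)
```

With lambda = 0 the search favours the finest step (pure distortion); as lambda grows, coarser steps with cheaper symbol statistics win.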

  15. Role of precise spike timing in coding of dynamic vibrissa stimuli in somatosensory thalamus.

    PubMed

    Montemurro, Marcelo A; Panzeri, Stefano; Maravall, Miguel; Alenda, Andrea; Bale, Michael R; Brambilla, Marco; Petersen, Rasmus S

    2007-10-01

    Rats discriminate texture by whisking their vibrissae across the surfaces of objects. This process induces corresponding vibrissa vibrations, which must be accurately represented by neurons in the somatosensory pathway. In this study, we investigated the neural code for vibrissa motion in the ventroposterior medial (VPm) nucleus of the thalamus by single-unit recording. We found that neurons conveyed a great deal of information (up to 77.9 bits/s) about vibrissa dynamics. The key was precise spike timing, which typically varied by less than a millisecond from trial to trial. The neural code was sparse, the average spike being remarkably informative (5.8 bits/spike). This implies that as few as four VPm spikes, coding independent information, might reliably differentiate between 10^6 textures. To probe the mechanism of information transmission, we compared the role of time-varying firing rate to that of temporally correlated spike patterns in two ways: 93.9% of the information encoded by a neuron could be accounted for by a hypothetical neuron with the same time-dependent firing rate but no correlations between spikes; moreover, ≥93.4% of the information in the spike trains could be decoded even if temporal correlations were ignored. Taken together, these results suggest that the essence of the VPm code for vibrissa motion is firing rate modulation on a submillisecond timescale. The significance of such a code may be that it enables a small number of neurons, firing only few spikes, to convey distinctions between very many different textures to the barrel cortex.
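
The "four spikes, a million textures" claim follows directly from the per-spike information figure, assuming independent spikes:

```python
# Figures from the study: 5.8 bits per spike on average, four spikes
# carrying independent information.
bits_per_spike = 5.8
n_spikes = 4

total_bits = n_spikes * bits_per_spike   # 23.2 bits if spikes are independent
distinguishable = 2 ** total_bits        # upper bound on discriminable stimuli
```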

  16. ALEGRA -- code validation: Experiments and simulations

    SciTech Connect

    Chhabildas, L.C.; Konrad, C.H.; Mosher, D.A.; Reinhart, W.D; Duggins, B.D.; Rodeman, R.; Trucano, T.G.; Summers, R.M.; Peery, J.S.

    1998-03-16

    In this study, the authors are providing an experimental test bed for validating features of the ALEGRA code over a broad range of strain rates with overlapping diagnostics that encompass the multiple responses. A unique feature of the Arbitrary Lagrangian Eulerian Grid for Research Applications (ALEGRA) code is that it allows simultaneous computational treatment, within one code, of a wide range of strain-rates varying from hydrodynamic to structural conditions. This range encompasses strain rates characteristic of shock-wave propagation (10^7/s) and those characteristic of structural response (10^2/s). Most previous code validation experimental studies, however, have been restricted to simulating or investigating a single strain-rate regime. What is new and different in this investigation is that the authors have performed well-instrumented experiments which capture features relevant to both hydrodynamic and structural response in a single experiment. Aluminum was chosen for use in this study because it is a well characterized material--its EOS and constitutive material properties are well defined over a wide range of loading rates. The current experiments span strain rate regimes of over 10^7/s to less than 10^2/s in a single experiment. The input conditions are extremely well defined. Velocity interferometers are used to record the high strain-rate response, while low strain rate data were collected using strain gauges.

  17. Independent waves in complex source point theory.

    PubMed

    Seshadri, S R

    2007-11-01

    The full-wave generalization of the scalar Gaussian paraxial beam is determined by an analytical continuation of the field of a point source for the Helmholtz equation. The regions of validity of the analytically continued fields are investigated for the outgoing and the incoming waves. The two independent wave functions valid for the two half-spaces separating the secondary source plane are deduced.

  18. Independent Schools: Landscape and Learnings.

    ERIC Educational Resources Information Center

    Oates, William A.

    1981-01-01

    Examines American independent schools (parochial, southern segregated, and private institutions) in terms of their funding, expenditures, changing enrollment patterns, teacher-student ratios, and societal functions. Journal available from Daedalus Subscription Department, 1172 Commonwealth Ave., Boston, MA 02132. (AM)

  19. Technology for Independent Living: Sourcebook.

    ERIC Educational Resources Information Center

    Enders, Alexandra, Ed.

    This sourcebook provides information for the practical implementation of independent living technology in the everyday rehabilitation process. "Information Services and Resources" lists databases, clearinghouses, networks, research and development programs, toll-free telephone numbers, consumer protection caveats, selected publications, and…

  20. International Code Assessment and Applications Program: Summary of code assessment studies concerning RELAP5/MOD2, RELAP5/MOD3, and TRAC-B. International Agreement Report

    SciTech Connect

    Schultz, R.R.

    1993-12-01

    Members of the International Code Assessment Program (ICAP) have assessed the US Nuclear Regulatory Commission (USNRC) advanced thermal-hydraulic codes over the past few years in a concerted effort to identify deficiencies, to define user guidelines, and to determine the state of each code. The results of sixty-two code assessment reviews, conducted at INEL, are summarized. Code deficiencies are discussed and user recommended nodalizations investigated during the course of conducting the assessment studies and reviews are listed. All the work that is summarized was done using the RELAP5/MOD2, RELAP5/MOD3, and TRAC-B codes.

  1. Preliminary Results from the Application of Automated Adjoint Code Generation to CFL3D

    NASA Technical Reports Server (NTRS)

    Carle, Alan; Fagan, Mike; Green, Lawrence L.

    1998-01-01

    This report describes preliminary results obtained using an automated adjoint code generator for Fortran to augment a widely-used computational fluid dynamics flow solver to compute derivatives. These preliminary results with this augmented code suggest that, even in its infancy, the automated adjoint code generator can accurately and efficiently deliver derivatives for use in transonic Euler-based aerodynamic shape optimization problems with hundreds to thousands of independent design variables.

  2. Experimental interference of independent photons.

    PubMed

    Kaltenbaek, Rainer; Blauensteiner, Bibiane; Zukowski, Marek; Aspelmeyer, Markus; Zeilinger, Anton

    2006-06-23

    Interference of photons emerging from independent sources is essential for modern quantum-information processing schemes, above all quantum repeaters and linear-optics quantum computers. We report an observation of nonclassical interference of two single photons originating from two independent, separated sources, which were actively synchronized with a rms timing jitter of 260 fs. The resulting (two-photon) interference visibility was (83+/-4)%.

  3. Independence test for sparse data

    NASA Astrophysics Data System (ADS)

    García, J. E.; González-López, V. A.

    2016-06-01

    In this paper a new non-parametric independence test is presented. García and González-López (2014) [1] introduced the LIS test for the hypothesis of independence between two continuous random variables; the test proposed in this work is a generalization of the LIS test. The new test does not require the assumption of continuity for the random variables. The test is applied to two datasets and is also compared with Pearson's Chi-squared test.
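
For comparison, the classical Pearson chi-squared statistic that the new test is benchmarked against can be computed as follows (a textbook sketch on a contingency table; the LIS-style test itself is not reproduced here):

```python
def chi_squared_stat(table):
    """Pearson's chi-squared statistic for a contingency table, summing
    (observed - expected)^2 / expected over all cells, with expected counts
    from the product of row and column marginals."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, r in enumerate(rows):
        for j, c in enumerate(cols):
            expected = r * c / total
            stat += (table[i][j] - expected) ** 2 / expected
    return stat
```

A perfectly independent table yields a statistic of zero.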

  4. QR code for medical information uses.

    PubMed

    Fontelo, Paul; Liu, Fang; Ducut, Erick G

    2008-11-06

    We developed QR code online tools, simulated and tested QR code applications for medical information uses including scanning QR code labels, URLs and authentication. Our results show possible applications for QR code in medicine.

  5. Code Speed Measuring for VC++

    DTIC Science & Technology

    2015-10-01

    Technical Report ARWSE-TR-14025, "Code Speed Measuring for VC++" (Tom Nealis). It is often important to know how fast a snippet of code executes. This information allows the coder to make important decisions.

  6. Explosive Formulation Code Naming SOP

    SciTech Connect

    Martz, H. E.

    2014-09-19

    The purpose of this SOP is to provide a procedure for giving individual HME formulations code names. A code name for an individual HME formulation consists of an explosive family code, given by the classified guide, followed by a dash, -, and a number. If the formulation requires preparation such as packing or aging, these add additional groups of symbols to the X-ray specimen name.
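
The documented shape (family code, dash, number) is easy to validate mechanically. The all-caps alphabetic family code is an assumption here, since the actual family codes are given by a classified guide, and "AB" is a hypothetical placeholder:

```python
import re

# Assumed shape: an alphabetic family code, a dash, then a number.
NAME_RE = re.compile(r"^[A-Z]+-\d+$")

def is_valid_code_name(name):
    """Check that a formulation code name matches the SOP's documented
    family-dash-number shape (preparation suffixes not modeled)."""
    return bool(NAME_RE.match(name))
```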

  7. Bar-Code-Scribing Tool

    NASA Technical Reports Server (NTRS)

    Badinger, Michael A.; Drouant, George J.

    1991-01-01

    Proposed hand-held tool applies indelible bar code to small parts. Possible to identify parts for management of inventory without tags or labels. Microprocessor supplies bar-code data to impact-printer-like device. Device drives replaceable scribe, which cuts bar code on surface of part. Used to mark serially controlled parts for military and aerospace equipment. Also adapts for discrete marking of bulk items used in food and pharmaceutical processing.

  8. Upgrades to NRLMOL code

    NASA Astrophysics Data System (ADS)

    Basurto, Luis

    This project consists of upgrades to the massively parallel NRLMOL electronic structure code to enhance its performance and flexibility by: a) utilizing dynamically allocated arrays, b) executing in parallel sections of the program that were previously executed serially, and c) exploring simultaneous concurrent executions of the program through the existing MPI environment, thus enabling the simulation of larger systems than is currently possible. Also developed was a graphical user interface that allows less experienced users to begin electronic structure calculations by aiding them in configuring input files and by providing graphical tools for displaying and analyzing results. Additionally, a computational toolkit that can take advantage of large supercomputers and use various levels of approximation for atomic interactions was developed to search for stable atomic clusters and predict novel stable endohedral fullerenes. As an application of this toolkit, a search was conducted for stable isomers of the Sc3N C80 fullerene. In this search, about 1.2 million isomers of C80 were optimized in various charged states at the PM6 level. Subsequently, using the selected optimized isomers of C80 in various charged states, about 10,000 isomers of Sc3N C80 were constructed and optimized with the semi-empirical PM6 quantum chemical method. A few of the lowest isomers of Sc3N C80 were then optimized at the DFT level. The calculation confirms the lowest 3 isomers previously reported in the literature, but 4 new isomers are found within the lowest 10 isomers. Using the upgraded NRLMOL code, a study was done of the electronic structure of a multichromophoric molecular complex containing two each of a borondipyrromethane dye, Zn-tetraphenyl-porphyrin, bisphenyl anthracene, and a fullerene. A systematic examination of the effect of

  9. The FLUKA Code: an Overview

    SciTech Connect

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M.V.; Lantz, M.; Liotta, M.; Mairani, A.; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P.R.; /Milan U. /INFN, Milan /Pavia U. /INFN, Pavia /CERN /Siegen U. /Houston U. /SLAC /Frascati /NASA, Houston /ENEA, Frascati

    2005-11-09

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including but not only, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with a particular emphasis to the hadronic and nuclear sector.

  10. High Order Modulation Protograph Codes

    NASA Technical Reports Server (NTRS)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods for designing protograph-based bit-interleaved coded modulation that is general and applies to any modulation. The general coding framework can support not only multiple rates but also adaptive modulation. The method is a two-stage lifting approach. In the first stage, an original protograph is lifted to a slightly larger intermediate protograph. The intermediate protograph is then lifted via a circulant matrix to the expected codeword length to form a protograph-based low-density parity-check code.
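
The circulant second-stage lifting can be sketched as replacing each nonzero entry of the base (proto)matrix with a Z x Z cyclically shifted identity block. The shift values below are arbitrary illustrations, not the patent's optimized choices:

```python
def lift_protograph(base, Z, shifts):
    """Lift a binary base matrix by factor Z: each nonzero entry becomes a
    Z x Z circulant permutation (identity shifted by shifts[i][j]), each
    zero entry becomes the Z x Z zero block."""
    m, n = len(base), len(base[0])
    H = [[0] * (n * Z) for _ in range(m * Z)]
    for i in range(m):
        for j in range(n):
            if base[i][j]:
                s = shifts[i][j] % Z
                for k in range(Z):
                    H[i * Z + k][j * Z + (k + s) % Z] = 1
    return H
```

Each row of the lifted matrix keeps the row weight of its base row, which is what preserves the protograph's degree structure.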

  11. Astrophysics Source Code Library Enhancements

    NASA Astrophysics Data System (ADS)

    Hanisch, R. J.; Allen, A.; Berriman, G. B.; DuPrie, K.; Mink, J.; Nemiroff, R. J.; Schmidt, J.; Shamir, L.; Shortridge, K.; Taylor, M.; Teuben, P. J.; Wallin, J.

    2015-09-01

    The Astrophysics Source Code Library (ASCL)1 is a free online registry of codes used in astronomy research; it currently contains over 900 codes and is indexed by ADS. The ASCL has recently moved a new infrastructure into production. The new site provides a true database for the code entries and integrates the WordPress news and information pages and the discussion forum into one site. Previous capabilities are retained and permalinks to ascl.net continue to work. This improvement offers more functionality and flexibility than the previous site, is easier to maintain, and offers new possibilities for collaboration. This paper covers these recent changes to the ASCL.

  12. The FLUKA Code: An Overview

    NASA Technical Reports Server (NTRS)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.; Lantz, M.; Liotta, M.; Mairani, A.; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P. R.; Scannicchio, D.; Trovati, S.; Villari, R.; Wilson, T.

    2006-01-01

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including but not only, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with a particular emphasis to the hadronic and nuclear sector.

  13. Golay and other box codes

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1992-01-01

The (24,12;8) extended Golay Code can be generated as a 6 x 4 binary matrix from the (15,11;3) BCH-Hamming Code, represented as a 5 x 3 matrix, by adding a row and a column, both of odd or even parity. The odd-parity case provides the additional 12th dimension. Furthermore, any three columns and five rows of the 6 x 4 Golay form a BCH-Hamming (15,11;3) Code. Similarly, a (80,58;8) code can be generated as a 10 x 8 binary matrix from the (63,57;3) BCH-Hamming Code, represented as a 9 x 7 matrix, by adding a row and a column, both of odd or even parity. Furthermore, any seven columns along with the top nine rows form a BCH-Hamming (63,57;3) Code. A (80,40;16) 10 x 8 matrix binary code with weight structure identical to the extended (80,40;16) Quadratic Residue Code is generated from a (63,39;7) binary cyclic code, represented as a 9 x 7 matrix, by adding a row and a column, both of odd or even parity.
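
    The row-and-column parity extension the abstract describes can be sketched generically. The helper below is illustrative only (even-parity case, arbitrary matrix sizes); it is not the paper's Golay construction itself.

    ```python
    # Sketch of the parity-extension step: append an even-parity bit to each
    # row of a binary matrix, then append an even-parity row, so every row
    # and every column of the extended matrix has even parity.

    def extend_with_parity(matrix):
        """Append an even-parity column, then an even-parity row."""
        extended = [row + [sum(row) % 2] for row in matrix]
        parity_row = [sum(col) % 2 for col in zip(*extended)]
        return extended + [parity_row]

    codeword = [[1, 0, 1],
                [0, 1, 1],
                [1, 1, 0]]
    ext = extend_with_parity(codeword)
    # Every row and every column of the extended matrix now has even parity.
    assert all(sum(row) % 2 == 0 for row in ext)
    assert all(sum(col) % 2 == 0 for col in zip(*ext))
    ```

    Extending a 5 x 3 matrix this way yields the 6 x 4 shape quoted in the abstract.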

  15. New Codes for Ambient Seismic Noise Analysis

    NASA Astrophysics Data System (ADS)

    Duret, F.; Mooney, W. D.; Detweiler, S.

    2007-12-01

In order to determine a velocity model of the crust, scientists generally use earthquakes recorded by seismic stations. However, earthquakes do not occur continuously, and most are too weak to be useful. When no event is recorded, a waveform is generally considered to be noise. This noise, however, is not useless and carries a wealth of information; thus, ambient seismic noise analysis is an inverse method of investigating the Earth's interior. Until recently, this technique was quite difficult to apply, as it requires significant computing capacity. In early 2007, however, a team led by Gregory Benson and Mike Ritzwoller from UC Boulder published a paper describing a new method for extracting group and phase velocities from those waveforms. The analysis, which consists of recovering Green's functions between pairs of stations, is composed of four steps: 1) single-station data preparation, 2) cross-correlation and stacking, 3) quality control and data selection, and 4) dispersion measurements. At the USGS, we developed a set of ready-to-use computing codes for analyzing waveforms to run the ambient noise analysis of Benson et al. (2007). Our main contribution to the analysis technique was to fully automate the process. The computation codes were written in Fortran 90 and the automation scripts were written in Perl; some operations were run with SAC. Our choice of programming languages makes it possible to adapt our codes to the major platforms: the codes were developed under Linux but are meant to be adapted to Mac OS X and Windows. The codes have been tested on Southern California data and our results compare nicely with those from the UC Boulder team. Next, we plan to apply our codes to Indonesian data, so that we might take advantage of newly upgraded seismic stations in that region.
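
    Step 2 of the workflow (cross-correlation and stacking) can be sketched as follows. This is an illustrative toy, not the USGS Fortran/Perl/SAC codes: two synthetic "daily" records are cross-correlated and the daily correlations are linearly stacked, so the stack peaks at the inter-station travel-time lag.

    ```python
    # Toy sketch of cross-correlation and stacking between a station pair.

    def cross_correlate(a, b, max_lag):
        """Cross-correlation of equal-length records for lags -max_lag..max_lag."""
        n = len(a)
        out = []
        for lag in range(-max_lag, max_lag + 1):
            s = 0.0
            for i in range(n):
                j = i + lag
                if 0 <= j < n:
                    s += a[i] * b[j]
            out.append(s)
        return out

    def stack(correlations):
        """Linear stack: average the daily correlations sample by sample."""
        days = len(correlations)
        return [sum(c[k] for c in correlations) / days
                for k in range(len(correlations[0]))]

    # Two toy 'daily' records where b lags a by 2 samples: the stacked
    # correlation peaks at lag +2.
    a = [0.0, 1.0, 0.0, 0.0, 0.0, 0.0]
    b = [0.0, 0.0, 0.0, 1.0, 0.0, 0.0]
    daily = [cross_correlate(a, b, 3), cross_correlate(a, b, 3)]
    stacked = stack(daily)
    lags = list(range(-3, 4))
    assert lags[stacked.index(max(stacked))] == 2
    ```

    In the real analysis the stack over many days of noise converges toward the empirical Green's function between the two stations.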

  16. The KIDTALK Behavior and Language Code: Manual and Coding Protocol.

    ERIC Educational Resources Information Center

    Delaney, Elizabeth M.; Ezell, Sara S.; Solomon, Ned A.; Hancock, Terry B.; Kaiser, Ann P.

    Developed as part of the Milieu Language Teaching Project at the John F. Kennedy Center at Vanderbilt University in Nashville, Tennessee, this KIDTALK Behavior-Language Coding Protocol and manual measures behavior occurring during adult-child interactions. The manual is divided into 5 distinct sections: (1) the adult behavior codes describe…

  17. Patched Conic Trajectory Code

    NASA Technical Reports Server (NTRS)

    Park, Brooke Anderson; Wright, Henry

    2012-01-01

    PatCon code was developed to help mission designers run trade studies on launch and arrival times for any given planet. Initially developed in Fortran, the required inputs included launch date, arrival date, and other orbital parameters of the launch planet and arrival planets at the given dates. These parameters include the position of the planets, the eccentricity, semi-major axes, argument of periapsis, ascending node, and inclination of the planets. With these inputs, a patched conic approximation is used to determine the trajectory. The patched conic approximation divides the planetary mission into three parts: (1) the departure phase, in which the two relevant bodies are Earth and the spacecraft, and where the trajectory is a departure hyperbola with Earth at the focus; (2) the cruise phase, in which the two bodies are the Sun and the spacecraft, and where the trajectory is a transfer ellipse with the Sun at the focus; and (3) the arrival phase, in which the two bodies are the target planet and the spacecraft, where the trajectory is an arrival hyperbola with the planet as the focus.
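
    The cruise-phase piece of a patched conic can be sketched with a Hohmann-style transfer ellipse with the Sun at the focus, plus the hyperbolic excess speed the departure hyperbola must supply. This is a minimal illustration under standard constants (Sun GM, AU), not PatCon's Fortran implementation.

    ```python
    import math

    # Hedged sketch: transfer ellipse between two circular heliocentric
    # orbits, and the hyperbolic excess speed at departure (vis-viva).

    MU_SUN = 1.32712440018e11   # Sun GM, km^3/s^2
    AU = 1.495978707e8          # km

    def hohmann(r1, r2, mu=MU_SUN):
        """Transfer ellipse between circular orbits of radii r1, r2 (km)."""
        a = 0.5 * (r1 + r2)                            # semi-major axis
        v_circ1 = math.sqrt(mu / r1)                   # circular speed at r1
        v_peri = math.sqrt(mu * (2.0 / r1 - 1.0 / a))  # vis-viva at perihelion
        v_inf = v_peri - v_circ1                       # hyperbolic excess speed
        return a, v_inf

    # Earth (1 AU) to Mars (~1.524 AU):
    a, v_inf = hohmann(1.0 * AU, 1.524 * AU)
    print(round(a / AU, 3), round(v_inf, 2))   # ~1.262 AU, ~2.94 km/s
    ```

    The departure-phase hyperbola is then sized so its excess speed matches `v_inf`, which is how the conics are "patched" together.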

  18. nMHDust: A 4-Fluid Partially Ionized Dusty Plasma Code

    NASA Astrophysics Data System (ADS)

    Lazerson, Samuel

    2008-11-01

nMHDust is a next-generation 4-fluid partially ionized magnetized dusty plasma code, treating the inertial dynamics of dust, ion and neutral components. The code is written in ANSI C, and its numerical method is based on that of the MHDust 3-fluid fully ionized dusty plasma code. nMHDust expands the features of MHDust to include ionization/recombination effects and the netCDF data format. Tests of the code include ionization instabilities, wave-mode propagation (electromagnetic and acoustic), shear-flow instabilities, and magnetic reconnection. Relevant parameters for the space environment are considered, allowing a comparison to be made with previous dusty plasma codes (MHDust and DENISIS). The utility of the code is expanded through the possibility of a small dust mass, which allows nMHDust to be used as a 2-ion plasma code. nMHDust completes the array of fluid dusty plasma codes available for numerical investigations into nonlinear phenomena in the field of astrophysical dusty plasmas.

  19. Effective Practice in the Design of Directed Independent Learning Opportunities

    ERIC Educational Resources Information Center

    Thomas, Liz; Jones, Robert; Ottaway, James

    2015-01-01

    This study, commissioned by the HEA and the QAA focuses on directed independent learning practices in UK higher education. It investigates what stakeholders (including academic staff and students) have found to be the most effective practices in the inception, design, quality assurance and enhancement of directed independent learning and explores…

  20. A benchmark study for glacial isostatic adjustment codes

    NASA Astrophysics Data System (ADS)

    Spada, G.; Barletta, V. R.; Klemann, V.; Riva, R. E. M.; Martinec, Z.; Gasperini, P.; Lund, B.; Wolf, D.; Vermeersen, L. L. A.; King, M. A.

    2011-04-01

    The study of glacial isostatic adjustment (GIA) is gaining an increasingly important role within the geophysical community. Understanding the response of the Earth to loading is crucial in various contexts, ranging from the interpretation of modern satellite geodetic measurements (e.g. GRACE and GOCE) to the projections of future sea level trends in response to climate change. Modern modelling approaches to GIA are based on various techniques that range from purely analytical formulations to fully numerical methods. Despite various teams independently investigating GIA, we do not have a suitably large set of agreed numerical results through which the methods may be validated; a community benchmark data set would clearly be valuable. Following the example of the mantle convection community, here we present, for the first time, the results of a benchmark study of codes designed to model GIA. This has taken place within a collaboration facilitated through European Cooperation in Science and Technology (COST) Action ES0701. The approaches benchmarked are based on significantly different codes and different techniques. The test computations are based on models with spherical symmetry and Maxwell rheology and include inputs from different methods and solution techniques: viscoelastic normal modes, spectral-finite elements and finite elements. The tests involve the loading and tidal Love numbers and their relaxation spectra, the deformation and gravity variations driven by surface loads characterized by simple geometry and time history and the rotational fluctuations in response to glacial unloading. In spite of the significant differences in the numerical methods employed, the test computations show a satisfactory agreement between the results provided by the participants.

  1. Civil Code, 11 December 1987.

    PubMed

    1988-01-01

Article 162 of this Mexican Code provides, among other things, that "Every person has the right freely, responsibly, and in an informed fashion to determine the number and spacing of his or her children." When a marriage is involved, this right is to be observed by the spouses "in agreement with each other." The civil codes of the following states contain the same provisions: 1) Baja California (Art. 159 of the Civil Code of 28 April 1972 as revised in Decree No. 167 of 31 January 1974); 2) Morelos (Art. 255 of the Civil Code of 26 September 1949 as revised in Decree No. 135 of 29 December 1981); 3) Queretaro (Art. 162 of the Civil Code of 29 December 1950 as revised in the Act of 9 January 1981); 4) San Luis Potosi (Art. 147 of the Civil Code of 24 March 1946 as revised on 13 June 1978); 5) Sinaloa (Art. 162 of the Civil Code of 18 June 1940 as revised in Decree No. 28 of 14 October 1975); 6) Tamaulipas (Art. 146 of the Civil Code of 21 November 1960 as revised in Decree No. 20 of 30 April 1975); 7) Veracruz-Llave (Art. 98 of the Civil Code of 1 September 1932 as revised in the Act of 30 December 1975); and 8) Zacatecas (Art. 253 of the Civil Code of 9 February 1965 as revised in Decree No. 104 of 13 August 1975). The Civil Codes of Puebla and Tlaxcala provide for this right only in the context of marriage with the spouses in agreement. See Art. 317 of the Civil Code of Puebla of 15 April 1985 and Article 52 of the Civil Code of Tlaxcala of 31 August 1976 as revised in Decree No. 23 of 2 April 1984. The Family Code of Hidalgo requires as a formality of marriage a certification that the spouses are aware of methods of controlling fertility, responsible parenthood, and family planning. In addition, Article 22 of the Civil Code of the Federal District provides that the legal capacity of natural persons is acquired at birth and lost at death; however, from the moment of conception the individual comes under the protection of the law, which is valid with respect to the

  2. New opportunities seen for independents

    SciTech Connect

Adams, G.A.

    1990-10-22

The collapse of gas and oil prices in the mid-1980s significantly reduced the number of independent exploration companies. At the same time, a fundamental shift occurred among major oil companies as they allocated their exploration budgets toward international operations and made major production purchases. Several large independents also embraced a philosophy of budget supplementation through joint venture partnership arrangements. This has created a unique and unusual window of opportunity for the smaller independents (defined for this article as exploration and production companies with a market value of less than $1 billion) to access the extensive and high quality domestic prospect inventories of the major and large independent oil and gas companies and to participate in the search for large reserve targets on attractive joint venture terms. Participation in these types of joint ventures, in conjunction with internally generated plays selected through the use of today's advanced technology (computer-enhanced, high-resolution seismic; horizontal drilling; etc.) and increasing prices for oil and natural gas, presents the domestic exploration-oriented independent with an attractive money-making opportunity for the 1990s.

  3. Cracking the bioelectric code

    PubMed Central

    Tseng, AiSun; Levin, Michael

    2013-01-01

    Patterns of resting potential in non-excitable cells of living tissue are now known to be instructive signals for pattern formation during embryogenesis, regeneration and cancer suppression. The development of molecular-level techniques for tracking ion flows and functionally manipulating the activity of ion channels and pumps has begun to reveal the mechanisms by which voltage gradients regulate cell behaviors and the assembly of complex large-scale structures. A recent paper demonstrated that a specific voltage range is necessary for demarcation of eye fields in the frog embryo. Remarkably, artificially setting other somatic cells to the eye-specific voltage range resulted in formation of eyes in aberrant locations, including tissues that are not in the normal anterior ectoderm lineage: eyes could be formed in the gut, on the tail, or in the lateral plate mesoderm. These data challenge the existing models of eye fate restriction and tissue competence maps, and suggest the presence of a bioelectric code—a mapping of physiological properties to anatomical outcomes. This Addendum summarizes the current state of knowledge in developmental bioelectricity, proposes three possible interpretations of the bioelectric code that functionally maps physiological states to anatomical outcomes, and highlights the biggest open questions in this field. We also suggest a speculative hypothesis at the intersection of cognitive science and developmental biology: that bioelectrical signaling among non-excitable cells coupled by gap junctions simulates neural network-like dynamics, and underlies the information processing functions required by complex pattern formation in vivo. Understanding and learning to control the information stored in physiological networks will have transformative implications for developmental biology, regenerative medicine and synthetic bioengineering. PMID:23802040

  4. Breaking the Code of Silence.

    ERIC Educational Resources Information Center

    Halbig, Wolfgang W.

    2000-01-01

    Schools and communities must break the adolescent code of silence concerning threats of violence. Schools need character education stressing courage, caring, and responsibility; regular discussions of the school discipline code; formal security discussions with parents; 24-hour hotlines; and protocols for handling reports of potential violence.…

  5. ACCELERATION PHYSICS CODE WEB REPOSITORY.

    SciTech Connect

    WEI, J.

    2006-06-26

    In the framework of the CARE HHH European Network, we have developed a web-based dynamic accelerator-physics code repository. We describe the design, structure and contents of this repository, illustrate its usage, and discuss our future plans, with emphasis on code benchmarking.

  6. Accelerator Physics Code Web Repository

    SciTech Connect

    Zimmermann, F.; Basset, R.; Bellodi, G.; Benedetto, E.; Dorda, U.; Giovannozzi, M.; Papaphilippou, Y.; Pieloni, T.; Ruggiero, F.; Rumolo, G.; Schmidt, F.; Todesco, E.; Zotter, B.W.; Payet, J.; Bartolini, R.; Farvacque, L.; Sen, T.; Chin, Y.H.; Ohmi, K.; Oide, K.; Furman, M.; /LBL, Berkeley /Oak Ridge /Pohang Accelerator Lab. /SLAC /TRIUMF /Tech-X, Boulder /UC, San Diego /Darmstadt, GSI /Rutherford /Brookhaven

    2006-10-24

    In the framework of the CARE HHH European Network, we have developed a web-based dynamic accelerator-physics code repository. We describe the design, structure and contents of this repository, illustrate its usage, and discuss our future plans, with emphasis on code benchmarking.

  7. LFSC - Linac Feedback Simulation Code

    SciTech Connect

    Ivanov, Valentin; /Fermilab

    2008-05-01

The computer program LFSC (Linac Feedback Simulation Code) is a numerical tool for simulating beam-based feedback in high-performance linacs. The code is based on an earlier version developed by a collective of authors at SLAC (L. Hendrickson, R. McEwen, T. Himel, H. Shoaee, S. Shah, P. Emma, P. Schultz) during 1990-2005. That code was successively used in simulations of the SLC, TESLA, CLIC and NLC projects. LFSC can simulate both pulse-to-pulse feedback on timescales corresponding to 5-100 Hz and slower feedbacks operating in the 0.1-1 Hz range in the Main Linac and Beam Delivery System. The code runs under Matlab for the MS Windows operating system. It contains about 30,000 lines of source code in more than 260 subroutines. The code uses LIAR ('Linear Accelerator Research code') for particle tracking under ground motion and technical noise perturbations, and the Guinea Pig code to simulate luminosity performance. A set of input files includes the lattice description (XSIF format) and plain text files with numerical parameters, wake fields, ground motion data, etc. The Matlab environment provides a flexible system for graphical output.
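
    The core idea of pulse-to-pulse feedback can be sketched in a few lines. This is a hedged toy (proportional correction of a beam offset), not LFSC's Matlab implementation: each pulse, the measured offset is corrected by a fractional gain, so a static perturbation decays geometrically.

    ```python
    # Toy pulse-to-pulse feedback loop: offset is measured each pulse and a
    # proportional correction -gain*offset is applied before the next pulse.

    def run_feedback(offset, gain, pulses, drift=0.0):
        """Apply proportional correction once per pulse; return offset history."""
        history = [offset]
        for _ in range(pulses):
            correction = -gain * offset
            offset = offset + correction + drift
            history.append(offset)
        return history

    hist = run_feedback(offset=1.0, gain=0.5, pulses=10)
    # With gain 0.5 the offset halves every pulse: 1.0 -> 0.5 -> 0.25 ...
    assert abs(hist[1] - 0.5) < 1e-12 and abs(hist[10] - 0.5 ** 10) < 1e-12
    ```

    A nonzero `drift` term mimics the slow perturbations (ground motion, thermal drifts) that the 0.1-1 Hz feedbacks in the abstract are meant to track.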

  8. Indices for Testing Neural Codes

    PubMed Central

    Victor, Jonathan D.; Nirenberg, Sheila

    2009-01-01

    One of the most critical challenges in systems neuroscience is determining the neural code. A principled framework for addressing this can be found in information theory. With this approach, one can determine whether a proposed code can account for the stimulus-response relationship. Specifically, one can compare the transmitted information between the stimulus and the hypothesized neural code with the transmitted information between the stimulus and the behavioral response. If the former is smaller than the latter (i.e., if the code cannot account for the behavior), the code can be ruled out. The information-theoretic index most widely used in this context is Shannon’s mutual information. The Shannon test, however, is not ideal for this purpose: while the codes it will rule out are truly nonviable, there will be some nonviable codes that it will fail to rule out. Here we describe a wide range of alternative indices that can be used for ruling codes out. The range includes a continuum from Shannon information to measures of the performance of a Bayesian decoder. We analyze the relationship of these indices to each other and their complementary strengths and weaknesses for addressing this problem. PMID:18533812
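
    The Shannon test described above can be illustrated with a toy plug-in estimator (invented data, not the authors' indices): estimate I(stimulus; candidate code) and I(stimulus; behavior) from paired samples, and rule the code out if the former is smaller.

    ```python
    import math
    from collections import Counter

    def mutual_information(xs, ys):
        """Plug-in estimate of I(X;Y) in bits from paired samples."""
        n = len(xs)
        pxy = Counter(zip(xs, ys))
        px = Counter(xs)
        py = Counter(ys)
        mi = 0.0
        for (x, y), c in pxy.items():
            # p(x,y) * log2( p(x,y) / (p(x) p(y)) ), with counts c, px, py
            mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
        return mi

    stimulus = [0, 0, 1, 1, 0, 0, 1, 1]
    behavior = [0, 0, 1, 1, 0, 0, 1, 1]   # behavior tracks the stimulus exactly
    code     = [0, 1, 0, 1, 0, 1, 0, 1]   # candidate code carries no stimulus info
    i_code = mutual_information(stimulus, code)
    i_behavior = mutual_information(stimulus, behavior)
    assert i_code < i_behavior   # this candidate code can be ruled out
    ```

    Here I(stimulus; behavior) is 1 bit while the candidate code carries 0 bits, so the Shannon test rejects it; the paper's point is that some nonviable codes will nonetheless pass this particular test.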

  9. Population coding in somatosensory cortex.

    PubMed

    Petersen, Rasmus S; Panzeri, Stefano; Diamond, Mathew E

    2002-08-01

Computational analyses have begun to elucidate which components of somatosensory cortical population activity may encode basic stimulus features. Recent results from rat barrel cortex suggest that the essence of this code is not synergistic spike patterns, but rather the precise timing of single neurons' first post-stimulus spikes. This may form the basis for a fast, robust population code.
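
    A first-spike latency code of the kind described can be sketched as a nearest-template decoder. The latency values below are invented toy numbers, not the barrel-cortex data.

    ```python
    # Toy first-spike latency code: each stimulus class is associated with a
    # characteristic latency of the first post-stimulus spike, and the decoder
    # picks the class whose template latency is nearest the observed one.

    TEMPLATES = {'caudal': 8.0, 'rostral': 14.0}   # hypothetical mean latencies (ms)

    def decode(latency_ms):
        """Nearest-template decoder on the first post-stimulus spike time."""
        return min(TEMPLATES, key=lambda s: abs(TEMPLATES[s] - latency_ms))

    assert decode(9.1) == 'caudal'
    assert decode(13.2) == 'rostral'
    ```

    Because only the first spike matters, such a decoder can commit to an answer within milliseconds of stimulus onset, which is the "fast" property the abstract highlights.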

  10. QPhiX Code Generator

    SciTech Connect

    Joo, Balint

    2014-09-16

    A simple code-generator to generate the low level code kernels used by the QPhiX Library for Lattice QCD. Generates Kernels for Wilson-Dslash, and Wilson-Clover kernels. Can be reused to write other optimized kernels for Intel Xeon Phi(tm), Intel Xeon(tm) and potentially other architectures.

  11. Using NAEYC's Code of Ethics.

    ERIC Educational Resources Information Center

    Young Children, 1995

    1995-01-01

    Considers how to deal with an ethical dilemma concerning a caregiver's dislike for a child. Recognizes that no statement in NAEYC's Code of Ethical Conduct requires that a professional must like each child, and presents some ideals and principles from the code that may guide professionals through similar situations. (BAC)

  12. Video coding with dynamic background

    NASA Astrophysics Data System (ADS)

    Paul, Manoranjan; Lin, Weisi; Lau, Chiew Tong; Lee, Bu-Sung

    2013-12-01

Motion estimation (ME) and motion compensation (MC) using variable block size, sub-pixel search, and multiple reference frames (MRFs) are the major reasons for the improved coding performance of the H.264 video coding standard over other contemporary coding standards. The concept of MRFs is suitable for repetitive motion, uncovered background, non-integer pixel displacement, lighting change, etc. The requirement of index codes for the reference frames, the computational time in ME & MC, and the memory buffer for coded frames limit the number of reference frames used in practical applications. In typical video sequences, the previous frame is used as a reference frame in 68-92% of cases. In this article, we propose a new video coding method using a reference frame [i.e., the most common frame in scene (McFIS)] generated by dynamic background modeling. McFIS is more effective in terms of rate-distortion and computational time performance compared to the MRFs techniques. It also has an inherent capability of scene change detection (SCD) for adaptive group of pictures (GOP) size determination; as a result, we integrate SCD (for GOP determination) with reference frame generation. The experimental results show that the proposed coding scheme outperforms H.264 video coding with five reference frames and two relevant state-of-the-art algorithms by 0.5-2.0 dB with less computational time.
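
    The idea of a background-modelled reference frame can be sketched with a per-pixel running average, a much simpler model than the paper's dynamic background modeling: static background is absorbed while transient foreground averages out, so the model frame serves as a stable long-term reference.

    ```python
    # Toy background model: per-pixel running average bg <- (1-a)*bg + a*frame.

    def update_background(background, frame, alpha=0.3):
        """Blend the new frame into the background model pixel by pixel."""
        return [(1 - alpha) * b + alpha * f for b, f in zip(background, frame)]

    # Two-pixel 'frames': static background value 100, with a transient object
    # (value 200) covering pixel 0 in the first two frames only.
    frames = [[200, 100], [200, 100], [100, 100], [100, 100], [100, 100]]
    bg = frames[0][:]
    for frame in frames[1:]:
        bg = update_background(bg, frame)
    # Pixel 0 has decayed most of the way back toward the true background;
    # pixel 1 (always background) is untouched.
    assert abs(bg[1] - 100) < 1e-9 and bg[0] < 140
    ```

    A codec can then predict most blocks from this model frame instead of carrying index codes for several stored reference frames.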

  13. Correlation approach to identify coding regions in DNA sequences

    NASA Technical Reports Server (NTRS)

    Ossadnik, S. M.; Buldyrev, S. V.; Goldberger, A. L.; Havlin, S.; Mantegna, R. N.; Peng, C. K.; Simons, M.; Stanley, H. E.

    1994-01-01

    Recently, it was observed that noncoding regions of DNA sequences possess long-range power-law correlations, whereas coding regions typically display only short-range correlations. We develop an algorithm based on this finding that enables investigators to perform a statistical analysis on long DNA sequences to locate possible coding regions. The algorithm is particularly successful in predicting the location of lengthy coding regions. For example, for the complete genome of yeast chromosome III (315,344 nucleotides), at least 82% of the predictions correspond to putative coding regions; the algorithm correctly identified all coding regions larger than 3000 nucleotides, 92% of coding regions between 2000 and 3000 nucleotides long, and 79% of coding regions between 1000 and 2000 nucleotides. The predictive ability of this new algorithm supports the claim that there is a fundamental difference in the correlation property between coding and noncoding sequences. This algorithm, which is not species-dependent, can be implemented with other techniques for rapidly and accurately locating relatively long coding regions in genomic sequences.
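
    The correlation contrast the abstract relies on can be sketched in the spirit of the authors' approach (this is not their algorithm): map the sequence to a purine/pyrimidine walk and compare the walk's fluctuation F(l) at two window sizes; a power-law F(l) ~ l^alpha with alpha near 1 signals long-range correlation, while alpha near 0.5 or below indicates short-range or anti-correlated structure.

    ```python
    import math

    # Toy two-scale estimate of the correlation exponent of a DNA walk.

    def fluctuation(seq, window):
        """RMS net displacement of the +/-1 walk over non-overlapping windows."""
        steps = [1 if base in 'AG' else -1 for base in seq]
        disps = []
        for start in range(0, len(steps) - window + 1, window):
            disps.append(sum(steps[start:start + window]))
        return math.sqrt(sum(d * d for d in disps) / len(disps))

    def alpha(seq, l1=7, l2=63):
        """Estimate the correlation exponent from two scales."""
        return math.log(fluctuation(seq, l2) / fluctuation(seq, l1)) / math.log(l2 / l1)

    correlated = ('A' * 64 + 'C' * 64) * 8    # long runs: persistent walk
    alternating = 'AC' * 512                  # strictly anti-correlated walk
    assert alpha(correlated) > 0.5 > alpha(alternating)
    ```

    Sliding such an exponent estimate along a genome gives a crude coding/noncoding landscape: windows with alpha near 0.5 behave like the short-range-correlated coding regions described above.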

  14. Benchmarking NNWSI flow and transport codes: COVE 1 results

    SciTech Connect

    Hayden, N.K.

    1985-06-01

    The code verification (COVE) activity of the Nevada Nuclear Waste Storage Investigations (NNWSI) Project is the first step in certification of flow and transport codes used for NNWSI performance assessments of a geologic repository for disposing of high-level radioactive wastes. The goals of the COVE activity are (1) to demonstrate and compare the numerical accuracy and sensitivity of certain codes, (2) to identify and resolve problems in running typical NNWSI performance assessment calculations, and (3) to evaluate computer requirements for running the codes. This report describes the work done for COVE 1, the first step in benchmarking some of the codes. Isothermal calculations for the COVE 1 benchmarking have been completed using the hydrologic flow codes SAGUARO, TRUST, and GWVIP; the radionuclide transport codes FEMTRAN and TRUMP; and the coupled flow and transport code TRACR3D. This report presents the results of three cases of the benchmarking problem solved for COVE 1, a comparison of the results, questions raised regarding sensitivities to modeling techniques, and conclusions drawn regarding the status and numerical sensitivities of the codes. 30 refs.

  15. Codon Distribution in Error-Detecting Circular Codes.

    PubMed

    Fimmel, Elena; Strüngmann, Lutz

    2016-03-15

In 1957, Francis Crick et al. suggested an ingenious explanation for the process of frame maintenance. The idea was based on the notion of comma-free codes. Although Crick's hypothesis proved to be wrong, in 1996, Arquès and Michel discovered the existence of a weaker version of such codes in eukaryote and prokaryote genomes, namely the so-called circular codes. Since then, circular code theory has invariably evoked great interest and made significant progress. In this article, the codon distributions in maximal comma-free, maximal self-complementary C3 and maximal self-complementary circular codes are discussed, i.e., we investigate in how many of such codes a given codon participates. As the main (and surprising) result, it is shown that the codons can be separated into very few classes (three, or five, or six) with respect to their frequency. Moreover, the distribution classes can be hierarchically ordered as refinements from maximal comma-free codes via maximal self-complementary C3 codes to maximal self-complementary circular codes.
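
    The comma-free property at the root of Crick's proposal is easy to check directly (the weaker circular-code property needs a more involved test): a codon set X is comma-free if no codon of X appears in a shifted reading frame inside any concatenation of two codons of X. A minimal sketch:

    ```python
    from itertools import product

    def is_comma_free(codons):
        """True if no codon occurs in frames 1 or 2 of any codon pair."""
        codon_set = set(codons)
        for x, y in product(codon_set, repeat=2):
            pair = x + y
            # shifted reading frames 1 and 2 inside the 6-letter concatenation
            if pair[1:4] in codon_set or pair[2:5] in codon_set:
                return False
        return True

    assert is_comma_free({'AAC', 'GGT'})
    # Any set containing a periodic codon like 'AAA' cannot be comma-free.
    assert not is_comma_free({'AAA'})
    ```

    Circular codes relax this: a frame shift need not be detected within one window, only eventually, when reading around a circle.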

  16. Codon Distribution in Error-Detecting Circular Codes

    PubMed Central

    Fimmel, Elena; Strüngmann, Lutz

    2016-01-01

    In 1957, Francis Crick et al. suggested an ingenious explanation for the process of frame maintenance. The idea was based on the notion of comma-free codes. Although Crick’s hypothesis proved to be wrong, in 1996, Arquès and Michel discovered the existence of a weaker version of such codes in eukaryote and prokaryote genomes, namely the so-called circular codes. Since then, circular code theory has invariably evoked great interest and made significant progress. In this article, the codon distributions in maximal comma-free, maximal self-complementary C3 and maximal self-complementary circular codes are discussed, i.e., we investigate in how many of such codes a given codon participates. As the main (and surprising) result, it is shown that the codons can be separated into very few classes (three, or five, or six) with respect to their frequency. Moreover, the distribution classes can be hierarchically ordered as refinements from maximal comma-free codes via maximal self-complementary C3 codes to maximal self-complementary circular codes. PMID:26999215

  17. Image Coding Based on Address Vector Quantization.

    NASA Astrophysics Data System (ADS)

    Feng, Yushu

Image coding is finding increased application in teleconferencing, archiving, and remote sensing. This thesis investigates the potential of Vector Quantization (VQ), a relatively new source coding technique, for compression of monochromatic and color images, and extends the Vector Quantization technique to the Address Vector Quantization method. In Vector Quantization, the image data to be encoded are first processed to yield a set of vectors. A codeword from the codebook which best matches the input image vector is then selected. Compression is achieved by replacing the image vector with the index of the codeword which produced the best match; the index is sent to the channel. Reconstruction of the image is done by using a table-lookup technique, where the label is simply used as an address for a table containing the representative vectors. A codebook of representative vectors (codewords) is generated using an iterative clustering algorithm such as K-means or the generalized Lloyd algorithm. A review of different Vector Quantization techniques is given in chapter 1. Chapter 2 gives an overview of codebook design methods, including the Kohonen neural network for codebook design. During the encoding process, the correlation of the addresses is considered, and Address Vector Quantization is developed for color and monochrome image coding. Address VQ, which includes static and dynamic processes, is introduced in chapter 3. To overcome the problems in Hierarchical VQ, Multi-layer Address Vector Quantization is proposed in chapter 4. This approach gives the same performance as the normal VQ scheme but at about 1/2 to 1/3 the bit rate of the normal VQ method. In chapter 5, a Dynamic Finite State VQ, based on a probability transition matrix to select the best subcodebook to encode the image, is developed. In chapter 6, a new adaptive vector quantization scheme, suitable for color video coding, called "A Self-Organizing
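
    The basic VQ encode/decode cycle the thesis builds on can be sketched generically (this is ordinary VQ, not the Address VQ extension): replace each image vector with the index of its nearest codeword, and reconstruct by table lookup.

    ```python
    # Minimal vector quantizer: nearest-codeword encoding, table-lookup decoding.

    def nearest(codebook, vec):
        """Index of the codeword minimizing squared Euclidean distance."""
        def dist2(cw):
            return sum((a - b) ** 2 for a, b in zip(cw, vec))
        return min(range(len(codebook)), key=lambda i: dist2(codebook[i]))

    def encode(codebook, vectors):
        return [nearest(codebook, v) for v in vectors]

    def decode(codebook, indices):
        return [codebook[i] for i in indices]

    codebook = [(0, 0), (10, 10), (20, 20)]   # toy codebook of 2-D 'blocks'
    blocks = [(1, 0), (9, 11), (19, 22)]
    indices = encode(codebook, blocks)
    assert indices == [0, 1, 2]
    assert decode(codebook, indices) == [(0, 0), (10, 10), (20, 20)]
    ```

    Address VQ then exploits the fact that the indices (addresses) of neighboring blocks are correlated, so they can themselves be coded more cheaply than independent labels.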

  18. DANTSYS: A diffusion accelerated neutral particle transport code system

    SciTech Connect

    Alcouffe, R.E.; Baker, R.S.; Brinkley, F.W.; Marr, D.R.; O`Dell, R.D.; Walters, W.F.

    1995-06-01

The DANTSYS code package includes the following transport codes: ONEDANT, TWODANT, TWODANT/GQ, TWOHEX, and THREEDANT. The DANTSYS code package is a modular computer program package designed to solve the time-independent, multigroup discrete ordinates form of the Boltzmann transport equation in several different geometries. The modular construction of the package separates the input processing, the transport equation solving, and the post-processing (or edit) functions into distinct code modules: the Input Module, one or more Solver Modules, and the Edit Module, respectively. The Input and Edit Modules are very general in nature and are common to all the Solver Modules. The ONEDANT Solver Module contains a one-dimensional (slab, cylinder, and sphere), time-independent transport equation solver using the standard diamond-differencing method for space/angle discretization. Also included in the package are Solver Modules named TWODANT, TWODANT/GQ, THREEDANT, and TWOHEX. The TWODANT Solver Module solves the time-independent two-dimensional transport equation using the diamond-differencing method for space/angle discretization. The authors have also introduced an adaptive weighted diamond differencing (AWDD) method for the spatial and angular discretization into TWODANT as an option. The TWOHEX Solver Module solves the time-independent two-dimensional transport equation on an equilateral triangle spatial mesh. The THREEDANT Solver Module solves the time-independent, three-dimensional transport equation for XYZ and RZΘ symmetries using both diamond differencing with set-to-zero fixup and the AWDD method. The TWODANT/GQ Solver Module solves the 2-D transport equation in XY and RZ symmetries using a spatial mesh of arbitrary quadrilaterals. The spatial differencing method is based upon the diamond-differencing method with set-to-zero fixup, with changes to accommodate the generalized spatial meshing.
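
    The diamond-difference relation at the heart of these solvers can be sketched in the simplest setting, a hedged toy rather than ONEDANT itself: one direction mu > 0 in a source-free, purely absorbing slab, where taking the cell-average flux as the average of the cell-edge fluxes gives an explicit left-to-right sweep.

    ```python
    import math

    # Diamond difference for mu*(dpsi/dx) + sigma_t*psi = q in one cell:
    #   mu*(psi_out - psi_in)/dx + sigma_t*(psi_in + psi_out)/2 = q
    # solved for psi_out, swept cell by cell.

    def sweep(psi_in, mu, sigma_t, dx, cells, q=0.0):
        """Diamond-difference sweep; returns edge fluxes psi_0..psi_N."""
        psi = [psi_in]
        a = mu / dx
        for _ in range(cells):
            nxt = ((a - 0.5 * sigma_t) * psi[-1] + q) / (a + 0.5 * sigma_t)
            psi.append(nxt)
        return psi

    # Fine mesh: the sweep tracks the analytic attenuation exp(-sigma_t*x/mu).
    psi = sweep(psi_in=1.0, mu=1.0, sigma_t=1.0, dx=0.01, cells=100)
    assert abs(psi[-1] - math.exp(-1.0)) < 1e-3
    ```

    The "set-to-zero fixup" mentioned in the abstract handles the coarse-mesh case where this relation can produce negative edge fluxes.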

  19. PARAVT: Parallel Voronoi tessellation code

    NASA Astrophysics Data System (ADS)

    González, R. E.

    2016-10-01

In this study, we present a new open source code for massively parallel computation of Voronoi tessellations (VT hereafter) in large data sets. The code is aimed at astrophysical applications, in which VT densities and neighbors are widely used. There are several serial Voronoi tessellation codes; however, no open source parallel implementations are available to handle the large number of particles/galaxies in current N-body simulations and sky surveys. Parallelization is implemented under MPI, and the VT is computed using the Qhull library. Domain decomposition takes into account consistent boundary computation between tasks and includes periodic conditions. In addition, the code computes the neighbors list, Voronoi density, Voronoi cell volume, and density gradient for each particle, as well as densities on a regular grid. Code implementation and a user guide are publicly available at https://github.com/regonzar/paravt.
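
    The quantities PARAVT computes can be approximated by brute force on small problems (the real code uses the Qhull library and MPI): assigning grid points to their nearest seed partitions space into Voronoi cells, so each seed's share of grid points approximates its cell volume, and 1/volume gives a density estimate.

    ```python
    # Brute-force 2-D Voronoi cell volumes: assign lattice points to the
    # nearest seed and sum the area each seed captures.

    def voronoi_volumes(seeds, grid_n, box=1.0):
        """Approximate Voronoi cell areas on a grid_n x grid_n lattice."""
        cell = box / grid_n
        counts = [0] * len(seeds)
        for ix in range(grid_n):
            for iy in range(grid_n):
                p = ((ix + 0.5) * cell, (iy + 0.5) * cell)
                best = min(range(len(seeds)),
                           key=lambda k: (seeds[k][0] - p[0]) ** 2
                                       + (seeds[k][1] - p[1]) ** 2)
                counts[best] += 1
        return [c * cell * cell for c in counts]

    # Two seeds mirrored about x = 0.5 split the unit box into equal halves.
    vols = voronoi_volumes([(0.25, 0.5), (0.75, 0.5)], grid_n=20)
    assert abs(vols[0] - 0.5) < 1e-6 and abs(vols[1] - 0.5) < 1e-6
    ```

    An exact tessellation replaces this O(grid * seeds) scan with Qhull's geometric construction, which is what makes the parallel domain decomposition in the paper worthwhile.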

  20. Recent advances to NEC (Numerical Electromagnetics Code): Applications and validation

    SciTech Connect

Burke, G.J.

    1989-03-03

    Capabilities of the antenna modeling code NEC are reviewed and results are presented to illustrate typical applications. Recent developments are discussed that will improve accuracy in modeling electrically small antennas, stepped-radius wires and junctions of tightly coupled wires, and also a new capability for modeling insulated wires in air or earth is described. These advances will be included in a future release of NEC, while for now the results serve to illustrate limitations of the present code. NEC results are compared with independent analytical and numerical solutions and measurements to validate the model for wires near ground and for insulated wires. 41 refs., 26 figs., 1 tab.