Science.gov

Sample records for investigators independently coded

  1. The Independent Investigation Method.

    ERIC Educational Resources Information Center

    Morse, Virginia; Nottage, Cindy

    2001-01-01

    The Independent Investigation Method, a model developed to enable gifted students to grow in their independence as they move through a research assignment, is presented. The seven-step process guides elementary students through the research process from beginning to end, including topic, goal setting, research, organizing, goal evaluation, product, and…

  2. Evidence for modality-independent order coding in working memory.

    PubMed

    Depoorter, Ann; Vandierendonck, André

    2009-03-01

    The aim of the present study was to investigate the representation of serial order in working memory, more specifically whether serial order is coded by means of a modality-dependent or a modality-independent order code. This was investigated by means of a series of four experiments based on a dual-task methodology in which one short-term memory task was embedded between the presentation and recall of another short-term memory task. Two aspects were varied in these memory tasks--namely, the modality of the stimulus materials (verbal or visuo-spatial) and the presence of an order component in the task (an order or an item memory task). The results of this study showed impaired primary-task recognition performance when both the primary and the embedded task included an order component, irrespective of the modality of the stimulus materials. If one or both of the tasks did not contain an order component, less interference was found. The results of this study support the existence of a modality-independent order code. PMID:18609385

  3. Independent rate and temporal coding in hippocampal pyramidal cells

    PubMed Central

    Huxter, John; Burgess, Neil; O’Keefe, John

    2009-01-01

    Hippocampal pyramidal cells use temporal [1] as well as rate coding [2] to signal spatial aspects of the animal’s environment or behaviour. The temporal code takes the form of a phase relationship to the concurrent cycle of the hippocampal EEG theta rhythm (Figure 1; [1]). These two codes could each represent a different variable [3,4]. However, this requires that rate and phase can vary independently, in contrast to recent suggestions [5,6] that they are tightly coupled, both reflecting the amplitude of the cell’s input. Here we show that the time of firing and firing rate are dissociable and can represent two independent variables: the animal’s location within the place field and its speed of movement through the field, respectively. Independent encoding of location together with actions and stimuli occurring there may help to explain the dual roles of the hippocampus in spatial and episodic memory [7,8] or a more general role in relational/declarative memory [9,10]. PMID:14574410
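
    As a toy illustration of the two codes described above, the sketch below (not the authors' analysis code) reads out one spike train as both a phase code and a rate code; the 8 Hz theta frequency and the spike times are illustrative assumptions.

      import math

      THETA_HZ = 8.0                       # assumed theta frequency
      PERIOD = 1.0 / THETA_HZ              # one theta cycle, in seconds

      def firing_phase(spike_time):
          # Phase of a spike within the concurrent theta cycle, in degrees.
          return (spike_time % PERIOD) / PERIOD * 360.0

      def mean_rate(spike_times, duration):
          # Firing rate in Hz over the recording window.
          return len(spike_times) / duration

      # Hypothetical spikes: the phases can signal position in the place field,
      # while the overall rate can independently signal running speed.
      spikes = [0.010, 0.130, 0.245, 0.360, 0.470]
      print([round(firing_phase(t), 1) for t in spikes])   # phase code
      print(mean_rate(spikes, 0.5), "Hz")                  # rate code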

  4. Benchmark testing and independent verification of the VS2DT computer code

    SciTech Connect

    McCord, J.T.; Goodrich, M.T.

    1994-11-01

    The finite difference flow and transport simulator VS2DT was benchmark tested against several other codes which solve the same equations (Richards equation for flow and the advection-dispersion equation for transport). The benchmark problems investigated transient two-dimensional flow in a heterogeneous soil profile with a localized water source at the ground surface. The VS2DT code performed as well as or better than all other codes when considering mass balance characteristics and computational speed. It was also rated highly relative to the other codes with regard to ease-of-use. Following the benchmark study, the code was verified against two analytical solutions, one for two-dimensional flow and one for two-dimensional transport. These independent verifications show reasonable agreement with the analytical solutions, and complement the one-dimensional verification problems published in the code's original documentation.

  5. Investigation of Near Shannon Limit Coding Schemes

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.; Kim, J.; Mo, Fan

    1999-01-01

    Turbo codes can deliver performance that is very close to the Shannon limit. This report investigates algorithms for convolutional turbo codes and block turbo codes; both coding schemes can achieve performance near the Shannon limit. The performance of the schemes is obtained using computer simulations. There are three sections in this report. The first section is the introduction, in which fundamental knowledge about coding, block coding and convolutional coding is discussed. In the second section, the basic concepts of convolutional turbo codes are introduced and the performance of turbo codes, especially high rate turbo codes, is presented from the simulation results. After introducing all the parameters that help turbo codes achieve such good performance, it is concluded that the output weight distribution should be the main consideration in designing turbo codes. Based on the output weight distribution, performance bounds for turbo codes are given. Then, the relationships between the output weight distribution and factors like the generator polynomial, the interleaver and the puncturing pattern are examined. A criterion for the best selection of system components is provided. The puncturing pattern algorithm is discussed in detail, and different puncturing patterns are compared for each high rate. For most of the high rate codes, the puncturing pattern does not show any significant effect on the code performance if a pseudo-random interleaver is used in the system. For some special rate codes with poor performance, an alternative puncturing algorithm is designed which restores their performance close to the Shannon limit. Finally, in section three, for iterative decoding of block codes, the method of building a trellis for block codes, the structure of the iterative decoding system and the calculation of extrinsic values are discussed.
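
    Since puncturing is central to the high rate codes above, here is a minimal sketch of the mechanism under stated assumptions: a rate-1/2 mother code with two output streams, and an illustrative pattern (not one from the report) that deletes every second parity bit to reach rate 2/3.

      # Puncturing matrix: rows = encoder output streams, columns = time steps.
      PATTERN = [[1, 1],   # stream 1: always transmitted
                 [1, 0]]   # stream 2: every second bit deleted

      def puncture(stream1, stream2, pattern):
          # Return the transmitted sequence after deleting punctured bits.
          out = []
          for t, (b1, b2) in enumerate(zip(stream1, stream2)):
              col = t % len(pattern[0])
              if pattern[0][col]:
                  out.append(b1)
              if pattern[1][col]:
                  out.append(b2)
          return out

      c1 = [1, 0, 1, 1]   # hypothetical encoder output, stream 1
      c2 = [0, 1, 1, 0]   # hypothetical encoder output, stream 2
      print(puncture(c1, c2, PATTERN))   # 6 bits sent for 4 info bits: rate 2/3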

  6. Independent accident investigation: a modern safety tool.

    PubMed

    Stoop, John A

    2004-07-26

    Historically, safety has been subjected to a fragmented approach. In the past, every department has had its own responsibility towards safety, focusing either on working conditions, internal safety, external safety, rescue and emergency, public order or security. They each issued policy documents, which in their time were leading statements for elaboration and regulation. They also addressed safety issues with tools of various nature, often specifically developed within their domain. Due to a series of major accidents and disasters, the focus of attention is shifting from complying with quantitative risk standards towards intervention in primary operational processes, coping with systemic deficiencies and a more integrated assessment of safety in its societal context. In The Netherlands, recognition of the importance of independent investigations has led to an expansion of this philosophy from the transport sector to other sectors. The philosophy now covers transport, industry, defense, natural disaster, environment and health and other major occurrences such as explosions, fires, and collapse of buildings or structures. In 2003 a law covering multiple sectors will establish an independent safety board in The Netherlands. At a European level, mandatory investigation agencies are recognized as indispensable safety instruments for aviation, railways and the maritime sector, for which EU Directives are in place or being progressed [Transport accident and incident investigation in the European Union, European Transport Safety Council, ISBN 90-76024-10-3, Brussel, 2001]. Due to a series of major events, attention has been drawn to the consequences of disasters, highlighting the involvement of rescue and emergency services. They too have become subject to investigative efforts, which, in turn, puts demands on investigation methodology. This paper comments on an evolutionary development in safety thinking and of safety boards, highlighting some consequences for strategic perspectives in a further development of independent accident investigation. PMID:15231346

  7. Quantum image coding with a reference-frame-independent scheme

    NASA Astrophysics Data System (ADS)

    Chapeau-Blondeau, François; Belin, Etienne

    2016-04-01

    For binary images, or bit planes of non-binary images, we investigate the possibility of a quantum coding decodable by a receiver in the absence of reference frames shared with the emitter. Direct image coding with one qubit per pixel and non-aligned frames leads to decoding errors equivalent to a quantum bit-flip noise increasing with the misalignment. We show the feasibility of frame-invariant coding by using for each pixel a qubit pair prepared in one of two controlled entangled states. With just one common axis shared between the emitter and receiver, exact decoding for each pixel can be obtained by means of two two-outcome projective measurements operating separately on each qubit of the pair. With strictly no alignment information between the emitter and receiver, exact decoding can be obtained by means of a two-outcome projective measurement operating jointly on the qubit pair. In addition, the frame-invariant coding is shown to be much more resistant to quantum bit-flip noise than the direct non-invariant coding. For a cost per pixel of two (entangled) qubits instead of one, complete frame-invariant image coding and enhanced noise resistance are thus obtained.
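
    The bit-flip noise mentioned above has a simple closed form for the direct one-qubit-per-pixel coding: a pixel state measured in a basis misaligned by an angle theta on the Bloch sphere is flipped with probability sin^2(theta/2). The sketch below just evaluates this relation; the angles are illustrative.

      import math

      def flip_probability(theta_rad):
          # Decoding-error probability for a directly coded pixel when the
          # receiver's measurement basis is rotated by theta_rad.
          return math.sin(theta_rad / 2.0) ** 2

      for deg in (0, 15, 45, 90, 180):
          p = flip_probability(math.radians(deg))
          print(f"misalignment {deg:3d} deg -> bit-flip probability {p:.3f}")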

  8. Implementation of context independent code on a new array processor: The Super-65

    NASA Technical Reports Server (NTRS)

    Colbert, R. O.; Bowhill, S. A.

    1981-01-01

    The feasibility of rewriting standard uniprocessor programs into code which contains no context-dependent branches is explored. Context independent code (CIC) would contain no branches that might require different processing elements to branch different ways. In order to investigate the possibilities and restrictions of CIC, several programs were recoded into CIC and a four-element array processor was built. This processor (the Super-65) consisted of three 6502 microprocessors and the Apple II microcomputer. The results obtained were somewhat dependent upon the specific architecture of the Super-65, but, within bounds, the throughput of the array processor was found to increase linearly with the number of processing elements (PEs). The slope of throughput versus PEs is highly dependent on the program and varied from 0.33 to 1.00 for the sample programs.

  9. Two independent transcription initiation codes overlap on vertebrate core promoters

    NASA Astrophysics Data System (ADS)

    Haberle, Vanja; Li, Nan; Hadzhiev, Yavor; Plessy, Charles; Previti, Christopher; Nepal, Chirag; Gehrig, Jochen; Dong, Xianjun; Akalin, Altuna; Suzuki, Ana Maria; van Ijcken, Wilfred F. J.; Armant, Olivier; Ferg, Marco; Strähle, Uwe; Carninci, Piero; Müller, Ferenc; Lenhard, Boris

    2014-03-01

    A core promoter is a stretch of DNA surrounding the transcription start site (TSS) that integrates regulatory inputs and recruits general transcription factors to initiate transcription. The nature and causative relationship of the DNA sequence and chromatin signals that govern the selection of most TSSs by RNA polymerase II remain unresolved. Maternal to zygotic transition represents the most marked change of the transcriptome repertoire in the vertebrate life cycle. Early embryonic development in zebrafish is characterized by a series of transcriptionally silent cell cycles regulated by inherited maternal gene products: zygotic genome activation commences at the tenth cell cycle, marking the mid-blastula transition. This transition provides a unique opportunity to study the rules of TSS selection and the hierarchy of events linking transcription initiation with key chromatin modifications. We analysed TSS usage during zebrafish early embryonic development at high resolution using cap analysis of gene expression, and determined the positions of H3K4me3-marked promoter-associated nucleosomes. Here we show that the transition from the maternal to zygotic transcriptome is characterized by a switch between two fundamentally different modes of defining transcription initiation, which drive the dynamic change of TSS usage and promoter shape. A maternal-specific TSS selection, which requires an A/T-rich (W-box) motif, is replaced with a zygotic TSS selection grammar characterized by broader patterns of dinucleotide enrichments, precisely aligned with the first downstream (+1) nucleosome. The developmental dynamics of the H3K4me3-marked nucleosomes reveal their DNA-sequence-associated positioning at promoters before zygotic transcription and subsequent transcription-independent adjustment to the final position downstream of the zygotic TSS. The two TSS-defining grammars coexist, often physically overlapping, in core promoters of constitutively expressed genes to enable their expression in the two regulatory environments. The dissection of overlapping core promoter determinants represents a framework for future studies of promoter structure and function across different regulatory contexts.

  10. Two independent transcription initiation codes overlap on vertebrate core promoters

    PubMed Central

    Hadzhiev, Yavor; Plessy, Charles; Previti, Christopher; Nepal, Chirag; Gehrig, Jochen; Dong, Xianjun; Akalin, Altuna; Suzuki, Ana Maria; van IJcken, Wilfred F.J.; Armant, Olivier; Ferg, Marco; Strähle, Uwe; Carninci, Piero; Müller, Ferenc; Lenhard, Boris

    2014-01-01

    A core promoter is a stretch of DNA surrounding the transcription start site (TSS) that integrates regulatory inputs [1] and recruits general transcription factors to initiate transcription [2]. The nature and causative relationship of DNA sequence and chromatin signals that govern the selection of most TSSs by RNA polymerase II remain unresolved. Maternal to zygotic transition (MZT) represents the most dramatic change of the transcriptome repertoire in the vertebrate life cycle [3-6]. Early embryonic development in zebrafish is characterized by a series of transcriptionally silent cell cycles regulated by inherited maternal gene products: zygotic genome activation commences at the 10th cell cycle, marking the midblastula transition (MBT) [7]. This transition provides a unique opportunity to study the rules of TSS selection and the hierarchy of events linking transcription initiation with key chromatin modifications. We analysed TSS usage during zebrafish early embryonic development at high resolution using cap analysis of gene expression (CAGE) [8] and determined the positions of H3K4me3-marked promoter-associated nucleosomes [9]. We show that the transition from the maternal to the zygotic transcriptome is characterized by a switch between two fundamentally different modes of defining transcription initiation, which drive the dynamic change of TSS usage and promoter shape. A maternal-specific TSS selection, which requires an A/T-rich (W-box) motif, is replaced with a zygotic TSS selection grammar characterized by broader patterns of dinucleotide enrichments, precisely aligned with the first downstream (+1) nucleosome. The developmental dynamics of the H3K4me3-marked nucleosomes reveal their DNA-sequence-associated positioning at promoters prior to zygotic transcription and subsequent transcription-independent adjustment to the final position downstream of the zygotic TSS. The two TSS-defining grammars coexist, often physically overlapping, in core promoters of constitutively expressed genes to enable their expression in the two regulatory environments. The dissection of overlapping core promoter determinants represents a framework for future studies of promoter structure and function across different regulatory contexts. PMID:24531765

  11. Independent verification and benchmark testing of the UNSAT-H computer code, Version 2.0

    SciTech Connect

    Baca, R.G.; Magnuson, S.O.

    1990-02-01

    Independent testing of the UNSAT-H computer code, Version 2.0, was conducted to establish confidence that the code is ready for general use in performance assessment applications. Verification and benchmark test problems were used to check the correctness of the FORTRAN coding, the computational efficiency and accuracy of the numerical algorithm, and the code's capability to simulate diverse hydrologic conditions. This testing was performed using a structured and quantitative evaluation protocol consisting of blind testing, independent applications, maintenance of test equivalence, and use of graduated test cases. Graphical comparisons and calculation of relative root mean square (RRMS) values were used as indicators of accuracy and consistency. Four specific ranges of RRMS values were chosen for judging the quality of the comparisons. Four verification test problems were used to check the computational accuracy of UNSAT-H in solving the uncoupled fluid flow and heat transport equations. Five benchmark test problems, ranging in complexity, were used to check the code's simulation capability; some of the benchmark test cases include comparisons with laboratory and field data. The primary finding of this independent testing is that UNSAT-H is fully operational. In general, the test results showed that the code produced unsaturated flow simulations with excellent stability, reasonable accuracy, and acceptable speed. This report describes the technical basis, approach, and results of the independent testing. A number of future refinements to the UNSAT-H code are recommended that would improve computational speed and accuracy, code usability, and code portability. Aspects of the code that warrant further testing are outlined.
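
    For illustration, a minimal sketch of an RRMS-style accuracy indicator follows; the report's exact normalization may differ, and the data values are hypothetical. Here the RMS of the code-to-benchmark differences is divided by the RMS of the benchmark values.

      import math

      def rrms(simulated, benchmark):
          # Relative root mean square difference between code output and benchmark.
          num = math.sqrt(sum((s - b) ** 2 for s, b in zip(simulated, benchmark)))
          den = math.sqrt(sum(b ** 2 for b in benchmark))
          return num / den

      unsath = [0.102, 0.215, 0.330, 0.441]   # hypothetical UNSAT-H output
      bench  = [0.100, 0.210, 0.335, 0.450]   # hypothetical benchmark values
      print(f"RRMS = {rrms(unsath, bench):.3%}")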

  12. Independent coding of object motion and position revealed by distinct contingent aftereffects

    PubMed Central

    Bulakowski, Paul F.; Koldewyn, Kami; Whitney, David

    2013-01-01

    Despite several findings of perceptual asynchronies between object features, it remains unclear whether independent neuronal populations necessarily code these perceptually unbound properties. To examine this, we investigated the binding between an object’s spatial frequency and its rotational motion using contingent motion aftereffects (MAEs). Subjects adapted to an oscillating grating whose direction of rotation was paired with a high or low spatial frequency pattern. In separate adaptation conditions, we varied the moment when the spatial frequency change occurred relative to the direction reversal. After adapting to one stimulus, subjects made judgments of either the perceived MAE (rotational movement) or the position shift (instantaneous phase rotation) that accompanied the MAE. To null the spatial-frequency-contingent MAE, motion reversals had to physically lag changes in spatial frequency during adaptation. To null the position shift that accompanied the MAE, however, no temporal lag between the attributes was required. This demonstrates that perceived motion and position can be perceptually misbound. Indeed, in certain conditions, subjects perceived the test pattern to drift in one direction while its position appeared shifted in the opposite direction. The dissociation between perceived motion and position of the same test pattern, following identical adaptation, demonstrates that distinguishable neural populations code for these object properties. PMID:17280696

  13. RBMK coupled neutronics/thermal-hydraulics analyses by two independent code systems

    SciTech Connect

    Parisi, C.; D'Auria, F.; Malofeev, V.; Ivanov, B.; Ivanov, K.

    2006-07-01

    This paper presents the coupled neutronics/thermal-hydraulics activities carried out in the framework of part B of the TACIS project R2.03/97, 'Software development for accident analysis of RBMK reactors in Russia'. Two independent code systems were assembled, one from the Russian side and the other from the Western side, for studying RBMK core transients. The Russian code system relies on the use of the code UNK for neutron data library generation and the three-dimensional coupled neutron-kinetics/thermal-hydraulics codes BARS-KORSAR for plant transient analyses. The Western code system is instead based on the lattice physics code HELIOS and on the RELAP5-3D code. Several activities were performed to test the code systems' capabilities: the neutron data libraries were calculated and verified by precise Monte Carlo calculations, the coupled codes' steady-state results were compared with plant detectors' data, and calculations of several transients were compared. Finally, both code systems proved to have all the capabilities for addressing reliable safety analyses of RBMK reactors. (authors)

  14. An investigation of error characteristics and coding performance

    NASA Technical Reports Server (NTRS)

    Ebel, William J.; Ingels, Frank M.

    1993-01-01

    The first year's effort on NASA Grant NAG5-2006 was an investigation to characterize typical errors resulting from the EOS downlink. The analysis methods developed for this effort were used on test data from a March 1992 White Sands Terminal Test. The effectiveness of a concatenated coding scheme of a Reed Solomon outer code and a convolutional inner code versus a Reed Solomon only code scheme has been investigated, as well as the effectiveness of a Periodic Convolutional Interleaver in dispersing errors of certain types. The work effort consisted of development of software that allows simulation studies with the appropriate coding schemes plus either simulated data with errors or actual data with errors. The software program is entitled Communication Link Error ANalysis (CLEAN) and models downlink errors, forward error correcting schemes, and interleavers.
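
    To see why an interleaver disperses error bursts, consider this minimal sketch. It uses a simple row/column block interleaver rather than CLEAN's periodic convolutional interleaver, and the dimensions and burst are illustrative; the dispersing principle is the same.

      ROWS, COLS = 4, 6   # illustrative interleaver dimensions

      def interleave(symbols):
          # Write row-wise into a ROWS x COLS array, read column-wise.
          return [symbols[r * COLS + c] for c in range(COLS) for r in range(ROWS)]

      def deinterleave(symbols):
          # Inverse permutation: write column-wise, read row-wise.
          out = [None] * (ROWS * COLS)
          i = 0
          for c in range(COLS):
              for r in range(ROWS):
                  out[r * COLS + c] = symbols[i]
                  i += 1
          return out

      data = list(range(24))        # stand-in for coded symbols
      tx = interleave(data)
      tx[5:9] = ["X"] * 4           # a 4-symbol channel burst
      print(deinterleave(tx))       # the burst is now spread across the frame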

  15. Category-dependent and category-independent goal-value codes in human ventromedial prefrontal cortex.

    PubMed

    McNamee, Daniel; Rangel, Antonio; O'Doherty, John P

    2013-04-01

    To choose between manifestly distinct options, it is suggested that the brain assigns values to goals using a common currency. Although previous studies have reported activity in ventromedial prefrontal cortex (vmPFC) correlating with the value of different goal stimuli, it remains unclear whether such goal-value representations are independent of the associated stimulus categorization, as required by a common currency. Using multivoxel pattern analyses on functional magnetic resonance imaging (fMRI) data, we found a region of medial prefrontal cortex to contain a distributed goal-value code that is independent of stimulus category. More ventrally in the vmPFC, we found spatially distinct areas of the medial orbitofrontal cortex to contain unique category-dependent distributed value codes for food and consumer items. These results implicate the medial prefrontal cortex in the implementation of a common currency and suggest a ventral versus dorsal topographical organization of value signals in the vmPFC. PMID:23416449

  16. Hundreds of conserved non-coding genomic regions are independently lost in mammals

    PubMed Central

    Hiller, Michael; Schaar, Bruce T.; Bejerano, Gill

    2012-01-01

    Conserved non-protein-coding DNA elements (CNEs) often encode cis-regulatory elements and are rarely lost during evolution. However, CNE losses that do occur can be associated with phenotypic changes, exemplified by pelvic spine loss in sticklebacks. Using a computational strategy to detect complete loss of CNEs in mammalian genomes while strictly controlling for artifacts, we find >600 CNEs that are independently lost in at least two mammalian lineages, including a spinal cord enhancer near GDF11. We observed several genomic regions where multiple independent CNE loss events happened; the most extreme is the DIAPH2 locus. We show that CNE losses often involve deletions and that CNE loss frequencies are non-uniform. Similar to less pleiotropic enhancers, we find that independently lost CNEs are shorter, slightly less constrained and evolutionarily younger than CNEs without detected losses. This suggests that independently lost CNEs are less pleiotropic and that pleiotropic constraints contribute to non-uniform CNE loss frequencies. We also detected 35 CNEs that are independently lost in the human lineage and in other mammals. Our study uncovers an interesting aspect of the evolution of functional DNA in mammalian genomes. Experiments are necessary to test if these independently lost CNEs are associated with parallel phenotype changes in mammals. PMID:23042682

  17. Investigations into resting-state connectivity using independent component analysis

    PubMed Central

    Beckmann, Christian F; DeLuca, Marilena; Devlin, Joseph T; Smith, Stephen M

    2005-01-01

    Inferring resting-state connectivity patterns from functional magnetic resonance imaging (fMRI) data is a challenging task for any analytical technique. In this paper, we review a probabilistic independent component analysis (PICA) approach, optimized for the analysis of fMRI data, and discuss the role which this exploratory technique can take in scientific investigations into the structure of these effects. We apply PICA to fMRI data acquired at rest, in order to characterize the spatio-temporal structure of such data, and demonstrate that this is an effective and robust tool for the identification of low-frequency resting-state patterns from data acquired at various different spatial and temporal resolutions. We show that these networks exhibit high spatial consistency across subjects and closely resemble discrete cortical functional networks such as visual cortical areas or sensory–motor cortex. PMID:16087444
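
    As a minimal sketch of the unmixing idea behind such analyses, the following uses scikit-learn's FastICA (a plain ICA stand-in; PICA adds a probabilistic model on top) on synthetic data: two hypothetical low-frequency time courses are mixed into four observed channels and then recovered.

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(0)
      t = np.linspace(0, 100, 2000)

      # Two hypothetical "network" time courses with distinct dynamics.
      s1 = np.sin(2 * np.pi * 0.05 * t)
      s2 = np.sign(np.sin(2 * np.pi * 0.013 * t))
      S = np.c_[s1, s2] + 0.1 * rng.standard_normal((t.size, 2))

      A = rng.standard_normal((2, 4))    # mixing into 4 observed channels
      X = S @ A

      ica = FastICA(n_components=2, random_state=0)
      recovered = ica.fit_transform(X)   # estimated independent time courses
      print(recovered.shape)             # (2000, 2)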

  18. Proof-of-principle experiment of reference-frame-independent quantum key distribution with phase coding

    NASA Astrophysics Data System (ADS)

    Liang, Wen-Ye; Wang, Shuang; Li, Hong-Wei; Yin, Zhen-Qiang; Chen, Wei; Yao, Yao; Huang, Jing-Zheng; Guo, Guang-Can; Han, Zheng-Fu

    2014-01-01

    We have demonstrated a proof-of-principle experiment of reference-frame-independent phase coding quantum key distribution (RFI-QKD) over an 80-km optical fiber. After considering the finite-key bound, we still achieve a distance of 50 km. In this scenario, the phases of the basis states are related by a slowly time-varying transformation. Furthermore, we developed and realized a new decoy state method for RFI-QKD systems with weak coherent sources to counteract the photon-number-splitting attack. With the help of a reference-frame-independent protocol and a Michelson interferometer with Faraday rotator mirrors, our system is rendered immune to the slow phase changes of the interferometer and the polarization disturbances of the channel, making the procedure very robust.

  19. Proof-of-principle experiment of reference-frame-independent quantum key distribution with phase coding

    PubMed Central

    Liang, Wen-Ye; Wang, Shuang; Li, Hong-Wei; Yin, Zhen-Qiang; Chen, Wei; Yao, Yao; Huang, Jing-Zheng; Guo, Guang-Can; Han, Zheng-Fu

    2014-01-01

    We have demonstrated a proof-of-principle experiment of reference-frame-independent phase coding quantum key distribution (RFI-QKD) over an 80-km optical fiber. After considering the finite-key bound, we still achieve a distance of 50 km. In this scenario, the phases of the basis states are related by a slowly time-varying transformation. Furthermore, we developed and realized a new decoy state method for RFI-QKD systems with weak coherent sources to counteract the photon-number-splitting attack. With the help of a reference-frame-independent protocol and a Michelson interferometer with Faraday rotator mirrors, our system is rendered immune to the slow phase changes of the interferometer and the polarization disturbances of the channel, making the procedure very robust. PMID:24402550

  1. Board Governance of Independent Schools: A Framework for Investigation

    ERIC Educational Resources Information Center

    McCormick, John; Barnett, Kerry; Alavi, Seyyed Babak; Newcombe, Geoffrey

    2006-01-01

    Purpose: This paper develops a theoretical framework to guide future inquiry into board governance of independent schools. Design/methodology/approach: The authors' approach is to integrate literatures related to corporate and educational boards, motivation, leadership and group processes that are appropriate for conceptualizing independent school…

  2. Independence.

    ERIC Educational Resources Information Center

    Stephenson, Margaret E.

    2000-01-01

    Discusses the four planes of development and the periods of creation and crystallization within each plane. Identifies the type of independence that should be achieved by the end of the first two planes of development. Maintains that it is through individual work on the environment that one achieves independence. (KB)

  3. RELAP5/MOD3 code manual: Summaries and reviews of independent code assessment reports. Volume 7, Revision 1

    SciTech Connect

    Moore, R.L.; Sloan, S.M.; Schultz, R.R.; Wilson, G.E.

    1996-10-01

    Summaries of RELAP5/MOD3 code assessments, a listing of the assessment matrix, and a chronology of the various versions of the code are given. Results from these code assessments have been used to formulate a compilation of some of the strengths and weaknesses of the code. These results are documented in the report. Volume 7 was designed to be updated periodically and to include the results of the latest code assessments as they become available. Consequently, users of Volume 7 should ensure that they have the latest revision available.

  4. The investigation of bandwidth efficient coding and modulation techniques

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The New Mexico State University Center for Space Telemetering and Telecommunications Systems has been, and is currently, engaged in the investigation of trellis-coded modulation (TCM) communication systems. In particular, TCM utilizing M-ary phase shift keying is being studied. The study of carrier synchronization in a TCM environment, or in MPSK systems in general, has been one of the two main thrusts of this grant. This study has involved both theoretical modelling and software simulation of the carrier synchronization problem.

  5. Independent assessment of TRAC and RELAP5 codes through separate effects tests

    SciTech Connect

    Saha, P.; Rohatgi, U.S.; Jo, J.H.; Neymotin, L.; Slovik, G.; Yuelys-Miksis, C.; Pu, J.

    1983-01-01

    Independent assessment of TRAC-PF1 (Version 7.0), TRAC-BD1 (Version 12.0) and RELAP5/MOD1 (Cycle 14), which was initiated at BNL in FY 1982, has been completed in FY 1983. As in previous years, emphasis at Brookhaven has been on simulating various separate-effects tests with these advanced codes and identifying the areas where further thermal-hydraulic modeling improvements are needed. The following six categories of tests were simulated with the above codes: (1) critical flow tests (Moby-Dick nitrogen-water, BNL flashing flow, Marviken Test 24); (2) Counter-Current Flow Limiting (CCFL) tests (University of Houston, Dartmouth College single and parallel tube tests); (3) level swell tests (G.E. large vessel test); (4) steam generator tests (B and W 19-tube model S.G. tests, FLECHT-SEASET U-tube S.G. tests); (5) natural circulation tests (FRIGG loop tests); and (6) post-CHF tests (Oak Ridge steady-state tests).

  6. A 2.9 ps equivalent resolution interpolating time counter based on multiple independent coding lines

    NASA Astrophysics Data System (ADS)

    Szplet, R.; Jachna, Z.; Kwiatkowski, P.; Rozyc, K.

    2013-03-01

    We present the design, operation and test results of a time counter that has an equivalent resolution of 2.9 ps, a measurement uncertainty at the level of 6 ps, and a measurement range of 10 s. The time counter has been implemented in a general-purpose reprogrammable device, the Spartan-6 (Xilinx). To obtain both high precision and a wide measurement range, the counting of periods of a reference clock is combined with a two-stage interpolation within a single period of the clock signal. The interpolation involves a four-phase clock in the first interpolation stage (FIS) and an equivalent coding line (ECL) in the second interpolation stage (SIS). The ECL is created as a compound of independent discrete time coding lines (TCLs). The number of TCLs used to create the virtual ECL has an effect on its resolution. We tested ECLs made from up to 16 TCLs, but the idea may be extended to a larger number of lines. In the presented time counter, the coarse resolution of the counting method, equal to 2 ns (the period of the 500 MHz reference clock), is first improved fourfold in the FIS and then by a factor of more than 400 in the SIS. The proposed solution allows us to overcome the technological limitation in achievable resolution and improve the conversion precision of integrated interpolators based on tapped delay lines.
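
    The gain from combining several independent coding lines can be conveyed with a toy model: each line quantizes the same interval with its own random offset, and averaging the readings shrinks the quantization error roughly as 1/sqrt(M). The bin width and line count below are illustrative assumptions, not the Spartan-6 values, and the real TCL combination is more elaborate than plain averaging.

      import random, statistics

      Q_PS = 500.0   # single-line bin width in picoseconds (assumed)
      M = 16         # number of independent coding lines combined

      offsets = [random.uniform(0, Q_PS) for _ in range(M)]

      def measure(true_ps):
          # Average the M quantized readings of the same time interval.
          reads = [round((true_ps - o) / Q_PS) * Q_PS + o for o in offsets]
          return statistics.fmean(reads)

      errs = [measure(t) - t for t in (random.uniform(0, 1e4) for _ in range(5000))]
      print(f"single-line RMS quantization error ~ {Q_PS / 12 ** 0.5:.0f} ps")
      print(f"{M}-line equivalent-line RMS error ~ {statistics.pstdev(errs):.0f} ps")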

  7. Independent code assessment at BNL in FY 1982 [TRAC-PF1; RELAP5/MOD1; TRAC-BD1]

    SciTech Connect

    Saha, P.; Rohatgi, U.S.; Jo, J.H.; Neymotin, L.; Slovik, G.; Yuelys-Miksis, C.

    1982-01-01

    Independent assessment of the advanced codes such as TRAC and RELAP5 has continued at BNL through the Fiscal Year 1982. The simulation tests can be grouped into the following five categories: critical flow, counter-current flow limiting (CCFL) or flooding, level swell, steam generator thermal performance, and natural circulation. TRAC-PF1 (Version 7.0) and RELAP5/MOD1 (Cycle 14) codes were assessed by simulating all of the above experiments, whereas the TRAC-BD1 (Version 12.0) code was applied only to the CCFL tests. Results and conclusions of the BNL code assessment activity of FY 1982 are summarized below.

  8. Enabling Handicapped Nonreaders to Independently Obtain Information: Initial Development of an Inexpensive Bar Code Reader System.

    ERIC Educational Resources Information Center

    VanBiervliet, Alan

    A project to develop and evaluate a bar code reader system as a self-directed information and instructional aid for handicapped nonreaders is described. The bar code technology involves passing a light-sensitive pen or laser over a printed code with bars which correspond to coded numbers. A system would consist of a portable device which could…

  9. High performance computing aspects of a dimension independent semi-Lagrangian discontinuous Galerkin code

    NASA Astrophysics Data System (ADS)

    Einkemmer, Lukas

    2016-05-01

    The recently developed semi-Lagrangian discontinuous Galerkin approach is used to discretize hyperbolic partial differential equations (usually first order equations). Since these methods are conservative, local in space, and able to limit numerical diffusion, they are considered a promising alternative to more traditional semi-Lagrangian schemes (which are usually based on polynomial or spline interpolation). In this paper, we consider a parallel implementation of a semi-Lagrangian discontinuous Galerkin method for distributed memory systems (so-called clusters). Both strong and weak scaling studies are performed on the Vienna Scientific Cluster 2 (VSC-2). In the case of weak scaling we observe a parallel efficiency above 0.8 for both two and four dimensional problems and up to 8192 cores. Strong scaling results show good scalability to at least 512 cores (we consider problems that can be run on a single processor in reasonable time). In addition, we study the scaling of a two dimensional Vlasov-Poisson solver that is implemented using the framework provided. All of the simulations are conducted in the context of worst case communication overhead; i.e., in a setting where the CFL (Courant-Friedrichs-Lewy) number increases linearly with the problem size. The framework introduced in this paper facilitates a dimension independent implementation of scientific codes (based on C++ templates) using both an MPI and a hybrid approach to parallelization. We describe the essential ingredients of our implementation.
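
    For reference, a minimal sketch of the two scaling metrics used in such studies follows; the timings are hypothetical, not VSC-2 measurements.

      def strong_efficiency(t1, tp, p):
          # Fixed problem size: speedup over 1 core divided by core count.
          return (t1 / tp) / p

      def weak_efficiency(t1, tp):
          # Problem size grows with core count: ideal runtime stays constant.
          return t1 / tp

      print(strong_efficiency(t1=1000.0, tp=2.4, p=512))   # ~0.81
      print(weak_efficiency(t1=100.0, tp=118.0))           # ~0.85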

  10. An investigation of error characteristics and coding performance

    NASA Technical Reports Server (NTRS)

    Ebel, William J.; Ingels, Frank M.

    1992-01-01

    The performance of forward error correcting coding schemes on errors anticipated for the Earth Observation System (EOS) Ku-band downlink is studied. The EOS transmits picture frame data to the ground via the Tracking and Data Relay Satellite System (TDRSS) to a ground-based receiver at White Sands. Due to unintentional RF interference from other systems operating in the Ku band, the noise at the receiver is non-Gaussian, which may result in non-random errors output by the demodulator. That is, the downlink channel cannot be modeled by a simple memoryless Gaussian-noise channel. From previous experience, it is believed that those errors are bursty. The research proceeded by developing a computer based simulation, called Communication Link Error ANalysis (CLEAN), to model the downlink errors, forward error correcting schemes, and interleavers used with TDRSS. To date, the bulk of CLEAN has been written, documented, debugged, and verified. The procedures for utilizing CLEAN to investigate code performance were established and are discussed.
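
    A common way to model such bursty, non-random errors is a two-state (Gilbert-Elliott) channel; the sketch below is an illustration of that general idea with made-up probabilities, not CLEAN's fitted error model.

      import random

      P_G2B, P_B2G = 0.01, 0.20     # good->bad and bad->good transition probabilities
      PE_GOOD, PE_BAD = 1e-5, 0.3   # bit-error probability in each state

      def burst_channel(bits, rng=random.Random(1)):
          state_bad, out = False, []
          for b in bits:
              pe = PE_BAD if state_bad else PE_GOOD
              out.append(b ^ (rng.random() < pe))   # flip with state-dependent prob.
              if state_bad:
                  state_bad = not (rng.random() < P_B2G)
              else:
                  state_bad = rng.random() < P_G2B
          return out

      rx = burst_channel([0] * 10000)
      print(sum(rx), "errors, clustered into bursts")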

  11. Investigation of combined unfolding of neutron spectra using the UMG unfolding codes.

    PubMed

    Roberts, N J

    2007-01-01

    An investigation of the simultaneous unfolding of data from neutron spectrometers using the UMG codes MAXED and GRAVEL has been performed. This approach involves combining the data from the spectrometers before unfolding, thereby performing a single combined unfolding of all the data to yield a final combined spectrum. The study used measured data from three proton recoil counters and also Bonner sphere and proton recoil counter responses calculated from their response functions. In each case, the spectrum derived from combined unfolding is compared with either the spectrum obtained from merging the independently unfolded spectra or the spectrum used to calculate the responses. The advantages and disadvantages of this technique are discussed. PMID:17502320

  12. Characterization and validation of a Monte Carlo code for independent dose calculation in proton therapy treatments with pencil beam scanning

    NASA Astrophysics Data System (ADS)

    Fracchiolla, F.; Lorentini, S.; Widesott, L.; Schwarz, M.

    2015-11-01

    We propose a method of creating and validating a Monte Carlo (MC) model of a proton Pencil Beam Scanning (PBS) machine using only commissioning measurements and avoiding the nozzle modeling. Measurements with a scintillating screen coupled with a CCD camera, an ionization chamber and a Faraday cup were used to model the beam in TOPAS without using any machine parameter information but the virtual source distance from the isocenter. Then the model was validated on simple Spread Out Bragg Peaks (SOBP) delivered in a water phantom and with six realistic clinical plans (many involving 3 or more fields) on an anthropomorphic phantom. In particular, the behavior of the moveable Range Shifter (RS) feature was investigated and its modeling has been proposed. The gamma analysis (3%, 3 mm) was used to compare MC, TPS (XiO-ELEKTA) and measured 2D dose distributions (using radiochromic film). The MC modeling proposed here shows good results in the validation phase, both for simple irradiation geometry (SOBP in water) and for modulated treatment fields (on anthropomorphic phantoms). In particular, head lesions were investigated and both MC and TPS data were compared with measurements. Treatment plans with no RS always showed very good agreement with both of them (γ-Passing Rate (PR) > 95%). Treatment plans in which the RS was needed were also tested and validated. For these treatment plans, MC results showed better agreement with measurements (γ-PR > 93%) than those coming from TPS (γ-PR < 88%). This work shows how to simplify the MC modeling of a PBS machine for proton therapy treatments without accounting for any hardware components and proposes a more reliable RS modeling than the one implemented in our TPS. The validation process has shown that this code is a valid candidate for a completely independent treatment plan dose calculation algorithm. This makes the code an important future tool for the patient-specific QA verification process.
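
    For readers unfamiliar with the gamma analysis used above, here is a minimal sketch under stated assumptions: 1-D dose profiles, global normalization, and a brute-force search; clinical tools work on 2-D/3-D grids and offer further normalization choices. A point passes if some nearby comparison point is within the combined 3% dose / 3 mm distance tolerance.

      import math

      def gamma_pass_rate(measured, computed, spacing_mm,
                          dose_tol=0.03, dist_mm=3.0):
          d_max = max(measured)             # global normalization (assumed)
          passed = 0
          for i, dm in enumerate(measured):
              best = math.inf
              for j, dc in enumerate(computed):
                  dist = abs(i - j) * spacing_mm
                  ddiff = (dc - dm) / d_max
                  best = min(best, (dist / dist_mm) ** 2 + (ddiff / dose_tol) ** 2)
              if best <= 1.0:               # gamma <= 1 at this point
                  passed += 1
          return passed / len(measured)

      film = [0.20, 0.50, 1.00, 0.98, 0.55, 0.21]   # hypothetical measured profile
      mc   = [0.21, 0.52, 0.99, 0.97, 0.52, 0.20]   # hypothetical MC profile
      print(f"gamma pass rate: {gamma_pass_rate(film, mc, spacing_mm=1.0):.0%}")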

  13. Norepinephrine Modulates Coding of Complex Vocalizations in the Songbird Auditory Cortex Independent of Local Neuroestrogen Synthesis

    PubMed Central

    Ikeda, Maaya Z.; Jeon, Sung David; Cowell, Rosemary A.

    2015-01-01

    The catecholamine norepinephrine plays a significant role in auditory processing. Most studies to date have examined the effects of norepinephrine on the neuronal response to relatively simple stimuli, such as tones and calls. It is less clear how norepinephrine shapes the detection of complex syntactical sounds, as well as the coding properties of sensory neurons. Songbirds provide an opportunity to understand how auditory neurons encode complex, learned vocalizations, and the potential role of norepinephrine in modulating the neuronal computations for acoustic communication. Here, we infused norepinephrine into the zebra finch auditory cortex and performed extracellular recordings to study the modulation of song representations in single neurons. Consistent with its proposed role in enhancing signal detection, norepinephrine decreased spontaneous activity and firing during stimuli, yet it significantly enhanced the auditory signal-to-noise ratio. These effects were all mimicked by clonidine, an α-2 receptor agonist. Moreover, a pattern classifier analysis indicated that norepinephrine enhanced the ability of single neurons to accurately encode complex auditory stimuli. Because neuroestrogens are also known to enhance auditory processing in the songbird brain, we tested the hypothesis that norepinephrine actions depend on local estrogen synthesis. Neither norepinephrine nor adrenergic receptor antagonist infusion into the auditory cortex had detectable effects on local estradiol levels. Moreover, pretreatment with fadrozole, a specific aromatase inhibitor, did not block norepinephrine's neuromodulatory effects. Together, these findings indicate that norepinephrine enhances signal detection and information encoding for complex auditory stimuli by suppressing spontaneous “noise” activity and that these actions are independent of local neuroestrogen synthesis. PMID:26109659

  14. Field Dependence/Independence Cognitive Style and Problem Posing: An Investigation with Sixth Grade Students

    ERIC Educational Resources Information Center

    Nicolaou, Aristoklis Andreas; Xistouri, Xenia

    2011-01-01

    Field dependence/independence cognitive style was found to relate to general academic achievement and specific areas of mathematics; in the majority of studies, field-independent students were found to be superior to field-dependent students. The present study investigated the relationship between field dependence/independence cognitive style and…

  15. Investigation of Navier-Stokes Code Verification and Design Optimization

    NASA Technical Reports Server (NTRS)

    Vaidyanathan, Rajkumar

    2004-01-01

    With rapid progress made in employing computational techniques for various complex Navier-Stokes fluid flow problems, design optimization problems traditionally based on empirical formulations and experiments are now being addressed with the aid of computational fluid dynamics (CFD). To be able to carry out an effective CFD-based optimization study, it is essential that the uncertainty and appropriate confidence limits of the CFD solutions be quantified over the chosen design space. The present dissertation investigates the issues related to code verification, surrogate model-based optimization and sensitivity evaluation. For Navier-Stokes (NS) CFD code verification, a least square extrapolation (LSE) method is assessed. This method projects numerically computed NS solutions from multiple, coarser base grids onto a finer grid and improves solution accuracy by minimizing the residual of the discretized NS equations over the projected grid. In this dissertation, the focus is on the finite volume (FV) formulation. The interplay between these concepts and the outcome of LSE, and the effects of solution gradients and singularities, nonlinear physics, and coupling of flow variables on the effectiveness of LSE are investigated. A CFD-based design optimization of a single element liquid rocket injector is conducted with surrogate models developed using response surface methodology (RSM) based on CFD solutions. The computational model consists of the NS equations, finite rate chemistry, and the k-ε turbulence closure. With the aid of these surrogate models, sensitivity and trade-off analyses are carried out for the injector design, whose geometry (hydrogen flow angle, hydrogen and oxygen flow areas and oxygen post tip thickness) is optimized to attain desirable goals in performance (combustion length) and life/survivability (the maximum temperatures on the oxidizer post tip and injector face and a combustion chamber wall temperature). A preliminary multi-objective optimization study is carried out using a geometric mean approach. Following this, sensitivity analyses with the aid of a variance-based non-parametric approach and partial correlation coefficients are conducted, using data available from surrogate models of the objectives and the multi-objective optima, to identify the contribution of the design variables to the objective variability and to analyze the variability of the design variables and the objectives. In summary, the present dissertation offers insight into an improved coarse-to-fine grid extrapolation technique for Navier-Stokes computations and also suggests tools for a designer to conduct design optimization studies and related sensitivity analyses for a given design problem.
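
    The flavor of grid-to-grid extrapolation can be conveyed with classical two-grid Richardson extrapolation, used here as a simplified stand-in (LSE itself fits a correction by minimizing the discretized NS residual over the projected fine grid). The values are illustrative.

      def richardson(u_coarse, u_fine, refinement=2.0, order=2.0):
          # Combine solutions from two grids to cancel the leading error term.
          r = refinement ** order
          return (r * u_fine - u_coarse) / (r - 1.0)

      # Hypothetical values of one flow quantity computed on grids h and h/2.
      u_h, u_h2 = 0.9120, 0.9282
      print(f"extrapolated estimate: {richardson(u_h, u_h2):.4f}")   # 0.9336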

  16. Characterization and validation of a Monte Carlo code for independent dose calculation in proton therapy treatments with pencil beam scanning.

    PubMed

    Fracchiolla, F; Lorentini, S; Widesott, L; Schwarz, M

    2015-11-01

    We propose a method of creating and validating a Monte Carlo (MC) model of a proton Pencil Beam Scanning (PBS) machine using only commissioning measurements and avoiding the nozzle modeling. Measurements with a scintillating screen coupled with a CCD camera, an ionization chamber and a Faraday cup were used to model the beam in TOPAS without using any machine parameter information but the virtual source distance from the isocenter. Then the model was validated on simple Spread Out Bragg Peaks (SOBP) delivered in a water phantom and with six realistic clinical plans (many involving 3 or more fields) on an anthropomorphic phantom. In particular, the behavior of the moveable Range Shifter (RS) feature was investigated and its modeling has been proposed. The gamma analysis (3%, 3 mm) was used to compare MC, TPS (XiO-ELEKTA) and measured 2D dose distributions (using radiochromic film). The MC modeling proposed here shows good results in the validation phase, both for simple irradiation geometry (SOBP in water) and for modulated treatment fields (on anthropomorphic phantoms). In particular, head lesions were investigated and both MC and TPS data were compared with measurements. Treatment plans with no RS always showed very good agreement with both of them (γ-Passing Rate (PR) > 95%). Treatment plans in which the RS was needed were also tested and validated. For these treatment plans, MC results showed better agreement with measurements (γ-PR > 93%) than those coming from TPS (γ-PR < 88%). This work shows how to simplify the MC modeling of a PBS machine for proton therapy treatments without accounting for any hardware components and proposes a more reliable RS modeling than the one implemented in our TPS. The validation process has shown that this code is a valid candidate for a completely independent treatment plan dose calculation algorithm. This makes the code an important future tool for the patient-specific QA verification process. PMID:26501569

  17. Coding tools investigation for next generation video coding based on HEVC

    NASA Astrophysics Data System (ADS)

    Chen, Jianle; Chen, Ying; Karczewicz, Marta; Li, Xiang; Liu, Hongbin; Zhang, Li; Zhao, Xin

    2015-09-01

    The new state-of-the-art video coding standard, H.265/HEVC, was finalized in 2013 and achieves roughly 50% bit rate savings compared to its predecessor, H.264/MPEG-4 AVC. This paper provides evidence that there is still potential for further coding efficiency improvements. A brief overview of HEVC is first given in the paper. Then, our improvements on each main module of HEVC are presented. For instance, the recursive quadtree block structure is extended to support larger coding units and transform units. The motion information prediction scheme is improved by advanced temporal motion vector prediction, which inherits the motion information of each small block within a large block from a temporal reference picture. Cross-component prediction with a linear prediction model improves intra prediction, and overlapped block motion compensation improves the efficiency of inter prediction. Furthermore, coding of both intra and inter prediction residuals is improved by an adaptive multiple transform technique. Finally, in addition to the deblocking filter and SAO, an adaptive loop filter is applied to further enhance the reconstructed picture quality. This paper describes the above-mentioned techniques in detail and evaluates their coding performance benefits based on the common test conditions used during HEVC development. The simulation results show that significant performance improvement over the HEVC standard can be achieved, especially for high resolution video materials.

  18. Investigation of Error Concealment Using Different Transform Codings and Multiple Description Codings

    NASA Astrophysics Data System (ADS)

    Farzamnia, Ali; Syed-Yusof, Sharifah K.; Fisal, Norsheila; Abu-Bakar, Syed A. R.

    2012-05-01

    There has been increasing use of Multiple Description Coding (MDC) for error concealment in non-ideal channels, and many ideas for MDC have been put forward. This paper describes attempts to conceal errors and reconstruct lost descriptions by combining MDC and the lapped orthogonal transform (LOT). In this work, LOT and other transform codings (DCT and wavelet) are used to decorrelate the image pixels in the transform domain. LOT performs better at low bit rates than the DCT and wavelet transforms. The results show that the MSE for the proposed methods has decreased significantly in comparison to DCT and wavelet. The PSNR values of the reconstructed images are high, and the subjective quality of the images is very good and clear. Furthermore, the standard deviations of the reconstructed images are very small, especially in low-capacity channels.
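
    For reference, a minimal sketch of the MSE and PSNR measures cited above; the 8-bit pixel values are hypothetical.

      import math

      def mse(a, b):
          # Mean squared error between two equal-length pixel sequences.
          return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

      def psnr(a, b, peak=255.0):
          # Peak signal-to-noise ratio in dB for 8-bit images.
          m = mse(a, b)
          return math.inf if m == 0 else 10.0 * math.log10(peak ** 2 / m)

      original      = [52, 55, 61, 59, 79, 61, 76, 61]
      reconstructed = [51, 56, 60, 61, 78, 62, 75, 60]
      print(f"MSE = {mse(original, reconstructed):.2f}, "
            f"PSNR = {psnr(original, reconstructed):.1f} dB")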

  19. Investigating the Language and Literacy Skills Required for Independent Online Learning

    ERIC Educational Resources Information Center

    Silver-Pacuilla, Heidi

    2008-01-01

    This investigation was undertaken to determine the threshold levels of literacy and language proficiency necessary for adult learners to use the Internet for independent learning. The report is triangulated around learning from large-scale surveys, learning from the literature, and learning from the field. Reported findings include: (1)…

  1. ADF95: Tool for automatic differentiation of a FORTRAN code designed for large numbers of independent variables

    NASA Astrophysics Data System (ADS)

    Straka, Christian W.

    2005-06-01

    ADF95 is a tool to automatically calculate numerical first derivatives for any mathematical expression as a function of user defined independent variables. Accuracy of derivatives is achieved within machine precision. ADF95 may be applied to any FORTRAN 77/90/95 conforming code and requires minimal changes by the user. It provides a new derived data type that holds the value and derivatives and applies forward differencing by overloading all FORTRAN operators and intrinsic functions. An efficient indexing technique leads to a reduced memory usage and a substantially increased performance gain over other available tools with operator overloading. This gain is especially pronounced for sparse systems with large numbers of independent variables. A wide class of numerical simulations, e.g., those employing implicit solvers, can profit from ADF95.

    Program summary
    Title of program: ADF95
    Catalogue identifier: ADVI
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVI
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Computer for which the program is designed: all platforms with a FORTRAN 95 compiler
    Programming language used: FORTRAN 95
    No. of lines in distributed program, including test data, etc.: 3103
    No. of bytes in distributed program, including test data, etc.: 9862
    Distribution format: tar.gz
    Nature of problem: In many areas of the computational sciences, first order partial derivatives for large and complex sets of equations are needed with machine precision accuracy. For example, any implicit or semi-implicit solver requires the computation of the Jacobian matrix, which contains the first derivatives with respect to the independent variables. ADF95 is a software module to facilitate the automatic computation of the first partial derivatives of any arbitrarily complex mathematical FORTRAN expression. The program exploits the sparsity inherent in many sets of equations, thereby enabling faster computations compared to alternative differentiation tools.
    Solution method: A class is constructed which applies the chain rule of differentiation to any FORTRAN expression, to compute the first derivatives by forward differencing. An efficient indexing technique leads to a reduced memory usage and a substantially increased performance gain when sparsity can be exploited. From a user's point of view, only minimal changes to his/her original code are needed in order to compute the first derivatives of any expression in the code.
    Restrictions: Processor and memory hardware may restrict both the possible number of independent variables and the computation time.
    Unusual features: ADF95 can operate on user code that makes use of the array features introduced in FORTRAN 90. A convenient extraction subroutine for the Jacobian matrix is also provided.
    Running time: In many realistic cases, the evaluation of the first order derivatives of a mathematical expression is only six times slower compared to the evaluation of analytically derived and hard-coded expressions. The actual factor depends on the underlying set of equations for which derivatives are to be calculated, the number of independent variables, the sparsity, and on the FORTRAN 95 compiler.
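
    The operator-overloading forward mode that ADF95 implements for FORTRAN 95 can be sketched in a few lines of Python with a dual-number class; this is an illustration of the technique only (ADF95 additionally stores sparse derivative vectors over many independent variables, for which a dense list stands in here).

      import math

      class Dual:
          # Value plus a vector of partial derivatives (forward mode).
          def __init__(self, value, deriv):
              self.value, self.deriv = value, list(deriv)

          def _lift(self, o):
              return o if isinstance(o, Dual) else Dual(o, [0.0] * len(self.deriv))

          def __add__(self, o):
              o = self._lift(o)
              return Dual(self.value + o.value,
                          [a + b for a, b in zip(self.deriv, o.deriv)])
          __radd__ = __add__

          def __mul__(self, o):
              o = self._lift(o)
              return Dual(self.value * o.value,
                          [self.value * b + o.value * a
                           for a, b in zip(self.deriv, o.deriv)])
          __rmul__ = __mul__

          def sin(self):
              # Chain rule for an intrinsic function, as ADF95 overloads SIN.
              return Dual(math.sin(self.value),
                          [math.cos(self.value) * a for a in self.deriv])

      # Independent variables seeded with unit derivatives.
      x = Dual(1.5, [1.0, 0.0])
      y = Dual(0.5, [0.0, 1.0])
      f = x * y + x.sin()        # f(x, y) = x*y + sin(x)
      print(f.value, f.deriv)    # df/dx = y + cos(x), df/dy = x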

  2. Independent assessment of TRAC-PD2 and RELAP5/MOD1 codes at BNL in FY 1981. [PWR

    SciTech Connect

    Saha, P; Jo, J H; Neymotin, L; Rohatgi, U S; Slovik, G

    1982-12-01

    This report documents the independent assessment calculations performed with the TRAC-PD2 and RELAP5/MOD1 codes at Brookhaven National Laboratory (BNL) during Fiscal Year 1981. A large variety of separate-effects experiments dealing with (1) steady-state and transient critical flow, (2) level swell, (3) flooding and entrainment, (4) steady-state flow boiling, (5) integral economizer once-through steam generator (IEOTSG) performance, (6) bottom reflood, and (7) two-dimensional phase separation of two-phase mixtures were simulated with TRAC-PD2. In addition, the early part of an overcooling transient which occurred at the Rancho Seco nuclear power plant on March 20, 1978 was also computed with an updated version of TRAC-PD2. Three separate-effects tests dealing with (1) transient critical flow, (2) steady-state flow boiling, and (3) IEOTSG performance were also simulated with the RELAP5/MOD1 code. Comparisons between the code predictions and the test data are presented.

  3. Investigation of Bandwidth-Efficient Coding and Modulation Techniques

    NASA Technical Reports Server (NTRS)

    Osborne, William P.

    1992-01-01

    The necessary technology was studied to improve the bandwidth efficiency of the space-to-ground communications network using the current capabilities of that network as a baseline. The study was aimed at making space payloads, for example the Hubble Space Telescope, more capable without the need to completely redesign the link. Particular emphasis was placed on the following concepts: (1) determining the requirements necessary to convert an existing standard 4-ary phase shift keying communications link to one that can support, as a minimum, 8-ary phase shift keying with error correction applied; and (2) determining the feasibility of using the existing equipment configurations with additional signal processing equipment to realize the higher order modulation and coding schemes.

  4. Investigations with methanobacteria and with evolution of the genetic code

    NASA Technical Reports Server (NTRS)

    Jukes, T. H.

    1986-01-01

    Mycoplasma capricolum was found by Osawa et al. to use UGA as the codon for tryptophan and to contain 75% A + T in its DNA. This change could have resulted from evolutionary pressure to replace C + G with A + T. Numerous studies have been reported of the evolution of proteins as measured by the amino acid replacements observed when homologous proteins, such as hemoglobins from various vertebrates, are compared. These replacements result from nucleotide substitutions in amino acid codons in the corresponding genes. Simultaneously, silent nucleotide substitutions take place that can be studied when the sequences of the genes are compared. These silent evolutionary changes take place mostly in third positions of codons. Two types of nucleotide substitutions are recognized: pyrimidine-pyrimidine and purine-purine interchanges (transitions) and pyrimidine-purine interchanges (transversions). Silent transitions are favored when a corresponding transversion would produce an amino acid replacement. Conversely, silent transversions are favored by probability when transitions and transversions would both be silent. Extensive examples of these situations have been found in protein genes, and it is evident that transversions in silent positions predominate in family boxes in most of the examples studied. In associated research, a streptomycete from cow manure was found to produce an extracellular enzyme capable of lysing the pseudomurein-containing methanogen Methanobacterium formicicum.

  5. Approaches to Learning at Work: Investigating Work Motivation, Perceived Workload, and Choice Independence

    ERIC Educational Resources Information Center

    Kyndt, Eva; Raes, Elisabeth; Dochy, Filip; Janssens, Els

    2013-01-01

    Learning and development are taking up a central role in the human resource policies of organizations because of their crucial contribution to the competitiveness of those organizations. The present study investigates the relationship of work motivation, perceived workload, and choice independence with employees' approaches to learning at…

  7. Modality independence of order coding in working memory: Evidence from cross-modal order interference at recall.

    PubMed

    Vandierendonck, André

    2016-01-01

    Working memory researchers do not agree on whether order in serial recall is encoded by dedicated modality-specific systems or by a more general modality-independent system. Although previous research supports the existence of autonomous modality-specific systems, it has been shown that serial recognition memory is prone to cross-modal order interference from concurrent tasks. The present study used a serial recall task, which was performed in a single-task condition and in a dual-task condition with an embedded memory task in the retention interval. The modality of the serial task was either verbal or visuospatial, and the embedded tasks were in the other modality and required either serial or item recall. Care was taken to avoid modality overlaps during presentation and recall. In Experiment 1, visuospatial but not verbal serial recall was more impaired when the embedded task was an order task than when it was an item task. Using a more difficult verbal serial recall task, verbal serial recall was also more impaired by another order recall task in Experiment 2. These findings are consistent with the hypothesis of modality-independent order coding. The implications for views on short-term recall and the multicomponent view of working memory are discussed. PMID:25801664

  8. An early underwater artificial vision model in ocean investigations via independent component analysis.

    PubMed

    Nian, Rui; Liu, Fang; He, Bo

    2013-01-01

    Underwater vision is one of the dominant senses and has shown great prospects in ocean investigations. In this paper, a hierarchical Independent Component Analysis (ICA) framework has been established to explore and understand the functional roles of the higher order statistical structures towards the visual stimulus in the underwater artificial vision system. The model is inspired by characteristics such as the modality, the redundancy reduction, the sparseness and the independence in the early human vision system, which seem respectively to capture the Gabor-like basis functions, the shape contours or the complicated textures in the multiple layer implementations. The simulation results have shown good performance in terms of the effectiveness and consistency of the proposed approach for the underwater images collected by autonomous underwater vehicles (AUVs). PMID:23863855

  9. An Early Underwater Artificial Vision Model in Ocean Investigations via Independent Component Analysis

    PubMed Central

    Nian, Rui; Liu, Fang; He, Bo

    2013-01-01

    Underwater vision is one of the dominant senses and has shown great prospects in ocean investigations. In this paper, a hierarchical Independent Component Analysis (ICA) framework has been established to explore and understand the functional roles of the higher order statistical structures towards the visual stimulus in the underwater artificial vision system. The model is inspired by characteristics such as the modality, the redundancy reduction, the sparseness and the independence in the early human vision system, which seem respectively to capture the Gabor-like basis functions, the shape contours or the complicated textures in the multiple layer implementations. The simulation results have shown good performance in terms of the effectiveness and consistency of the proposed approach for the underwater images collected by autonomous underwater vehicles (AUVs). PMID:23863855

  10. Registered report: Coding-independent regulation of the tumor suppressor PTEN by competing endogenous mRNAs.

    PubMed

    Phelps, Mitch; Coss, Chris; Wang, Hongyan; Cook, Matthew

    2016-01-01

    The Reproducibility Project: Cancer Biology seeks to address growing concerns about reproducibility in scientific research by conducting replications of selected experiments from a number of high-profile papers in the field of cancer biology. The papers, which were published between 2010 and 2012, were selected on the basis of citations and Altmetric scores (Errington et al., 2014). This Registered Report describes the proposed replication plan of key experiments from "Coding-Independent Regulation of the Tumor Suppressor PTEN by Competing Endogenous mRNAs" by Tay and colleagues, published in Cell in 2011 (Tay et al., 2011). The experiments to be replicated are those reported in Figures 3C, 3D, 3G, 3H, 5A and 5B, and in Supplemental Figures 3A and B. Tay and colleagues proposed a new regulatory mechanism based on competing endogenous RNAs (ceRNAs), which regulate target genes by competitive binding of shared microRNAs. They test their model by identifying and confirming ceRNAs that target PTEN. In Figures 3A and B, they report that perturbing expression of putative PTEN ceRNAs affects expression of PTEN. This effect is dependent on functional microRNA machinery (Figures 3G and H), and affects the pathway downstream of PTEN itself (Figures 5A and B). The Reproducibility Project: Cancer Biology is a collaboration between the Center for Open Science and Science Exchange, and the results of the replications will be published by eLife. PMID:26943900

  11. Investigation of the Use of Erasures in a Concatenated Coding Scheme

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.; Marriott, Philip J.

    1997-01-01

    A new method for declaring erasures in a concatenated coding scheme is investigated. This method is used with the rate-1/2, K = 7 convolutional code and the (255, 223) Reed-Solomon code, with errors-and-erasures Reed-Solomon decoding. The proposed erasure method uses a soft-output Viterbi algorithm and information provided by decoded Reed-Solomon codewords in a deinterleaving frame. The results show that a gain of 0.3 dB is possible using a minimum number of decoding trials.
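
    The gain from declaring erasures follows from the Reed-Solomon distance bound: a code with minimum distance d corrects any pattern of E errors and S erasures provided 2E + S <= d - 1, so a correctly declared erasure consumes only half the margin of an undetected error. A minimal sketch of this decodability check for the (255, 223) code (illustrative arithmetic only; the study's decoder itself is not shown):

      def rs_decodable(errors, erasures, n=255, k=223):
          # Minimum distance of a Reed-Solomon code is d = n - k + 1 (here 33)
          d = n - k + 1
          return 2 * errors + erasures <= d - 1

      print(rs_decodable(errors=16, erasures=0))   # True: classic 16-error correction
      print(rs_decodable(errors=10, erasures=12))  # True: erasures cost half as much
      print(rs_decodable(errors=17, erasures=0))   # False: beyond the distance bound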

  12. Binary neutron-star mergers with Whisky and SACRA: First quantitative comparison of results from independent general-relativistic hydrodynamics codes

    SciTech Connect

    Baiotti, Luca; Shibata, Masaru; Yamamoto, Tetsuro

    2010-09-15

    We present the first quantitative comparison of two independent general-relativistic hydrodynamics codes, the whisky code and the sacra code. We compare the output of simulations starting from the same initial data and carried out with the configuration (numerical methods, grid setup, resolution, gauges) which for each code has been found to give consistent and sufficiently accurate results, in particular in terms of the cleanness of the gravitational waveforms. We focus on the quantities that should be conserved during the evolution (rest mass, total mass energy, and total angular momentum) and on the gravitational-wave amplitude and frequency. We find that the results produced by the two codes agree at a reasonable level, with variations in the different quantities, but always to better than about 10%.

  13. Binary neutron-star mergers with Whisky and SACRA: First quantitative comparison of results from independent general-relativistic hydrodynamics codes

    NASA Astrophysics Data System (ADS)

    Baiotti, Luca; Shibata, Masaru; Yamamoto, Tetsuro

    2010-09-01

    We present the first quantitative comparison of two independent general-relativistic hydrodynamics codes, the whisky code and the sacra code. We compare the output of simulations starting from the same initial data and carried out with the configuration (numerical methods, grid setup, resolution, gauges) which for each code has been found to give consistent and sufficiently accurate results, in particular in terms of the cleanness of the gravitational waveforms. We focus on the quantities that should be conserved during the evolution (rest mass, total mass energy, and total angular momentum) and on the gravitational-wave amplitude and frequency. We find that the results produced by the two codes agree at a reasonable level, with variations in the different quantities, but always to better than about 10%.

  14. A computational fluid dynamics code for the investigation of ramjet-in-tube concepts

    NASA Astrophysics Data System (ADS)

    Bogdanoff, D. W.; Brackett, D. C.

    1987-06-01

    An inviscid computational fluid dynamics (CFD) code is presented which can handle multiple component species, simple chemical reactions, a completely general equation of state, and velocities up to hundreds of km/sec. The code can also handle multiple moving zones containing different media. Radiation effects are not included. The code uses third order spatial extrapolation/interpolation of the primitive variables to determine cell boundary values, applies limiting procedures to these values to maintain code stability and accuracy, and then uses Godunov procedures to calculate the cell boundary fluxes. The code numerical methods are presented in some detail, and the results of benchmark test cases used to prove out the code are given. The agreement between the CFD and exact analytical calculations is found to be excellent. The code is used to investigate a ramjet-in-tube concept, in which a projectile flies down a tube filled with a combustible gas mixture. The mixtures studied are H2 plus O2 with excess H2, N2, or CO2 as diluent. The projectile velocity range is 4 to 10 km/sec. Efficiencies up to 0.26 and ratios of effective projectile thrust pressure to maximum cycle pressure up to 0.12 are obtained. Plots of the pressure field around the projectile are presented.
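
    As a minimal illustration of the Godunov flux step described above (reduced here to first order and to scalar linear advection, far simpler than the multi-species reacting solver the paper presents), one update can be written as follows, assuming a positive advection speed:

      import numpy as np

      def godunov_step(u, a, dx, dt):
          """One first-order Godunov update for u_t + a*u_x = 0 with a > 0.

          For this linear problem the exact Riemann solution at each interface
          is simply the upwind state, so the interface flux is a*u_left.
          """
          flux = a * u
          unew = u.copy()
          unew[1:] -= dt / dx * (flux[1:] - flux[:-1])
          return unew

      x = np.linspace(0.0, 1.0, 200)
      u = np.where(np.abs(x - 0.3) < 0.1, 1.0, 0.0)   # square pulse initial data
      for _ in range(100):
          u = godunov_step(u, a=1.0, dx=x[1] - x[0], dt=0.4 * (x[1] - x[0]))  # CFL 0.4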

  15. Registered report: Coding-independent regulation of the tumor suppressor PTEN by competing endogenous mRNAs

    PubMed Central

    Phelps, Mitch; Coss, Chris; Wang, Hongyan; Cook, Matthew

    2016-01-01

    The Reproducibility Project: Cancer Biology seeks to address growing concerns about reproducibility in scientific research by conducting replications of selected experiments from a number of high-profile papers in the field of cancer biology. The papers, which were published between 2010 and 2012, were selected on the basis of citations and Altmetric scores (Errington et al., 2014). This Registered Report describes the proposed replication plan of key experiments from "Coding-Independent Regulation of the Tumor Suppressor PTEN by Competing Endogenous mRNAs" by Tay and colleagues, published in Cell in 2011 (Tay et al., 2011). The experiments to be replicated are those reported in Figures 3C, 3D, 3G, 3H, 5A and 5B, and in Supplemental Figures 3A and B. Tay and colleagues proposed a new regulatory mechanism based on competing endogenous RNAs (ceRNAs), which regulate target genes by competitive binding of shared microRNAs. They test their model by identifying and confirming ceRNAs that target PTEN. In Figures 3A and B, they report that perturbing expression of putative PTEN ceRNAs affects expression of PTEN. This effect is dependent on functional microRNA machinery (Figures 3G and H), and affects the pathway downstream of PTEN itself (Figures 5A and B). The Reproducibility Project: Cancer Biology is a collaboration between the Center for Open Science and Science Exchange, and the results of the replications will be published by eLife. DOI: http://dx.doi.org/10.7554/eLife.12470.001 PMID:26943900

  16. Final report of the independent counsel for Iran/Contra matters. Volume 1: Investigations and prosecutions

    SciTech Connect

    Walsh, L.E.

    1993-08-04

    In October and November 1986, two secret U.S. Government operations were publicly exposed, potentially implicating Reagan Administration officials in illegal activities. These operations were the provision of assistance to the military activities of the Nicaraguan contra rebels during an October 1984 to October 1986 prohibition on such aid, and the sale of U.S. arms to Iran in contravention of stated U.S. policy and in possible violation of arms-export controls. In late November 1986, Reagan Administration officials announced that some of the proceeds from the sale of U.S. arms to Iran had been diverted to the contras. As a result of the exposure of these operations, Attorney General Edwin Meese III sought the appointment of an independent counsel to investigate and, if necessary, prosecute possible crimes arising from them. This is the final report of that investigation.

  17. A Coding System with Independent Annotations of Gesture Forms and Functions during Verbal Communication: Development of a Database of Speech and GEsture (DoSaGE)

    PubMed Central

    Kong, Anthony Pak-Hin; Law, Sam-Po; Kwan, Connie Ching-Yin; Lai, Christy; Lam, Vivian

    2014-01-01

    Gestures are commonly used together with spoken language in human communication. One major limitation of gesture investigations in the existing literature lies in the fact that the coding of forms and functions of gestures has not been clearly differentiated. This paper first describes a recently developed Database of Speech and GEsture (DoSaGE), based on independent annotation of gesture forms and functions among 119 neurologically unimpaired right-handed native speakers of Cantonese (divided into three age and two education levels), and then presents findings of an investigation examining how gesture use was related to age and linguistic performance. Consideration of these two factors, for which normative data are currently very limited or lacking in the literature, is relevant and necessary when one evaluates gesture employment among individuals with and without language impairment. Three speech tasks, including a monologue on a personally important event, sequential description, and story-telling, were used for elicitation. The EUDICO Linguistic ANnotator (ELAN) software was used to independently annotate each participant's transcript for linguistic information, the forms of gestures used, and the function of each gesture. About one-third of the subjects did not use any co-verbal gestures. While the majority of gestures were non-content-carrying, functioning mainly to reinforce speech intonation or control speech flow, the content-carrying ones were used to enhance speech content. Furthermore, individuals who were younger or linguistically more proficient tended to use fewer gestures, suggesting that normal speakers gesture differently as a function of age and linguistic performance. PMID:25667563

  18. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Astrophysics Data System (ADS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-07-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional numerical electromagnetic scattering codes based upon the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied codes are two versions of our current two dimensional FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR, TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  19. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Astrophysics Data System (ADS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-11-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional electromagnetic scattering codes based on the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied version of the codes are two versions of our current FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  20. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional numerical electromagnetic scattering codes based upon the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied codes are two versions of our current two dimensional FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR, TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.
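
    For orientation, the heart of any such FDTD code is the leapfrog Yee update. The sketch below is a free-space, normalized-units TM update in Python; it illustrates the scheme only and is not the TEA/TMA FORTRAN implementation:

      import numpy as np

      nx = ny = 100
      Ez = np.zeros((nx, ny))
      Hx = np.zeros((nx, ny))
      Hy = np.zeros((nx, ny))
      c = 0.5  # Courant number in grid units

      for n in range(200):
          # Magnetic fields advance half a step from the curl of Ez
          Hx[:, :-1] -= c * (Ez[:, 1:] - Ez[:, :-1])
          Hy[:-1, :] += c * (Ez[1:, :] - Ez[:-1, :])
          # Ez advances from the curl of H
          Ez[1:, 1:] += c * ((Hy[1:, 1:] - Hy[:-1, 1:]) - (Hx[1:, 1:] - Hx[1:, :-1]))
          # Soft Gaussian source at the grid center
          Ez[nx // 2, ny // 2] += np.exp(-((n - 30) / 10.0) ** 2)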

  1. User's manual for three dimensional FDTD version C code for scattering from frequency-independent dielectric and magnetic materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.

  2. User's manual for three dimensional FDTD version C code for scattering from frequency-independent dielectric and magnetic materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three-dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain (FDTD) technique. The supplied version of the code is one version of our current three-dimensional FDTD code set. The manual given here provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing radar cross section computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  3. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional electromagnetic scattering codes based on the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied version of the codes are two versions of our current FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  4. User's manual for three dimensional FDTD version A code for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain (FDTD) Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain technique. The supplied version of the code is one version of our current three dimensional FDTD code set. The manual provides a description of the code and the corresponding results for the default scattering problem. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version A code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONA.FOR), a section briefly discussing radar cross section (RCS) computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  5. Your ticket to independence: a guide to getting your first principal investigator position.

    PubMed

    Káradóttir, Ragnhildur Thóra; Letzkus, Johannes J; Mameli, Manuel; Ribeiro, Carlos

    2015-10-01

    The transition to scientific independence as a principal investigator (PI) can seem like a daunting and mysterious process to postdocs and students - something that many aspire to while at the same time wondering how to achieve this goal and what being a PI really entails. The FENS Kavli Network of Excellence (FKNE) is a group of young faculty who have recently completed this step in various fields of neuroscience across Europe. In a series of opinion pieces from FKNE scholars, we aim to demystify this process and to offer the next generation of up-and-coming PIs some advice and personal perspectives on the transition to independence, starting here with guidance on how to get hired to your first PI position. Rather than providing an exhaustive overview of all facets of the hiring process, we focus on a few key aspects that we have learned to appreciate in the quest for our own labs: What makes a research programme exciting and successful? How can you identify great places to apply to and make sure your application stands out? What are the key objectives for the job talk and the interview? How do you negotiate your position? And finally, how do you decide on a host institute that lets you develop both scientifically and personally in your new role as head of a lab? PMID:26286226

  6. 78 FR 37571 - Certain Opaque Polymers; Institution of Investigation Pursuant to United States Code

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-21

    ... COMMISSION Certain Opaque Polymers; Institution of Investigation Pursuant to United States Code AGENCY: U.S... importation, and the sale within the United States after importation of certain opaque polymers by reason of... importation, or the sale within the United States after importation of certain opaque polymers that...

  7. Code-Switching in Iranian Elementary EFL Classrooms: An Exploratory Investigation

    ERIC Educational Resources Information Center

    Rezvani, Ehsan; Street, Hezar Jerib; Rasekh, Abbass Eslami

    2011-01-01

    This paper presents the results of a small-scale exploratory investigation of code-switching (CS) between English and Farsi by 4 Iranian English foreign language (EFL) teachers in elementary level EFL classrooms in a language school in Isfahan, Iran. Specifically, the present study aimed at exploring the syntactical identification of switches and…

  8. A model to investigate the mechanisms underlying the emergence and development of independent sitting.

    PubMed

    O'Brien, Kathleen M; Zhang, Jing; Walley, Philip R; Rhoads, Jeffrey F; Haddad, Jeffrey M; Claxton, Laura J

    2015-07-01

    When infants first begin to sit independently, they are highly unstable and unable to maintain upright sitting posture for more than a few seconds. Over the course of 3 months, the sitting ability of infants drastically improves. To investigate the mechanisms controlling the development of sitting posture, a single-degree-of-freedom inverted pendulum model was developed. Passive muscle properties were modeled with a stiffness and damping term, while active neurological control was modeled with a time-delayed proportional-integral-derivative (PID) controller. The findings of the simulations suggest that infants primarily utilize passive muscle stiffness to remain upright when they first begin to sit. This passive control mechanism allows the infant to remain upright so that active feedback control mechanisms can develop. The emergence of active control mechanisms allows infants to integrate sensory information into their movements so that they can exhibit more adaptive sitting. PMID:25442426
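
    A numerical sketch of this class of model, an inverted pendulum stabilized by passive stiffness and damping plus a time-delayed PID controller, is given below. All parameter values are illustrative stand-ins, not the values fitted to infant data in the study:

      import numpy as np

      dt = 0.001                    # integration step (s)
      delay = int(0.1 / dt)         # 100 ms neural feedback delay, in steps
      m, L, g = 5.0, 0.3, 9.81      # trunk mass (kg), COM height (m), gravity
      I = m * L**2                  # moment of inertia about the hip
      k_pas, b_pas = 10.0, 1.0      # passive muscle stiffness and damping
      Kp, Ki, Kd = 25.0, 5.0, 2.0   # active PID gains

      theta, omega, integ = 0.05, 0.0, 0.0   # small initial lean (rad)
      hist = [theta] * (delay + 2)           # angle history seen by the controller

      for _ in range(5000):
          th_d, th_d_prev = hist[-delay - 1], hist[-delay - 2]   # delayed angle
          integ += th_d * dt
          u_active = -(Kp * th_d + Ki * integ + Kd * (th_d - th_d_prev) / dt)
          u_passive = -k_pas * theta - b_pas * omega             # acts without delay
          # Gravity destabilizes the pendulum; passive and active torques restore it
          alpha = (m * g * L * np.sin(theta) + u_passive + u_active) / I
          omega += alpha * dt
          theta += omega * dt
          hist.append(theta)

    Setting the PID gains to zero reproduces the paper's qualitative point: passive stiffness alone holds the pendulum upright only if it exceeds the gravitational load, while the delayed active feedback is what permits adaptive corrections.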

  9. Semantic association investigated with fMRI and independent component analysis

    PubMed Central

    Kim, Kwang Ki; Karunanayaka, Prasanna; Privitera, Michael D.; Holland, Scott K.; Szaflarski, Jerzy P.

    2010-01-01

    Semantic association, an essential element of human language, enables discourse and inference. Neuroimaging studies have revealed localization and lateralization of semantic circuitry, making substantial contributions to cognitive neuroscience. However, due to methodological limitations, these investigations have only identified individual functional components rather than capturing the behavior of the entire network. To overcome these limitations, we have implemented group independent component analysis (ICA) to investigate the cognitive modules used by healthy adults performing an fMRI semantic decision task. When compared to the results of a standard GLM analysis, ICA detected several additional brain regions subserving semantic decision. Eight task-related group ICA maps were identified, including left inferior frontal gyrus (BA44/45), middle posterior temporal gyrus (BA39/22), angular gyrus/inferior parietal lobule (BA39/40), posterior cingulate (BA30), bilateral lingual gyrus (BA18/23), inferior frontal gyrus (L>R, BA47), hippocampus with parahippocampal gyrus (L>R, BA35/36) and anterior cingulate (BA32/24). While most of the components were represented bilaterally, we found a single, highly left-lateralized component that included the inferior frontal gyrus and the medial and superior temporal gyri, the angular and supramarginal gyri, and the inferior parietal cortex. The presence of these spatially independent ICA components implies functional connectivity and can be equated with their modularity. These results are analyzed and presented in the framework of a biologically plausible theoretical model in preparation for similar analyses in patients with right- or left-hemispheric epilepsies. PMID:21296027
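
    As a schematic of the spatial ICA decomposition described here, consider the sketch below, which uses synthetic stand-in data and FastICA from scikit-learn in place of the group ICA pipeline (a real analysis would first concatenate subjects and reduce dimensionality):

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(0)
      n_time, n_voxels, n_comp = 120, 5000, 8
      maps_true = rng.laplace(size=(n_voxels, n_comp))   # non-Gaussian spatial sources
      tc_true = rng.standard_normal((n_time, n_comp))    # component timecourses
      X = tc_true @ maps_true.T                          # data matrix: time x voxels

      ica = FastICA(n_components=n_comp, random_state=0)
      spatial_maps = ica.fit_transform(X.T)  # (n_voxels, n_comp) independent maps
      timecourses = ica.mixing_              # (n_time, n_comp) associated timecourses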

  10. Investigating the Magnetorotational Instability with Dedalus, an Open-Source Hydrodynamics Code

    SciTech Connect

    Burns, Keaton J; /UC, Berkeley, aff SLAC

    2012-08-31

    The magnetorotational instability is a fluid instability that causes the onset of turbulence in discs with poloidal magnetic fields. It is believed to be an important mechanism in the physics of accretion discs, namely in its ability to transport angular momentum outward. A similar instability arising in systems with a helical magnetic field may be easier to produce in laboratory experiments using liquid sodium, but the applicability of this phenomenon to astrophysical discs is unclear. To explore and compare the properties of these standard and helical magnetorotational instabilities (MRI and HMRI, respectively), magnetohydrodynamic (MHD) capabilities were added to Dedalus, an open-source hydrodynamics simulator. Dedalus is a Python-based pseudospectral code that uses external libraries and parallelization with the goal of achieving speeds competitive with codes implemented in lower-level languages. This paper outlines the MHD equations as implemented in Dedalus, the steps taken to improve the performance of the code, and the status of MRI investigations using Dedalus.
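
    The pseudospectral idea Dedalus builds on can be shown in a few lines of NumPy: differentiate by multiplying Fourier coefficients by ik and transforming back. This illustrates the generic method only, not Dedalus's actual interface:

      import numpy as np

      N, L = 256, 2.0 * np.pi
      x = np.linspace(0.0, L, N, endpoint=False)
      u = np.sin(3.0 * x)

      k = 2.0 * np.pi * np.fft.fftfreq(N, d=L / N)   # angular wavenumbers
      du = np.fft.ifft(1j * k * np.fft.fft(u)).real  # spectrally accurate du/dx

      print(np.max(np.abs(du - 3.0 * np.cos(3.0 * x))))  # error near machine precision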

  11. Investigation of Coded Source Neutron Imaging at the North Carolina State University PULSTAR Reactor

    SciTech Connect

    Xiao, Ziyu; Mishra, Kaushal; Hawari, Ayman; Bingham, Philip R; Bilheux, Hassina Z; Tobin Jr, Kenneth William

    2010-01-01

    A neutron imaging facility is located on beam-tube #5 of the 1-MWth PULSTAR reactor at the North Carolina State University. An investigation has been initiated to explore the application of coded imaging techniques at the facility. Coded imaging uses a mosaic of pinholes to encode an aperture, thus generating an encoded image of the object at the detector. To reconstruct the image recorded by the detector, corresponding decoding patterns are used. The optimized design of coded masks is critical for the performance of this technique and will depend on the characteristics of the imaging beam. In this work, Monte Carlo (MCNP) simulations were utilized to explore the needed modifications to the PULSTAR thermal neutron beam to support coded imaging techniques. In addition, an assessment of coded mask design has been performed. The simulations indicated that a 12 inch single crystal sapphire filter is suited for such an application at the PULSTAR beam in terms of maximizing flux with a good neutron-to-gamma ratio. Computational simulations demonstrate the feasibility of correlation reconstruction methods for neutron transmission imaging. A gadolinium aperture with a thickness of 500 μm was used to construct the mask using a 38 × 34 URA pattern. A test experiment using such a URA design has been conducted and the point spread function of the system has been measured.
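
    The correlation reconstruction works because a URA and its decoding pattern have a delta-like cyclic cross-correlation, so correlating the detector image with the decoding pattern recovers the object. The Python toy below substitutes a random binary mask for the actual 38 × 34 URA, so the recovery is only approximate:

      import numpy as np

      rng = np.random.default_rng(1)
      mask = rng.integers(0, 2, (38, 34)).astype(float)  # stand-in aperture pattern
      G = 2.0 * mask - 1.0                               # balanced decoding pattern

      obj = np.zeros_like(mask)
      obj[19, 17] = 1.0                                  # a point source
      # Detector records the object cyclically convolved with the mask
      det = np.fft.ifft2(np.fft.fft2(obj) * np.fft.fft2(mask)).real
      # Decode by cyclic correlation with G
      recon = np.fft.ifft2(np.fft.fft2(det) * np.conj(np.fft.fft2(G))).real
      print(np.unravel_index(recon.argmax(), recon.shape))  # peak at/near (19, 17)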

  12. Coding for stable transmission of W-band radio-over-fiber system using direct-beating of two independent lasers.

    PubMed

    Yang, L G; Sung, J Y; Chow, C W; Yeh, C H; Cheng, K T; Shi, J W; Pan, C L

    2014-10-20

    We experimentally demonstrate a Manchester (MC) coding based W-band (75-110 GHz) radio-over-fiber (ROF) system that uses spectral shaping to reduce the low-frequency-component (LFC) signal distortion generated by two independent low-cost lasers. Hence, a low-cost and higher performance W-band ROF system is achieved. In this system, direct beating of two independent low-cost CW lasers without a frequency tracking circuit (FTC) is used to generate the millimeter-wave. Approaches such as delayed self-heterodyne interferometry and heterodyne beating are used to characterize the optical-beating-interference sub-terahertz signal (OBIS). Furthermore, W-band ROF systems using MC coding and NRZ-OOK are compared and discussed. PMID:25401641
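
    Manchester coding suppresses those low-frequency components because every bit period contains a transition, which pushes signal energy away from DC. A sketch of the encoder under one common convention (the opposite bit-to-chip mapping is equally valid):

      def manchester_encode(bits):
          """Map each bit to a half-period chip pair: 1 -> (1, 0), 0 -> (0, 1)."""
          chips = []
          for b in bits:
              chips.extend((1, 0) if b else (0, 1))
          return chips

      print(manchester_encode([1, 0, 1, 1]))  # [1, 0, 0, 1, 1, 0, 1, 0]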

  13. ALS beamlines for independent investigators: A summary of the capabilities and characteristics of beamlines at the ALS

    SciTech Connect

    Not Available

    1992-08-01

    There are two modes of conducting research at the ALS: to work as a member of a participating research team (PRT), or to work as an independent investigator. PRTs are responsible for building beamlines, end stations, and, in some cases, insertion devices. Thus, PRT members have privileged access to the ALS. Independent investigators will use beamline facilities made available by PRTs. The purpose of this handbook is to describe these facilities.

  14. RACE, CODE OF THE STREET, AND VIOLENT DELINQUENCY: A MULTILEVEL INVESTIGATION OF NEIGHBORHOOD STREET CULTURE AND INDIVIDUAL NORMS OF VIOLENCE*

    PubMed Central

    Stewart, Eric A.; Simons, Ronald L.

    2011-01-01

    The study outlined in this article drew on Elijah Anderson’s (1999) code of the street perspective to examine the impact of neighborhood street culture on violent delinquency. Using data from more than 700 African American adolescents, we examined 1) whether neighborhood street culture predicts adolescent violence above and beyond an adolescent’s own street code values and 2) whether neighborhood street culture moderates individual-level street code values on adolescent violence. Consistent with Anderson’s hypotheses, neighborhood street culture significantly predicts violent delinquency independent of individual-level street code effects. Additionally, neighborhood street culture moderates individual-level street code values on violence in neighborhoods where the street culture is widespread. In particular, the effect of street code values on violence is enhanced in neighborhoods where the street culture is endorsed widely. PMID:21666759

  15. Flight Investigation of Prescribed Simultaneous Independent Surface Excitations for Real-Time Parameter Identification

    NASA Technical Reports Server (NTRS)

    Moes, Timothy R.; Smith, Mark S.; Morelli, Eugene A.

    2003-01-01

    Near real-time stability and control derivative extraction is required to support flight demonstration of Intelligent Flight Control System (IFCS) concepts being developed by NASA, academia, and industry. Traditionally, flight maneuvers would be designed and flown to obtain stability and control derivative estimates using a postflight analysis technique. The goal of the IFCS concept is to be able to modify the control laws in real time for an aircraft that has been damaged in flight. In some IFCS implementations, real-time parameter identification (PID) of the stability and control derivatives of the damaged aircraft is necessary for successfully reconfiguring the control system. This report investigates the usefulness of Prescribed Simultaneous Independent Surface Excitations (PreSISE) to provide data for rapidly obtaining estimates of the stability and control derivatives. Flight test data were analyzed using both equation-error and output-error PID techniques. The equation-error PID technique is known as Fourier Transform Regression (FTR) and is a frequency-domain real-time implementation. Selected results were compared with a time-domain output-error technique. The real-time equation-error technique combined with the PreSISE maneuvers provided excellent derivative estimation in the longitudinal axis. However, the PreSISE maneuvers as presently defined were not adequate for accurate estimation of the lateral-directional derivatives.

  16. Investigation of blood mRNA biomarkers for suicidality in an independent sample

    PubMed Central

    Mullins, N; Hodgson, K; Tansey, K E; Perroud, N; Maier, W; Mors, O; Rietschel, M; Hauser, J; Henigsberg, N; Souery, D; Aitchison, K; Farmer, A; McGuffin, P; Breen, G; Uher, R; Lewis, C M

    2014-01-01

    Changes in the blood expression levels of the SAT1, PTEN, MAP3K3 and MARCKS genes have been reported as biomarkers of a high- versus low-suicidality state (Le-Niculescu et al.). Here, we investigate these expression biomarkers in the Genome-Based Therapeutic Drugs for Depression (GENDEP) study of patients with major depressive disorder on a 12-week antidepressant treatment. Blood gene expression levels were available at baseline and week 8 for patients who experienced suicidal ideation during the study (n=20) versus those who did not (n=37). The analysis is well powered to detect the effect sizes reported in the original paper. Within either group, there was no significant change in the expression of these four genes over the course of the study, despite increasing suicidal ideation or initiation of antidepressant treatment. Comparison of the groups showed that gene expression did not differ between patients with or without treatment-related suicidality. This independent study does not support the validity of the proposed biomarkers. PMID:25350297

  17. Detailed investigation of Long-Period activity at Campi Flegrei by Convolutive Independent Component Analysis

    NASA Astrophysics Data System (ADS)

    Capuano, P.; De Lauro, E.; De Martino, S.; Falanga, M.

    2016-04-01

    This work is devoted to the analysis of seismic signals continuously recorded at the Campi Flegrei Caldera (Italy) during the entire year 2006. The radiation pattern associated with the Long-Period energy release is investigated. We adopt an innovative Independent Component Analysis algorithm for convolutive seismic series, adapted and improved to give automatic procedures for detecting seismic events often buried in high-level ambient noise. The extracted waveforms, characterized by an improved signal-to-noise ratio, allow the recognition of Long-Period precursors, evidencing that the seismic activity accompanying the mini-uplift crisis (in 2006), which climaxed in the three days from 26-28 October, had already started at the beginning of October and lasted until mid-November. Hence, a more complete seismic catalog is provided, which can be used to properly quantify the seismic energy release. To better ground our results, we first check the robustness of the method by comparing it with other blind source separation methods based on higher order statistics; secondly, we reconstruct the radiation patterns of the extracted Long-Period events in order to link the individuated signals directly to the sources. We take advantage of Convolutive Independent Component Analysis, which provides basic signals along the three directions of motion so that a direct polarization analysis can be performed without further filtering procedures. We show that the extracted signals are mainly composed of P waves with radial polarization pointing to the seismic source of the main LP swarm, i.e. a small area in the Solfatara, also in the case of the small events that both precede and follow the main activity. From a dynamical point of view, they can be described by two degrees of freedom, indicating a low level of complexity associated with the vibrations from a superficial hydrothermal system. Our results allow us to move towards a full description of the complexity of the source, which can be used, by means of the small-intensity precursors, for hazard-model development and forecast-model testing, showing an illustrative example of the applicability of the CICA method to regions with low seismicity and high ambient noise.

  18. An investigation on the capabilities of the PENELOPE MC code in nanodosimetry

    SciTech Connect

    Bernal, M. A.; Liendo, J. A.

    2009-02-15

    The Monte Carlo (MC) method has been widely implemented in studies of radiation effects on human genetic material. Most of these works have used specific-purpose MC codes to simulate radiation transport in condensed media. PENELOPE is one of the general-purpose MC codes that has been used in many applications related to radiation dosimetry. Based on the fact that PENELOPE can carry out event-by-event coupled electron-photon transport simulations following these particles down to energies of the order of a few tens of eV, we have decided to investigate the capabilities of this code in the field of nanodosimetry. Single and double strand break probabilities due to the direct impact of gamma rays originating from Co-60 and Cs-137 isotopes and characteristic x-rays, from the Al and C K-shells, have been determined by use of PENELOPE. Indirect damage has not been accounted for in this study. A human genetic material geometrical model has been developed, taking into account five organizational levels. In an article by Friedland et al. [Radiat. Environ. Biophys. 38, 39-47 (1999)], a specific-purpose MC code and a very sophisticated DNA geometrical model were used. We have chosen that work as a reference to compare our results. Single and double strand-break probabilities obtained here underestimate those reported by Friedland and co-workers by 20%-76% and 50%-60%, respectively. However, we obtain RBE values for Cs-137, Al K and C K radiations in agreement with those reported in previous works [Radiat. Environ. Biophys. 38, 39-47 (1999)] and [Phys. Med. Biol. 53, 233-244 (2008)]. Some enhancements can be incorporated into the PENELOPE code to improve its results in the nanodosimetry field.

  19. An investigation of design optimization using a 2-D viscous flow code with multigrid

    NASA Technical Reports Server (NTRS)

    Doria, Michael L.

    1990-01-01

    Computational fluid dynamics (CFD) codes have advanced to the point where they are effective analytical tools for solving flow fields around complex geometries. There is also a need for their use as a design tool to find optimum aerodynamic shapes. In the area of design, however, a difficulty arises from the large amount of computer resources required by these codes. It is desired to streamline the design process so that a large number of design options and constraints can be investigated without overloading the system. Several techniques have been proposed to help streamline the design process; the feasibility of one of these, the interaction of the geometry change with the flow calculation, is investigated here. The test problem is finding the value of camber which maximizes the ratio of lift over drag for a particular airfoil: a NACA 0012 airfoil at a free stream Mach number of 0.5 and zero angle of attack, with camber added to the mean line of the airfoil. The flow code used was FLOMGE, a two-dimensional viscous flow solver which uses multigrid to speed up convergence. A hyperbolic grid generation program was used to construct the grid for each value of camber.
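
    Structurally, the design loop amounts to wrapping the flow solver in a one-dimensional optimizer over camber. In the sketch below, lift_over_drag is a hypothetical, cheap surrogate standing in for a full FLOMGE run, not any real interface of that code:

      from scipy.optimize import minimize_scalar

      def lift_over_drag(camber):
          # Hypothetical surrogate: a real evaluation would regenerate the grid
          # and converge the viscous flow solution at this camber value.
          lift = 0.2 + 8.0 * camber
          drag = 0.01 + 40.0 * camber**2
          return -(lift / drag)  # negated because the optimizer minimizes

      res = minimize_scalar(lift_over_drag, bounds=(0.0, 0.1), method="bounded")
      print("camber maximizing L/D:", res.x)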

  20. Geochemical and isotopic investigations on groundwater residence time and flow in the Independence Basin, Mexico

    NASA Astrophysics Data System (ADS)

    Mahlknecht, J.; Gárfias-Solis, J.; Aravena, R.; Tesch, R.

    2006-06-01

    The Independence Basin in the semi-arid Guanajuato state of central Mexico is facing a serious groundwater deficit due to increasing demand linked to rapid population growth and agricultural development. This problem is aggravated by an inadequate evaluation of groundwater resources in the region. Geochemistry and isotopic tracers were used in order to investigate the groundwater flow system and estimate the groundwater residence time. The groundwater is characterized by low salinity, with some exceptions associated with a contribution of more saline groundwater from deep formations. The predominant reactions are CO2 gas dissolution, carbonate dissolution, albite weathering, and kaolinite and chalcedony precipitation. Six principal hydrochemical zones were recognized, which provided information on plausible recharge sources and groundwater chemical evolution. The 14C concentration varies between 19 and 94 pmc. The high 14C values indicating recent recharge are observed at the basin margins, and a trend to lower 14C values is observed along the modern groundwater flow paths. The groundwater residence time according to radiocarbon estimations ranges between recent and ~11 ka. The residence time distribution matches the regionally important discharge zones in the basin center (west from Dolores Hidalgo and southwest from Doctor Mora). Hydrochemical tracers are in general agreement with the predevelopment and current hydraulic-head configuration; however, they show some inconsistencies with the predevelopment head in the downgradient areas, which means that the impact of gradually increasing groundwater extraction during the last decades is reflected in the radiocarbon age distribution. Geochemical evidence implies that the recharge input from the northern basin area is insignificant.

  1. Investigation of in-band transmission of both spectral amplitude coding/optical code division multiple-access and wavelength division multiplexing signals

    NASA Astrophysics Data System (ADS)

    Ashour, Isaac A. M.; Shaari, Sahbudin; Shalaby, Hossam M. H.; Menon, P. Susthitha

    2011-06-01

    The transmission of both optical code division multiple-access (OCDMA) and wavelength division multiplexing (WDM) users on the same band is investigated. Code pulses of spectral amplitude coding (SAC)/optical code division multiple-access (OCDMA) are overlaid onto a multichannel WDM system. Notch filters are utilized in order to suppress the WDM interference signals for detection of optical broadband CDMA signals. Modified quadratic congruence (MQC) codes are used as the signature codes for the SAC/OCDMA system. The proposed system is simulated and its performance in terms of both the bit-error rate and Q-factor is determined. In addition, the eavesdropper probability of error-free code detection is evaluated. Our results are compared to traditional nonhybrid systems. It is concluded that the proposed hybrid scheme still achieves acceptable performance and provides enhanced data confidentiality as compared to the scheme with SAC/OCDMA only. It is also shown that the performance of the proposed system is limited by the interference of the WDM signals. Furthermore, the simulation illustrates the tradeoff between performance and confidentiality for authorized users.

  2. Investigation of Cool and Hot Executive Function in ODD/CD Independently of ADHD

    ERIC Educational Resources Information Center

    Hobson, Christopher W.; Scott, Stephen; Rubia, Katya

    2011-01-01

    Background: Children with oppositional defiant disorder/conduct disorder (ODD/CD) have shown deficits in "cool" abstract-cognitive, and "hot" reward-related executive function (EF) tasks. However, it is currently unclear to what extent ODD/CD is associated with neuropsychological deficits, independently of attention deficit hyperactivity disorder

  3. An Investigation of Independent Child Behavior in the Open Classroom: The Classroom Attitude Observation Schedule (CAOS).

    ERIC Educational Resources Information Center

    Goldupp, Ocea

    The Classroom Attitude Observation Schedule was developed and field tested for study of independent child behavior in the open classroom. Eight Head Start classrooms were used for field testing, six of which used the Tucson Early Education Model curriculum and two of which, for comparison, used local curricula. Procedures involved observing and…

  4. A Longitudinal Investigation of Field Dependence-Independence and the Development of Formal Operational Thought.

    ERIC Educational Resources Information Center

    Flexer, B.K.; Roberge, J.J.

    1983-01-01

    A longitudinal study among American adolescents revealed (1) an insignificant impact of field dependence-independence on the development of formal operational thought; (2) continuous development of combinatorial reasoning and propositional logic abilities, but little increase in comprehension of proportionality; and (3) sex differences in formal…

  5. After a Long-Term Placement: Investigating Educational Achievement, Behaviour, and Transition to Independent Living

    ERIC Educational Resources Information Center

    Dumaret, Annick-Camille; Donati, Pascale; Crost, Monique

    2011-01-01

    This study describes the transition towards independent living of 123 formerly fostered young people reared for long periods in a private French organisation, SOS Children's Villages. Three generations of care leavers were analysed through a postal survey and interviews. Their narratives show typical pathways after leaving care. Two-thirds became…

  7. Investigation of Inconsistent ENDF/B-VII.1 Independent and Cumulative Fission Product Yields with Proposed Revisions

    SciTech Connect

    Pigni, M.T.; Francis, M.W.; Gauld, I.C.

    2015-01-15

    A recent implementation of ENDF/B-VII.1 independent fission product yields and nuclear decay data identified inconsistencies in the data caused by the use of updated nuclear schemes in the decay sub-library that are not reflected in legacy fission product yield data. Recent changes in the decay data sub-library, particularly the delayed neutron branching fractions, result in calculated fission product concentrations that do not agree with the cumulative fission yields in the library as well as with experimental measurements. To address these issues, a comprehensive set of independent fission product yields was generated for thermal and fission spectrum neutron-induced fission for 235,238U and 239,241Pu in order to provide a preliminary assessment of the updated fission product yield data consistency. These updated independent fission product yields were utilized in the ORIGEN code to compare the calculated fission product inventories with experimentally measured inventories, with particular attention given to the noble gases. Another important outcome of this work is the development of fission product yield covariance data necessary for fission product uncertainty quantification. The evaluation methodology combines a sequential Bayesian method to guarantee consistency between independent and cumulative yields along with the physical constraints on the independent yields. This work was motivated to improve the performance of the ENDF/B-VII.1 library for stable and long-lived fission products. The revised fission product yields and the new covariance data are proposed as a revision to the fission yield data currently in ENDF/B-VII.1.

  8. Investigation of Inconsistent ENDF/B-VII.1 Independent and Cumulative Fission Product Yields with Proposed Revisions

    NASA Astrophysics Data System (ADS)

    Pigni, M. T.; Francis, M. W.; Gauld, I. C.

    2015-01-01

    A recent implementation of ENDF/B-VII.1 independent fission product yields and nuclear decay data identified inconsistencies in the data caused by the use of updated nuclear schemes in the decay sub-library that are not reflected in legacy fission product yield data. Recent changes in the decay data sub-library, particularly the delayed neutron branching fractions, result in calculated fission product concentrations that do not agree with the cumulative fission yields in the library as well as with experimental measurements. To address these issues, a comprehensive set of independent fission product yields was generated for thermal and fission spectrum neutron-induced fission for 235,238U and 239,241Pu in order to provide a preliminary assessment of the updated fission product yield data consistency. These updated independent fission product yields were utilized in the ORIGEN code to compare the calculated fission product inventories with experimentally measured inventories, with particular attention given to the noble gases. Another important outcome of this work is the development of fission product yield covariance data necessary for fission product uncertainty quantification. The evaluation methodology combines a sequential Bayesian method to guarantee consistency between independent and cumulative yields along with the physical constraints on the independent yields. This work was motivated to improve the performance of the ENDF/B-VII.1 library for stable and long-lived fission products. The revised fission product yields and the new covariance data are proposed as a revision to the fission yield data currently in ENDF/B-VII.1.

  9. Investigation of inconsistent ENDF/B-VII.1 independent and cumulative fission product yields with proposed revisions

    SciTech Connect

    Pigni, Marco T; Francis, Matthew W; Gauld, Ian C

    2015-01-01

    A recent implementation of ENDF/B-VII.1 independent fission product yields and nuclear decay data identified inconsistencies in the data caused by the use of updated nuclear schemes in the decay sub-library that are not reflected in legacy fission product yield data. Recent changes in the decay data sub-library, particularly the delayed neutron branching fractions, result in calculated fission product concentrations that are incompatible with the cumulative fission yields in the library, and also with experimental measurements. A comprehensive set of independent fission product yields was generated for thermal and fission spectrum neutron-induced fission for 235,238U and 239,241Pu in order to provide a preliminary assessment of the updated fission product yield data consistency. These updated independent fission product yields were utilized in the ORIGEN code to compare the calculated fission product inventories with experimentally measured inventories, with particular attention given to the noble gases. An important outcome of this work is the development of fission product yield covariance data necessary for fission product uncertainty quantification. The evaluation methodology combines a sequential Bayesian method to guarantee consistency between independent and cumulative yields along with the physical constraints on the independent yields. This work was motivated to improve the performance of the ENDF/B-VII.1 library for stable and long-lived cumulative yields, given the inconsistency between the ENDF/B-VII.1 fission product yield and decay data sub-libraries. The revised fission product yields and the new covariance data are proposed as a revision to the fission yield data currently in ENDF/B-VII.1.

  10. Investigation on series of length of coding and non-coding DNA sequences of bacteria using multifractal detrended cross-correlation analysis.

    PubMed

    Stan, Cristina; Cristescu, Monica Teodora; Buimaga-Iarinca, Luiza; Cristescu, C P

    2013-03-21

    In the framework of multifractal detrended cross-correlation analysis, we investigate characteristics of the series of lengths of coding and non-coding DNA sequences of some bacteria and archaea. We propose the use of a multifractal cross-correlation series that can be defined for any pair of equal-length data sequences (or time series) and that can be characterized by the full set of parameters attributed to any time series. Comparison between the characteristics of the series of lengths of coding and non-coding DNA sequences and of their associated multifractal cross-correlation series for selected groups is used for the identification of the class affiliation of certain bacteria and archaea. The analysis is carried out using the dependence of the generalized Hurst exponent on the size of fluctuations, the shape of the singularity spectra, the shape and relative disposition of the curves of the singular measures scaling exponent, and the values of the associated parameters. Empirically, we demonstrate that the series of lengths of coding and non-coding sequences, as well as the associated multifractal cross-correlation series, can be approximated as universal multifractals. PMID:23313335
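
    As a rough illustration of the machinery behind this abstract (not the authors' code; the series below are synthetic), the following sketch computes the q-th order detrended cross-correlation fluctuation function F_q(s); the generalized Hurst exponent h(q) is then the slope of log F_q(s) versus log s.

      import numpy as np

      def mfdcca_fq(x, y, scales, q_values, order=1):
          X = np.cumsum(x - np.mean(x))   # profile of x
          Y = np.cumsum(y - np.mean(y))   # profile of y
          n = len(X)
          F = np.zeros((len(q_values), len(scales)))
          for j, s in enumerate(scales):
              t = np.arange(s)
              f2 = np.empty(n // s)
              for v in range(n // s):
                  sl = slice(v * s, (v + 1) * s)
                  px = np.polyval(np.polyfit(t, X[sl], order), t)  # local trend
                  py = np.polyval(np.polyfit(t, Y[sl], order), t)
                  f2[v] = np.mean((X[sl] - px) * (Y[sl] - py))     # detrended cov.
              f2 = np.abs(f2)
              for i, q in enumerate(q_values):
                  if q == 0:   # q = 0 requires a logarithmic average
                      F[i, j] = np.exp(0.5 * np.mean(np.log(f2 + 1e-30)))
                  else:
                      F[i, j] = np.mean(f2 ** (q / 2.0)) ** (1.0 / q)
          return F

      rng = np.random.default_rng(0)
      x = rng.normal(size=4096)                  # two correlated synthetic series
      y = 0.7 * x + 0.3 * rng.normal(size=4096)
      F = mfdcca_fq(x, y, [16, 32, 64, 128, 256], [-2, 0, 2])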

  11. Investigation of NOTCH4 coding region polymorphisms in sporadic inclusion body myositis.

    PubMed

    Scott, Adrian P; Laing, Nigel G; Mastaglia, Frank; Dalakas, Marinos; Needham, Merrilee; Allcock, Richard J N

    2012-09-15

    The NOTCH4 gene, located within the MHC region, is involved in cellular differentiation and has varying effects dependent on tissue type. Coding region polymorphisms haplotypic of the sIBM-associated 8.1 ancestral haplotype were identified in NOTCH4 and genotyped in two different Caucasian sIBM cohorts. In both cohorts, the frequencies of the minor allele of rs422951 and of the 12-repeat variation of rs72555375 were increased and were higher than the frequency of the sIBM-associated allele HLA-DRB1*0301. These NOTCH4 polymorphisms can be considered markers for sIBM susceptibility, but require further investigation to determine whether they are directly involved in the disease pathogenesis. PMID:22732452

  12. Culture-Dependent and -Independent Methods To Investigate the Microbial Ecology of Italian Fermented Sausages

    PubMed Central

    Rantsiou, Kalliopi; Urso, Rosalinda; Iacumin, Lucilla; Cantoni, Carlo; Cattaneo, Patrizia; Comi, Giuseppe; Cocolin, Luca

    2005-01-01

    In this study, the microbial ecology of three naturally fermented sausages produced in northeast Italy was studied by culture-dependent and -independent methods. Plating analysis pointed out the predominance of lactic acid bacteria populations, as well as the importance of coagulase-negative cocci. In one of the fermentations, fecal enterococci also reached significant counts, highlighting their contribution to that particular transformation process. Yeast counts were higher than the detection limit (>100 CFU/g) in only one fermented sausage. Analysis of the denaturing gradient gel electrophoresis (DGGE) patterns and sequencing of the bands allowed profiling of the microbial populations present in the sausages during fermentation. The bacterial ecology was mainly characterized by the stable presence of Lactobacillus curvatus and Lactobacillus sakei, but Lactobacillus paracasei was also repeatedly detected. An important piece of evidence was the presence of Lactococcus garvieae, which clearly contributed to two of the fermentations. Several species of Staphylococcus were also detected. Regarding other bacterial groups, Bacillus sp., Ruminococcus sp., and Macrococcus caseolyticus were also identified at the beginning of the transformations. In addition, yeast species belonging to Debaryomyces hansenii, several Candida species, and Willopsis saturnus were observed in the DGGE gels. Finally, cluster analysis of the bacterial and yeast DGGE profiles highlighted the uniqueness of the fermentation processes studied. PMID:15812029

  13. Nye County Nuclear Waste Repository Project Office independent scientific investigations program annual report, May 1997--April 1998

    SciTech Connect

    1998-07-01

    This annual summary report, prepared by the Nye County Nuclear Waste Repository Project Office (NWRPO), summarizes the activities that were performed during the period from May 1, 1997 to April 30, 1998. These activities were conducted in support of the Independent Scientific Investigation Program (ISIP) of Nye County at the Yucca Mountain Site (YMS). The Nye County NWRPO is responsible for protecting the health and safety of the Nye County residents. NWRPO's on-site representative is responsible for designing and implementing the Independent Scientific Investigation Program (ISIP). Major objectives of the ISIP include: investigating key issues related to conceptual design and performance of the repository that can have major impact on human health, safety, and the environment; and identifying areas not being addressed adequately by the Department of Energy (DOE). Nye County has identified several key scientific issues of concern that may affect repository design and performance which were not being adequately addressed by DOE. Nye County has been conducting its own independent study to evaluate the significance of these issues. This report summarizes the results of monitoring from two boreholes and the Exploratory Studies Facility (ESF) tunnel that have been instrumented by Nye County since March and April of 1995. The preliminary data and interpretations presented in this report do not constitute and should not be considered the official position of Nye County. The ISIP presently includes borehole and tunnel instrumentation, monitoring, data analysis, and numerical modeling activities to address the concerns of Nye County.

  14. Nye County nuclear waste repository project office independent scientific investigations program. Summary annual report, May 1996--April 1997

    SciTech Connect

    1997-05-01

    This annual summary report, prepared by Multimedia Environmental Technology, Inc. (MET) on behalf of the Nye County Nuclear Waste Project Office, summarizes the activities that were performed during the period from May 1, 1996 to April 30, 1997. These activities were conducted in support of the Independent Scientific Investigation Program (ISIP) of Nye County at the Yucca Mountain Site (YMS). The Nye County NWRPO is responsible for protecting the health and safety of the Nye County residents. NWRPO's on-site representative is responsible for designing and implementing the Independent Scientific Investigation Program (ISIP). Major objectives of the ISIP include: (1) investigating key issues related to conceptual design and performance of the repository that can have major impact on human health, safety, and the environment; and (2) identifying areas not being addressed adequately by DOE. Nye County has identified several key scientific issues of concern that may affect repository design and performance which were not being adequately addressed by DOE. Nye County has been conducting its own independent study to evaluate the significance of these issues.

  15. A Monte Carlo Investigation of the Analysis of Variance Applied to Non-Independent Bernoulli Variates.

    ERIC Educational Resources Information Center

    Draper, John F., Jr.

    The applicability of analysis of variance (ANOVA) procedures to the analysis of dichotomous repeated-measures data is described. The design models for which data were simulated in this investigation were chosen to represent simple cases of two experimental situations: situation one, in which subjects' responses to a single randomly selected set…

  16. Investigating the use of quick response codes in the gross anatomy laboratory.

    PubMed

    Traser, Courtney J; Hoffman, Leslie A; Seifert, Mark F; Wilson, Adam B

    2015-01-01

    The use of quick response (QR) codes within undergraduate university courses is on the rise, yet literature concerning their use in medical education is scant. This study examined student perceptions on the usefulness of QR codes as learning aids in a medical gross anatomy course, statistically analyzed whether this learning aid impacted student performance, and evaluated whether performance could be explained by the frequency of QR code usage. Question prompts and QR codes tagged on cadaveric specimens and models were available for four weeks as learning aids to medical (n = 155) and doctor of physical therapy (n = 39) students. Each QR code provided answers to posed questions in the form of embedded text or hyperlinked web pages. Students' perceptions were gathered using a formative questionnaire and practical examination scores were used to assess potential gains in student achievement. Overall, students responded positively to the use of QR codes in the gross anatomy laboratory, as 89% (57/64) agreed the codes augmented their learning of anatomy. The users' most notable objection to using QR codes was the reluctance to bring their smartphones into the gross anatomy laboratory. A comparison between the performance of QR code users and non-users was found to be nonsignificant (P = 0.113), and no significant gains in performance (P = 0.302) were observed after the intervention. Learners welcomed the implementation of QR code technology in the gross anatomy laboratory, yet this intervention had no apparent effect on practical examination performance. PMID:25288343
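
    For readers unfamiliar with the mechanics, producing such a tag takes only a few lines. A minimal sketch using the open-source qrcode Python package (pip install qrcode[pil]); the prompt text and filename are invented, not taken from the study:

      import qrcode

      # Encode the answer to a posed question directly as embedded text.
      answer = "Median nerve: innervates most of the anterior forearm flexors."
      img = qrcode.make(answer)
      img.save("median_nerve_qr.png")   # print and tag on the specimen or model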

  17. A model-independent investigation on quasi-degenerate neutrino mass models and their significance

    NASA Astrophysics Data System (ADS)

    Roy, Subhankar; Singh, N. Nimai

    2013-12-01

    The prediction of a possible hierarchy of neutrino masses depends mostly on the model chosen. Dissociating the μ-τ interchange symmetry from discrete flavor symmetry based models makes the neutrino mass matrix less predictive and motivates one to seek the answer from different phenomenological frameworks. This calls for proper parametrization of the neutrino mass matrices for the individual hierarchies. In this work, an attempt has been made to study six different cases of quasi-degenerate (QDN) neutrino models with mass matrices m_LL^ν parametrized with two free parameters (α, η), the standard Wolfenstein parameter (λ), and an input mass scale m0 ~ 0.08 eV. We start with a μ-τ symmetric neutrino mass matrix followed by a correction from the charged lepton sector. The parametrization emphasizes the existence of four independent texture zero building blocks common to all the QDN models under the μ-τ symmetric framework and is found to be invariant under any choice of solar angle. In our parametrization, the solar angle is controlled from the neutrino sector, whereas the charged lepton sector drives the reactor and atmospheric mixing angles. The individual models are tested in the framework of oscillation experiments, cosmological observation, and future experiments involving β-decay and 0νββ experiments, and no reason is found to discard the QDN mass models with relatively lower mass. Although the QDNH-Type IA model shows strong preference for sin²θ12 = 0.32, this is not sufficient to rule out the other models. The present work leaves scope to extend the search for the most favorable QDN mass model from the observed baryon asymmetry of the Universe.
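
    For reference, the textbook form of a μ-τ symmetric Majorana mass matrix that such analyses start from (generic entries; this is not the paper's specific (α, η, λ) parametrization) is

      \[
        M_\nu =
        \begin{pmatrix}
          A & B & B \\
          B & C & D \\
          B & D & C
        \end{pmatrix},
        \qquad \theta_{13} = 0, \quad \theta_{23} = 45^\circ,
      \]

    which is why a correction from the charged lepton sector is needed to generate the observed nonzero reactor angle.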

  18. Further Investigation of Acoustic Propagation Codes for Three-Dimensional Geometries

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Watson, Willie R.; Jones, Michael G.

    2006-01-01

    The ability to predict fan noise within complex three-dimensional aircraft engine nacelle geometries is a valuable tool in designing and assessing low-noise concepts. This work begins a systematic study to identify the areas of the design space in which propagation codes of varying fidelity may be used effectively to provide efficient design and assessment. An efficient lower-fidelity code is used in conjunction with two higher-fidelity, more computationally intensive methods to solve benchmark problems of increasing complexity. The codes represent a small sampling of the current propagation codes available or under development. Results of this initial study indicate that the lower-fidelity code provides satisfactory results for cases involving low to moderate attenuation rates, whereas the two higher-fidelity codes perform well across the range of problems.

  19. Investigation of low temperature solid oxide fuel cells for air-independent UUV applications

    NASA Astrophysics Data System (ADS)

    Moton, Jennie Mariko

    Unmanned underwater vehicles (UUVs) will benefit greatly from high energy density (> 500 Wh/L) power systems utilizing high-energy-density fuels and air-independent oxidizers. Current battery-based systems have limited energy densities (< 400 Wh/L), which motivates development of alternative power systems such as solid oxide fuel cells (SOFCs). SOFC-based power systems have the potential to achieve the required UUV energy densities, and the current study explores how SOFCs based on gadolinia-doped ceria (GDC) electrolytes with operating temperatures of 650°C and lower may operate in the unique environments of a promising UUV power plant. The plant would contain an H2O2 decomposition reactor to supply humidified O2 to the SOFC cathode and an exothermic aluminum/H2O combustor to provide heated humidified H2 fuel to the anode. To characterize low-temperature SOFC performance with these unique O2 and H2 sources, SOFC button cells based on nickel/GDC (Gd0.1Ce0.9O1.95) anodes, GDC electrolytes, and lanthanum strontium cobalt ferrite (La0.6Sr0.4Co0.2Fe0.8O3-δ, or LSCF)/GDC cathodes were fabricated and tested for performance and stability with humidity on both the anode and the cathode. Cells were also tested with various reactant concentrations of H2 and O2 to simulate gas depletion down the channel of an SOFC stack. Results showed that anode performance depended primarily on fuel concentration and less on the associated increase in product H2O concentration. O2 depletion with humidified cathode flows also caused significant loss in cell current density at a given voltage. With humidified flows in either the anode or cathode, stability tests of the button cells at 650°C showed that stable voltage is maintained at low operating current (0.17 A/cm2) at up to 50% H2O by mole, but at higher current densities (0.34 A/cm2), irreversible voltage degradation occurred at rates of 0.8-3.7 mV/hour depending on exposure time. From these button cell results, average current densities over the length of a low-temperature SOFC stack were estimated and used to size a UUV power system based on Al/H2O oxidation for fuel and H2O2 decomposition for O2. The resulting system design suggests that energy densities above 300 Wh/L may be achieved at neutral buoyancy in seawater if the cell is operated at high reactant utilization in the SOFC stack for missions longer than 20 hours.

  1. Investigating the Use of Quick Response Codes in the Gross Anatomy Laboratory

    ERIC Educational Resources Information Center

    Traser, Courtney J.; Hoffman, Leslie A.; Seifert, Mark F.; Wilson, Adam B.

    2015-01-01

    The use of quick response (QR) codes within undergraduate university courses is on the rise, yet literature concerning their use in medical education is scant. This study examined student perceptions on the usefulness of QR codes as learning aids in a medical gross anatomy course, statistically analyzed whether this learning aid impacted student…

  2. Retrospective investigation of gingival invaginations : Part I: Clinical findings and presentation of a coding system.

    PubMed

    Reichert, Christoph; Gölz, Lina; Dirk, Cornelius; Jäger, Andreas

    2012-08-01

    Many orthodontic treatments involve tooth extraction. Gingival invagination is a common side effect after orthodontic extraction space closure, leading to compromised oral hygiene and hampered space closure. Even the long-term stability of the orthodontic treatment result may be jeopardized. The aim of this study was to identify risk factors for the development of gingival invagination and possible implications on oral health and orthodontic treatment results. A total of 30 patients presenting 101 tooth extractions and subsequent orthodontic space closure were investigated to detect the presence of gingival invagination. The time required until active space closure, the thoroughness of space closure, and probing depths mesial and distal to the extraction site, in addition to age, gender and the Periodontal Screening Index, were investigated. A new coding system to describe the extent of gingival invagination is introduced for the first time here. Gingival invagination developed more frequently in the lower jaw (50%) than the upper (30%). Complete penetration occurred in the upper jaw in 6% of the patients and in the lower jaw in 25%. All patients without gingival invagination revealed complete space closure, whereas only 70% in the group with gingival invagination did so. The time until initiation of space closure took significantly longer in patients with gingival invagination (7.5 ± 1.4 months) than in patients without (3.3 ± 0.8 months). Probing depths of the adjacent teeth were significantly greater in regions with invaginations. Thus, the time required until space closure was initiated and the extraction site are important risk factors for the development of gingival invagination. The consequences of gingival invagination are unstable space closure and deeper probing depths mesial and distal to the extractions. However, no statements concerning the mid- to long-term effects on oral health can be made. PMID:22777163

  3. Agreement in participant-coded and investigator-coded food-record analysis in overweight research participants: an examination of interpretation bias.

    PubMed

    Bjorge-Schohl, Brooke; Johnston, Carol S; Trier, Catherine M; Fleming, Katie R

    2014-05-01

    Validation studies support the use of self-administered computerized methods for reporting energy intake; however, the degree of interpretation bias with these methods is unknown. This research compared nutrient intake for food records that were both participant-coded (using the National Cancer Institute's Automated Self-Administered 24-hour recall [ASA24] online program) and investigator-coded (a single investigator coded all food records using the ESHA Food Processor diet analysis program). Participants (n=28; mean age=41±11 years; mean body mass index=31±6) were enrolled in an 8-week trial (conducted between March 2011 and June 2011 in Phoenix, AZ) investigating the impact of meal preloads on satiety. Food records were collected on four occasions during the trial and, of the food records available for this investigation (n=161), 88% were completed on a weekday. Intra-class correlation coefficients were computed for selected nutrients and ranged from 0.65 to 0.81 for the macronutrients and from 0.50 to 0.66 for the micronutrients (overall mean=0.67). The overall mean coefficient improved to 0.77 when the data from three or more food records per participant were averaged, as is commonly done in nutrition research. All intra-class correlation coefficients were significant (P<0.020) and were not impacted by the day of week on which food was recorded. For energy, macronutrients, and minerals, the percent median differences between coders were <±17%; however, percent median differences were large for vitamin C (+27%) and beta carotene (+294%). Findings from this study suggest that self-administered dietary assessment has merit as a research tool. Pretrial training for research participants is suggested to reduce interpretation bias. PMID:24210517
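
    The agreement statistic used here can be reproduced in a few lines. A minimal sketch, assuming the common two-way random-effects, absolute-agreement, single-measurement form ICC(2,1) (the abstract does not state which ICC variant was used), with invented intake values:

      import numpy as np

      def icc_2_1(X):
          """Shrout-Fleiss ICC(2,1) for an (n_subjects, k_raters) matrix."""
          n, k = X.shape
          grand = X.mean()
          rows, cols = X.mean(axis=1), X.mean(axis=0)
          msr = k * np.sum((rows - grand) ** 2) / (n - 1)    # subjects MS
          msc = n * np.sum((cols - grand) ** 2) / (k - 1)    # raters MS
          sse = np.sum((X - rows[:, None] - cols[None, :] + grand) ** 2)
          mse = sse / ((n - 1) * (k - 1))                    # residual MS
          return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

      # Hypothetical energy intakes (kcal): participant- vs investigator-coded.
      kcal = np.array([[1850, 1900], [2200, 2150], [1600, 1750],
                       [2500, 2380], [1950, 2000]], dtype=float)
      print(f"ICC(2,1) = {icc_2_1(kcal):.2f}")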

  4. Training camp: The quest to become a new National Institutes of Health (NIH)-funded independent investigator

    NASA Astrophysics Data System (ADS)

    Sklare, Daniel A.

    2003-04-01

    This presentation will provide information on the research training and career development programs of the National Institute on Deafness and Other Communication Disorders (NIDCD). The predoctoral and postdoctoral fellowship (F30, F31, F32) programs and the research career development awards for clinically trained individuals (K08/K23) and for individuals trained in the quantitative sciences and in engineering (K25) will be highlighted. In addition, the role of the NIDCD Small Grant (R03) in transitioning postdoctoral-level investigators to research independence will be underscored.

  5. Investigating the structure preserving encryption of high efficiency video coding (HEVC)

    NASA Astrophysics Data System (ADS)

    Shahid, Zafar; Puech, William

    2013-02-01

    This paper presents a novel method for the real-time protection of the new, emerging High Efficiency Video Coding (HEVC) standard. Structure-preserving selective encryption is performed in the CABAC entropy coding module of HEVC, which is significantly different from the CABAC entropy coding of H.264/AVC. In the CABAC of HEVC, exponential Golomb coding is replaced by truncated Rice (TR) coding up to a specific value for binarization of transform coefficients. Selective encryption is performed using the AES cipher in cipher feedback mode on a plaintext of binstrings in a context-aware manner. The encrypted bitstream has exactly the same bit-rate and is format compliant. Experimental evaluation and security analysis of the proposed algorithm are performed on several benchmark video sequences containing different combinations of motion, texture and objects.
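
    The cryptographic core of such a scheme is standard. A minimal sketch with PyCryptodome; the byte values are placeholders, and a real implementation would operate on the encryptable binstrings inside the CABAC engine rather than on a flat byte string:

      from Crypto.Cipher import AES
      from Crypto.Random import get_random_bytes

      key, iv = get_random_bytes(16), get_random_bytes(16)

      encryptable_bins = b"\x8f\x11\xa3\x42\x07"  # stand-in for encryptable bins
      header = b"\x00\x01"                        # left in clear for compliance

      ct = AES.new(key, AES.MODE_CFB, iv=iv).encrypt(encryptable_bins)
      assert len(ct) == len(encryptable_bins)     # CFB preserves length,
      bitstream = header + ct                     # so the bit-rate is unchanged

      # An authorized decoder restores the original bins before parsing.
      pt = AES.new(key, AES.MODE_CFB, iv=iv).decrypt(ct)
      assert pt == encryptable_bins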

  6. Computer models to support investigations of surface subsidence and associated ground motion induced by underground coal gasification. [STEALTH Codes

    SciTech Connect

    Langland, R.T.; Trent, B.C.

    1981-01-01

    Two computer codes compare surface subsidence induced by underground coal gasification at Hoe Creek, Wyoming, and Centralia, Washington. Calculations with the STEALTH explicit finite-difference code are shown to match equivalent, implicit finite-element method solutions for the removal of underground material. Effects of removing roof material, varying elastic constants, investigating thermal shrinkage, and burning multiple coal seams are studied. A coupled, finite-difference continuum rigid-block caving code is used to model underground opening behavior. Numerical techniques agree qualitatively with empirical studies but, so far, underpredict ground surface displacement. The two methods, numerical and empirical, are most effective when used together. It is recommended that the thermal characteristics of coal measure rock be investigated and that additional calculations be carried out to longer times so that cooling influences can be modeled.

  7. Dimensionality of ICA in resting-state fMRI investigated by feature optimized classification of independent components with SVM

    PubMed Central

    Wang, Yanlu; Li, Tie-Qiang

    2015-01-01

    Different machine learning algorithms have recently been used for assisting automated classification of independent component analysis (ICA) results from resting-state fMRI data. The success of this approach relies on identification of artifact components and meaningful functional networks. A limiting factor of ICA is the uncertainty of the number of independent components (NIC). We aim to develop a framework based on support vector machines (SVM) and optimized feature-selection for automated classification of independent components (ICs) and use the framework to investigate the effects of input NIC on the ICA results. Seven different resting-state fMRI datasets were studied. 18 features were devised by mimicking the empirical criteria for manual evaluation. The five most significant (p < 0.01) features were identified by general linear modeling and used to generate a classification model for the framework. This feature-optimized classification of ICs with SVM (FOCIS) framework was used to classify both group and single subject ICA results. The classification results obtained using FOCIS and the previously published FSL-FIX were compared against manually evaluated results. On average, the false negative rates in identifying artifact-contaminated ICs for FOCIS and FSL-FIX were 98.27% and 92.34%, respectively. The number of artifact and functional network components increased almost linearly with the input NIC. Through tracking, we demonstrate that incrementing NIC affects most ICs when NIC < 33, whereas only a few limited ICs are affected by direct splitting when NIC is incremented beyond NIC > 40. For a given IC, its changes with increasing NIC are individually specific, irrespective of whether the component is a potential resting-state functional network or an artifact component. Using FOCIS, we investigated experimentally the ICA dimensionality of resting-state fMRI datasets and found that the input NIC can critically affect the ICA results of resting-state fMRI data. PMID:26005413

  8. Dimensionality of ICA in resting-state fMRI investigated by feature optimized classification of independent components with SVM.

    PubMed

    Wang, Yanlu; Li, Tie-Qiang

    2015-01-01

    Different machine learning algorithms have recently been used for assisting automated classification of independent component analysis (ICA) results from resting-state fMRI data. The success of this approach relies on identification of artifact components and meaningful functional networks. A limiting factor of ICA is the uncertainty of the number of independent components (NIC). We aim to develop a framework based on support vector machines (SVM) and optimized feature-selection for automated classification of independent components (ICs) and use the framework to investigate the effects of input NIC on the ICA results. Seven different resting-state fMRI datasets were studied. 18 features were devised by mimicking the empirical criteria for manual evaluation. The five most significant (p < 0.01) features were identified by general linear modeling and used to generate a classification model for the framework. This feature-optimized classification of ICs with SVM (FOCIS) framework was used to classify both group and single subject ICA results. The classification results obtained using FOCIS and the previously published FSL-FIX were compared against manually evaluated results. On average, the false negative rates in identifying artifact-contaminated ICs for FOCIS and FSL-FIX were 98.27% and 92.34%, respectively. The number of artifact and functional network components increased almost linearly with the input NIC. Through tracking, we demonstrate that incrementing NIC affects most ICs when NIC < 33, whereas only a few limited ICs are affected by direct splitting when NIC is incremented beyond NIC > 40. For a given IC, its changes with increasing NIC are individually specific, irrespective of whether the component is a potential resting-state functional network or an artifact component. Using FOCIS, we investigated experimentally the ICA dimensionality of resting-state fMRI datasets and found that the input NIC can critically affect the ICA results of resting-state fMRI data. PMID:26005413
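
    A minimal sketch of the classification stage described above (synthetic features and labels; FOCIS itself uses 18 hand-designed features and keeps the five most significant, per the abstract):

      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.feature_selection import SelectKBest, f_classif
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      X = rng.normal(size=(200, 18))     # 18 features per IC, as in the paper
      y = rng.integers(0, 2, size=200)   # 1 = artifact, 0 = network (synthetic)

      clf = make_pipeline(StandardScaler(),
                          SelectKBest(f_classif, k=5),  # keep 5 best features
                          SVC(kernel="rbf", C=1.0))
      print(cross_val_score(clf, X, y, cv=5).mean())    # ~0.5 on random labels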

  9. Luminal long non-coding RNAs regulated by estrogen receptor alpha in a ligand-independent manner show functional roles in breast cancer.

    PubMed

    Miano, Valentina; Ferrero, Giulio; Reineri, Stefania; Caizzi, Livia; Annaratone, Laura; Ricci, Laura; Cutrupi, Santina; Castellano, Isabella; Cordero, Francesca; De Bortoli, Michele

    2016-01-19

    Estrogen Receptor alpha (ERα) activation by estrogenic hormones induces luminal breast cancer cell proliferation. However, ERα also plays important hormone-independent functions in maintaining the epithelial phenotype of breast tumor cells. We previously reported by RNA-Seq that in MCF-7 cells, in the absence of hormones, ERα down-regulation changes the expression of several genes linked to cellular development, representing a specific subset of estrogen-induced genes. Here, we report the regulation of long non-coding RNAs from the same experimental settings. A list of 133 Apo-ERα-Regulated lncRNAs (AER-lncRNAs) was identified and extensively characterized using published data from cancer cell lines and tumor tissues, or experiments on MCF-7 cells. For several features, we ran validation using cell cultures or fresh tumor biopsies. AER-lncRNAs represent a specific subset, only marginally overlapping estrogen-induced transcripts, whose expression is largely restricted to luminal cells and which is able to perfectly classify breast tumor subtypes. The most abundant AER-lncRNA, DSCAM-AS1, is expressed in ERα+ breast carcinoma, but not in pre-neoplastic lesions, and correlates inversely with EMT markers. Down-regulation of DSCAM-AS1 recapitulated, in part, the effect of silencing ERα, i.e. growth arrest and induction of EMT markers. In conclusion, we report an ERα-dependent lncRNA set representing a novel luminal signature in breast cancer cells. PMID:26621851

  10. Luminal long non-coding RNAs regulated by estrogen receptor alpha in a ligand-independent manner show functional roles in breast cancer

    PubMed Central

    Miano, Valentina; Ferrero, Giulio; Reineri, Stefania; Caizzi, Livia; Annaratone, Laura; Ricci, Laura; Cutrupi, Santina; Castellano, Isabella; Cordero, Francesca; De Bortoli, Michele

    2016-01-01

    Estrogen Receptor alpha (ERα) activation by estrogenic hormones induces luminal breast cancer cell proliferation. However, ERα also plays important hormone-independent functions in maintaining the epithelial phenotype of breast tumor cells. We previously reported by RNA-Seq that in MCF-7 cells, in the absence of hormones, ERα down-regulation changes the expression of several genes linked to cellular development, representing a specific subset of estrogen-induced genes. Here, we report the regulation of long non-coding RNAs from the same experimental settings. A list of 133 Apo-ERα-Regulated lncRNAs (AER-lncRNAs) was identified and extensively characterized using published data from cancer cell lines and tumor tissues, or experiments on MCF-7 cells. For several features, we ran validation using cell cultures or fresh tumor biopsies. AER-lncRNAs represent a specific subset, only marginally overlapping estrogen-induced transcripts, whose expression is largely restricted to luminal cells and which is able to perfectly classify breast tumor subtypes. The most abundant AER-lncRNA, DSCAM-AS1, is expressed in ERα+ breast carcinoma, but not in pre-neoplastic lesions, and correlates inversely with EMT markers. Down-regulation of DSCAM-AS1 recapitulated, in part, the effect of silencing ERα, i.e. growth arrest and induction of EMT markers. In conclusion, we report an ERα-dependent lncRNA set representing a novel luminal signature in breast cancer cells. PMID:26621851

  11. Investigation of Different Constituent Encoders in a Turbo-code Scheme for Reduced Decoder Complexity

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.

    1998-01-01

    A large number of papers have been published attempting to give some analytical basis for the performance of Turbo-codes. It has been shown that performance improves with increased interleaver length. Also procedures have been given to pick the best constituent recursive systematic convolutional codes (RSCC's). However testing by computer simulation is still required to verify these results. This thesis begins by describing the encoding and decoding schemes used. Next simulation results on several memory-4 RSCC's are shown. It is found that the best BER performance at low Eb/No is not given by the RSCC's that were found using the analytic techniques given so far. Next the results are given from simulations using a smaller memory RSCC for one of the constituent encoders. Significant reduction in decoding complexity is obtained with minimal loss in performance. Simulation results are then given for a rate 1/3 Turbo-code with the result that this code performed as well as a rate 1/2 Turbo-code as measured by the distance from their respective Shannon limits. Finally the results of simulations where an inaccurate noise variance measurement was used are given. From this it was observed that Turbo-decoding is fairly stable with regard to noise variance measurement.
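
    A minimal sketch of the building block under study, a recursive systematic convolutional encoder; shown here with the common memory-2 (1, 5/7 octal) generators rather than the memory-4 codes simulated in the thesis:

      def rsc_encode(bits):
          # Feedback polynomial 7 (1 + D + D^2), output polynomial 5 (1 + D^2).
          s1 = s2 = 0                  # shift register contents
          systematic, parity = [], []
          for u in bits:
              fb = u ^ s1 ^ s2         # recursive feedback bit
              systematic.append(u)     # systematic stream = input bits
              parity.append(fb ^ s2)   # parity stream
              s1, s2 = fb, s1          # clock the register
          return systematic, parity

      sys_bits, par_bits = rsc_encode([1, 0, 1, 1, 0, 0, 1])
      # A Turbo encoder runs a second RSC on an interleaved copy of the input
      # and transmits sys_bits together with both parity streams.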

  12. Simulative Investigation on Spectral Efficiency of Unipolar Codes based OCDMA System using Importance Sampling Technique

    NASA Astrophysics Data System (ADS)

    Farhat, A.; Menif, M.; Rezig, H.

    2013-09-01

    This paper analyses the spectral efficiency of an Optical Code Division Multiple Access (OCDMA) system using the Importance Sampling (IS) technique. We consider three configurations of the OCDMA system, namely Direct Sequence (DS), Spectral Amplitude Coding (SAC) and Fast Frequency Hopping (FFH), that exploit Fiber Bragg Grating (FBG) based encoders/decoders. We evaluate the spectral efficiency of the considered system by taking into consideration the effect of different families of unipolar codes for both coherent and incoherent sources. The results show that the spectral efficiency of the OCDMA system with a coherent source is higher than in the incoherent case. We also demonstrate that DS-OCDMA outperforms both others in terms of spectral efficiency in all conditions.

  13. Role asymmetry and code transmission in signaling games: an experimental and computational investigation.

    PubMed

    Moreno, Maggie; Baggio, Giosuè

    2015-07-01

    In signaling games, a sender has private access to a state of affairs and uses a signal to inform a receiver about that state. If no common association of signals and states is initially available, sender and receiver must coordinate to develop one. How do players divide coordination labor? We show experimentally that, if players switch roles at each communication round, coordination labor is shared. However, in games with fixed roles, coordination labor is divided: Receivers adjust their mappings more frequently, whereas senders maintain the initial code, which is transmitted to receivers and becomes the common code. In a series of computer simulations, player and role asymmetry as observed experimentally were accounted for by a model in which the receiver in the first signaling round has a higher chance of adjusting its code than its partner. From this basic division of labor among players, certain properties of role asymmetry, in particular correlations with game complexity, are seen to follow. PMID:25352016
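
    A minimal simulation sketch of the fixed-roles asymmetry described above; the adjustment probabilities are illustrative, not the paper's fitted values:

      import random

      N = 4                                                  # states/signals 0..N-1
      sender = {s: random.randrange(N) for s in range(N)}    # state -> signal
      receiver = {m: random.randrange(N) for m in range(N)}  # signal -> state
      P_ADJ_RECEIVER, P_ADJ_SENDER = 0.8, 0.1                # division of labor

      for _ in range(2000):
          state = random.randrange(N)
          signal = sender[state]
          if receiver[signal] != state:                      # coordination failure
              if random.random() < P_ADJ_RECEIVER:
                  receiver[signal] = state                   # adapt toward sender
              if random.random() < P_ADJ_SENDER:
                  sender[state] = random.randrange(N)        # sender rarely moves

      success = sum(receiver[sender[s]] == s for s in range(N)) / N
      print(f"fraction of states communicated correctly: {success:.2f}")

    Because the receiver adapts far more often, the sender's initial code tends to survive and become the common code, matching the fixed-roles result described in the abstract.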

  14. Analytical Investigation on Papr Reduction in OFDM Systems Using Golay Codes

    NASA Astrophysics Data System (ADS)

    Uppal, Sabhyata; Sharma, Sanjay; Singh, Hardeep

    2014-09-01

    Orthogonal frequency division multiplexing (OFDM) is a common technique in multi-carrier communications. One of the major issues in developing OFDM is the high peak-to-average power ratio (PAPR). Golay sequences have been introduced to construct 16-QAM and 256-QAM (quadrature amplitude modulation) codes for OFDM, reducing the peak-to-average power ratio. In this paper we consider the use of coding to reduce the PAPR of OFDM systems. By using QPSK Golay sequences, 16- and 256-QAM sequences with low PAPR are generated.
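
    A minimal sketch of why Golay sequences help: the standard concatenation construction yields a complementary pair whose members have PAPR bounded by 3 dB, which the oversampled IFFT check below confirms (BPSK case for brevity; the paper's 16- and 256-QAM constructions extend the same idea):

      import numpy as np

      a, b = np.array([1.0]), np.array([1.0])
      for _ in range(6):   # Golay's construction: length doubles to 64
          a, b = np.concatenate([a, b]), np.concatenate([a, -b])

      def papr_db(symbols, oversample=8):
          spec = np.zeros(len(symbols) * oversample, dtype=complex)
          spec[:len(symbols)] = symbols       # zero padding = oversampled IFFT
          power = np.abs(np.fft.ifft(spec)) ** 2
          return 10 * np.log10(power.max() / power.mean())

      print(f"PAPR of length-{len(a)} Golay sequence: {papr_db(a):.2f} dB")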

  15. Investigation of independence in inter-animal tumor-type occurrences within the NTP rodent-bioassay database

    SciTech Connect

    Bogen, K.T.; Seilkop, S.

    1993-05-01

    Statistically significant elevation in tumor incidence at multiple histologically distinct sites is occasionally observed among rodent bioassays of chemically induced carcinogenesis. If such data are to be relied on (as they have been, e.g., by the US EPA) for quantitative cancer potency assessment, their proper analysis requires a knowledge of the extent to which multiple tumor-type occurrences are independent or uncorrelated within individual bioassay animals. Although difficult to assess in a statistically rigorous fashion, a few significant associations among tumor-type occurrences in rodent bioassays have been reported. However, no comprehensive studies of animal-specific tumor-type occurrences at death or sacrifice have been conducted using the extensive set of available NTP rodent-bioassay data, on which most cancer-potency assessment for environmental chemicals is currently based. This report presents the results of such an analysis conducted on behalf of the National Research Council's Committee on Risk Assessment for Hazardous Air Pollutants. Tumor-type associations among individual animals were examined for approximately 2500 to 3000 control and approximately 200 to 600 treated animals using pathology data from 62 B6C3F1 mouse studies and 61 F/344N rat studies obtained from a readily available subset of the NTP carcinogenesis bioassay database. No evidence was found for any large correlation in either the onset probability or the prevalence-at-death or sacrifice of any tumor-type pair investigated in control and treated rats and mice, although a few of the small correlations present were statistically significant. Tumor-type occurrences were in most cases nearly independent, and departures from independence, where they did occur, were small. This finding is qualified in that tumor-type onset correlations were measured only indirectly, given the limited nature of the data analyzed.
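
    A minimal sketch of one way such pairwise independence can be screened: a Fisher exact test on the 2x2 cross-tabulation of animals by the presence of two tumor types. The counts are invented; the report's actual treatment of onset versus prevalence correlations is more involved.

      import numpy as np
      from scipy.stats import fisher_exact

      # rows: tumor type A present/absent; columns: tumor type B present/absent
      table = np.array([[12,  88],
                        [95, 805]])
      odds_ratio, p_value = fisher_exact(table)
      print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")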

  16. Investigating the Semantic Interoperability of Laboratory Data Exchanged Using LOINC Codes in Three Large Institutions

    PubMed Central

    Lin, Ming–Chin; Vreeman, Daniel J.; Huff, Stanley M.

    2011-01-01

    LOINC codes are seeing increased use in many organizations. In this study, we examined the barriers to semantic interoperability that still exist in electronic data exchange of laboratory results even when LOINC codes are being used as the observation identifiers. We analyzed semantic interoperability of laboratory data exchanged using LOINC codes in three large institutions. To simplify the analytic process, we divided the laboratory data into quantitative and non-quantitative tests. The analysis revealed many inconsistencies even when LOINC codes are used to exchange laboratory data. For quantitative tests, the most frequent problems were inconsistencies in the use of units of measure: variations in the strings used to represent units (unrecognized synonyms), use of units that result in different magnitudes of the numeric quantity, and missing units of measure. For non-quantitative tests, the most frequent problems were acronyms/synonyms, different classes of elements in enumerated lists, and the use of free text. Our findings highlight the limitations of interoperability in current laboratory reporting. PMID:22195138
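
    A minimal sketch of the kind of unit normalization these findings call for, using an invented synonym table and conversion map (a production system would use the full UCUM specification):

      UNIT_SYNONYMS = {"mg/dl": "mg/dL", "MG/DL": "mg/dL", "g/l": "g/L"}
      TO_TARGET = {("g/L", "mg/dL"): 100.0}   # 1 g/L = 100 mg/dL

      def normalize(value: float, unit: str, target: str = "mg/dL"):
          unit = UNIT_SYNONYMS.get(unit, unit)       # resolve synonyms first
          if unit == target:
              return value, target
          factor = TO_TARGET.get((unit, target))
          if factor is None:                         # flag rather than guess
              raise ValueError(f"no conversion from {unit!r} to {target!r}")
          return value * factor, target

      print(normalize(0.95, "g/l"))   # -> (95.0, 'mg/dL')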

  17. THE CODE OF THE STREET AND INMATE VIOLENCE: INVESTIGATING THE SALIENCE OF IMPORTED BELIEF SYSTEMS*

    PubMed Central

    MEARS, DANIEL P.; STEWART, ERIC A.; SIENNICK, SONJA E.; SIMONS, RONALD L.

    2013-01-01

    Scholars have long argued that inmate behaviors stem in part from cultural belief systems that they “import” with them into incarcerative settings. Even so, few empirical assessments have tested this argument directly. Drawing on theoretical accounts of one such set of beliefs—the code of the street—and on importation theory, we hypothesize that individuals who adhere more strongly to the street code will be more likely, once incarcerated, to engage in violent behavior and that this effect will be amplified by such incarceration experiences as disciplinary sanctions and gang involvement, as well as the lack of educational programming, religious programming, and family support. We test these hypotheses using unique data that include measures of the street code belief system and incarceration experiences. The results support the argument that the code of the street belief system affects inmate violence and that the effect is more pronounced among inmates who lack family support, experience disciplinary sanctions, and are gang involved. Implications of these findings are discussed. PMID:24068837

  18. Write to Read: Investigating the Reading-Writing Relationship of Code-Level Early Literacy Skills

    ERIC Educational Resources Information Center

    Jones, Cindy D.; Reutzel, D. Ray

    2015-01-01

    The purpose of this study was to examine whether the code-related features used in current methods of writing instruction in kindergarten classrooms transfer reading outcomes for kindergarten students. We randomly assigned kindergarten students to 3 instructional groups: a writing workshop group, an interactive writing group, and a control group.…

  19. Safety Related Investigations of the VVER-1000 Reactor Type by the Coupled Code System TRACE/PARCS

    NASA Astrophysics Data System (ADS)

    Jaeger, Wadim; Espinoza, Victor Hugo Sánchez; Lischke, Wolfgang

    This study was performed at the Institute of Reactor Safety at the Forschungszentrum Karlsruhe. It is embedded in the ongoing investigations of the international code assessment and maintenance program (CAMP) for qualification and validation of system codes like TRACE(1) and PARCS(2). The reactor type chosen to validate these two codes was the Russian-designed VVER-1000, because the OECD/NEA VVER-1000 Coolant Transient Benchmark Phase 2(3) includes detailed information on the Bulgarian nuclear power plant (NPP) Kozloduy unit 6. The post-test investigations of a coolant mixing experiment have shown that the predicted parameters (coolant temperature, pressure drop, etc.) are in good agreement with the measured data. The coolant mixing pattern, especially in the downcomer, has also been reproduced quite well by TRACE. The coupled code system TRACE/PARCS, which was applied to a postulated main steam line break (MSLB), provided good results compared to reference values and those of other participants in the benchmark. The results show that the developed three-dimensional nodalization of the reactor pressure vessel (RPV) is appropriate for describing the coolant mixing phenomena in the downcomer and the lower plenum of a VVER-1000 reactor. This phenomenon is a key issue for investigations of the MSLB transient, where the thermal hydraulics and the core neutronics are strongly linked. The simulation of the RPV and core behavior for postulated transients using the validated 3D TRACE RPV model, taking into account boundary conditions at the vessel in- and outlet, indicates that the results are physically sound and in good agreement with other participants' results.

  20. Flight investigation of cockpit-displayed traffic information utilizing coded symbology in an advanced operational environment

    NASA Technical Reports Server (NTRS)

    Abbott, T. S.; Moen, G. C.; Person, L. H., Jr.; Keyser, G. L., Jr.; Yenni, K. R.; Garren, J. F., Jr.

    1980-01-01

    Traffic symbology was encoded to provide additional information concerning the traffic, which was displayed on the pilot's electronic horizontal situation indicators (EHSI). A research airplane representing an advanced operational environment was used to assess the benefit of coded traffic symbology in a realistic workload environment. Traffic scenarios, involving both conflict-free and conflict situations, were employed. Subjective pilot commentary was obtained through the use of a questionnaire and extensive pilot debriefings. These results fell conveniently into two categories: display factors and task performance. A major item under the display factor category was the problem of display clutter. The primary contributors to clutter were the use of large map-scale factors, the use of traffic data blocks, and the presentation of more than a few airplanes. In terms of task performance, the cockpit-displayed traffic information was found to provide excellent overall situation awareness. Additionally, mile separation prescribed during these tests.

  1. Independent Christian Schools and Pupil Values: An Empirical Investigation among 13-15-Year-Old Boys

    ERIC Educational Resources Information Center

    Francis, Leslie J.

    2005-01-01

    Nineteen independent Christian schools participated in the teenage religion and values survey, contributing to the overall database of nearly 34,000 Year 9 and Year 10 pupils. The present analysis demonstrates that 13-15-year-old boys educated within independent Christian schools display a distinctive values profile, in comparison with pupils…

  2. National evaluation of the benefits and risks of greater structuring and coding of the electronic health record: exploratory qualitative investigation

    PubMed Central

    Morrison, Zoe; Fernando, Bernard; Kalra, Dipak; Cresswell, Kathrin; Sheikh, Aziz

    2014-01-01

    Objective We aimed to explore stakeholder views, attitudes, needs, and expectations regarding likely benefits and risks resulting from increased structuring and coding of clinical information within electronic health records (EHRs). Materials and methods Qualitative investigation in primary and secondary care and research settings throughout the UK. Data were derived from interviews, expert discussion groups, observations, and relevant documents. Participants (n=70) included patients, healthcare professionals, health service commissioners, policy makers, managers, administrators, systems developers, researchers, and academics. Results Four main themes arose from our data: variations in documentation practice; patient care benefits; secondary uses of information; and informing and involving patients. We observed a lack of guidelines, co-ordination, and dissemination of best practice relating to the design and use of information structures. While we identified immediate benefits for direct care and secondary analysis, many healthcare professionals did not see the relevance of structured and/or coded data to clinical practice. The potential for structured information to increase patient understanding of their diagnosis and treatment contrasted with concerns regarding the appropriateness of coded information for patients. Conclusions The design and development of EHRs requires the capture of narrative information to reflect patient/clinician communication and computable data for administration and research purposes. Increased structuring and/or coding of EHRs therefore offers both benefits and risks. Documentation standards within clinical guidelines are likely to encourage comprehensive, accurate processing of data. As data structures may impact upon clinician/patient interactions, new models of documentation may be necessary if EHRs are to be read and authored by patients. PMID:24186957

  3. Utilization of a Photon Transport Code to Investigate Radiation Therapy Treatment Planning Quantities and Techniques.

    NASA Astrophysics Data System (ADS)

    Palta, Jatinder Raj

    The versatile computer program MORSE, based on neutron and photon transport theory, has been utilized to investigate radiation therapy treatment planning quantities and techniques. A multi-energy-group representation of the transport equation provides a concise approach to applying Monte Carlo numerical techniques to multiple radiation therapy treatment planning problems. A general three-dimensional geometry is used to simulate radiation therapy treatment planning problems in configurations of an actual clinical setting. Central axis total and scattered dose distributions for homogeneous and inhomogeneous water phantoms are calculated, and the correction factors for lung and bone inhomogeneities are also evaluated. Results show that Monte Carlo calculations based on multi-energy-group transport theory predict depth dose distributions that are in good agreement with available experimental data. Improved correction factors based on the concepts of lung-air-ratio and bone-air-ratio are proposed in lieu of the presently used correction factors, which are based on the tissue-air-ratio power-law method for inhomogeneity corrections. Central axis depth dose distributions for a bremsstrahlung spectrum from a linear accelerator are also calculated to exhibit the versatility of the computer program in handling multiple radiation therapy problems. A novel approach is undertaken to study the dosimetric properties of brachytherapy sources. Dose rate constants for various radionuclides are calculated from the numerically generated dose rate versus source energy curves. Dose rates can also be generated for any point brachytherapy source with any arbitrary energy spectrum at various radial distances from this family of curves.

  4. Performance investigation of the pulse and Campbelling modes of a fission chamber using a Poisson pulse train simulation code

    NASA Astrophysics Data System (ADS)

    Elter, Zs.; Jammes, C.; Pázsit, I.; Pál, L.; Filliatre, P.

    2015-02-01

    The detectors of the neutron flux monitoring system of the foreseen French GEN-IV sodium-cooled fast reactor (SFR) will be high temperature fission chambers placed in the reactor vessel in the vicinity of the core. The operation of a fission chamber over a wide range of neutron flux will be feasible provided that the overlap of the applicability of its pulse and Campbelling operational modes is ensured. This paper addresses the question of the linearity of these two modes and also presents our recent efforts to develop a specific code for the simulation of fission chamber pulse trains. The developed simulation code is described and its overall verification is shown. An extensive quantitative investigation was performed to explore the applicability limits of these two standard modes. It was found that for short pulses the overlap between the pulse and Campbelling modes can be guaranteed if the standard deviation of the background noise is not higher than 5% of the pulse amplitude. It was also shown that the Campbelling mode is sensitive to parasitic noise, while the performance of the pulse mode is affected by stochastic amplitude distributions.
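
    Campbell's theorem, the basis of the Campbelling mode, states that for a Poisson pulse train of rate λ and pulse shape f(t) the signal variance equals λ·∫f(t)²dt, so the count rate can be recovered from the fluctuations even when pulses pile up. A minimal simulation sketch; the pulse shape and rate are illustrative, not taken from the paper:

      import numpy as np

      rng = np.random.default_rng(0)
      dt, T, rate = 1e-9, 1e-3, 2e7      # 1 ns bins, 1 ms record, 2e7 counts/s
      n = int(T / dt)

      pulse = np.exp(-np.arange(200) * dt / 20e-9)  # 20 ns exponential pulse
      arrivals = rng.poisson(rate * dt, n)          # Poisson arrivals per bin
      signal = np.convolve(arrivals, pulse)[:n]     # overlapping pulse pile-up

      # Campbell's theorem (discrete form): Var(signal) = rate * sum(pulse^2) * dt
      est_rate = np.var(signal) / (np.sum(pulse ** 2) * dt)
      print(f"true rate {rate:.2e} cps, Campbelling estimate {est_rate:.2e} cps")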

  5. Investigation of Nuclear Data Libraries with TRIPOLI-4 Monte Carlo Code for Sodium-cooled Fast Reactors

    NASA Astrophysics Data System (ADS)

    Lee, Y.-K.; Brun, E.

    2014-04-01

    The sodium-cooled fast neutron reactor ASTRID is currently under design and development in France. The traditional ECCO/ERANOS fast reactor code system used for ASTRID core design calculations relies on the multi-group JEFF-3.1.1 data library. To gauge the use of the ENDF/B-VII.0 and JEFF-3.1.1 nuclear data libraries in fast reactor applications, two recent OECD/NEA computational benchmarks specified by Argonne National Laboratory were calculated. Using the continuous-energy TRIPOLI-4 Monte Carlo transport code, both the ABR-1000 MWth MOX core and the metallic (U-Pu) core were investigated. Under two different fast neutron spectra and two data libraries, ENDF/B-VII.0 and JEFF-3.1.1, reactivity impact studies were performed. Using the JEFF-3.1.1 library under the BOEC (beginning of equilibrium cycle) condition, high reactivity effects of 808 ± 17 pcm and 1208 ± 17 pcm were observed for the ABR-1000 MOX core and the metallic core, respectively. To analyze the causes of these differences in reactivity, several TRIPOLI-4 runs using the mixed data libraries feature allowed us to identify the nuclides and the nuclear data accounting for the major part of the observed reactivity discrepancies.

  6. Analysis of variable (diversity) joining recombination in DNA-dependent protein kinase (DNA-PK)-deficient mice reveals DNA-PK-independent pathways for both signal and coding joint formation.

    PubMed

    Bogue, M A; Jhappan, C; Roth, D B

    1998-12-22

    Previous studies have suggested that ionizing radiation causes irreparable DNA double-strand breaks in mice and cell lines harboring mutations in any of the three subunits of DNA-dependent protein kinase (DNA-PK) (the catalytic subunit, DNA-PKcs, or one of the DNA-binding subunits, Ku70 or Ku86). In actuality, these mutants vary in their ability to resolve double-strand breaks generated during variable (diversity) joining [V(D)J] recombination. Mutant cell lines and mice with targeted deletions in Ku70 or Ku86 are severely compromised in their ability to form coding and signal joints, the products of V(D)J recombination. It is noteworthy, however, that severe combined immunodeficient (SCID) mice, which bear a nonnull mutation in DNA-PKcs, are substantially less impaired in forming signal joints than coding joints. The current view holds that the defective protein encoded by the murine SCID allele retains enough residual function to support signal joint formation. An alternative hypothesis proposes that DNA-PKcs and Ku perform different roles in V(D)J recombination, with DNA-PKcs required only for coding joint formation. To resolve this issue, we examined V(D)J recombination in DNA-PKcs-deficient (SLIP) mice. We found that the effects of this mutation on coding and signal joint formation are identical to the effects of the SCID mutation. Signal joints are formed at levels 10-fold lower than in wild type, and one-half of these joints are aberrant. These data are incompatible with the notion that signal joint formation in SCID mice results from residual DNA-PKcs function, and suggest a third possibility: that DNA-PKcs normally plays an important but nonessential role in signal joint formation. PMID:9861008

  7. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  8. Independent assessment of TRAC-PF1 (Version 7. 0), RELAP5/MOD1 (Cycle 14), and TRAC-BD1 (Version 12. 0) codes using separate-effects experiments

    SciTech Connect

    Saha, P; Jo, J H; Neymotin, L; Rohatgi, U S; Slovik, G C; Yuelys-Miksis, C

    1985-08-01

    This report presents the results of independent code assessment conducted at BNL. The TRAC-PF1 (Version 7.0) and RELAP5/MOD1 (Cycle 14) codes were assessed using critical flow tests, a level swell test, countercurrent flow limitation (CCFL) tests, a post-CHF test, steam generator thermal performance tests, and natural circulation tests. TRAC-BD1 (Version 12.0) was applied only to the CCFL and post-CHF tests. The TRAC-PWR series of codes, i.e., TRAC-P1A, TRAC-PD2, and TRAC-PF1, has been gradually improved. However, TRAC-PF1 appears to need improvement in almost all categories of tests/phenomena attempted at BNL. Of the two codes, TRAC-PF1 and RELAP5/MOD1, the latter needs more improvement, particularly in the areas of CCFL, level swell, CHF correlation and post-CHF heat transfer, and numerical stability. For the CCFL and post-CHF tests, TRAC-BD1 provides the best overall results. However, the TRAC-BD1 interfacial shear package for the countercurrent annular flow regime needs further improvement for better prediction of the CCFL phenomenon. 47 refs., 87 figs., 15 tabs.

  9. Industry and Occupation in the Electronic Health Record: An Investigation of the National Institute for Occupational Safety and Health Industry and Occupation Computerized Coding System

    PubMed Central

    2016-01-01

    Background Inclusion of information about a patient’s work, industry, and occupation in the electronic health record (EHR) could facilitate occupational health surveillance, better health outcomes, prevention activities, and identification of workers’ compensation cases. The US National Institute for Occupational Safety and Health (NIOSH) has developed an autocoding system for “industry” and “occupation” based on 1990 Bureau of Census codes; its effectiveness requires evaluation in conjunction with promoting the mandatory addition of these variables to the EHR. Objective The objective of the study was to evaluate the intercoder reliability of NIOSH’s Industry and Occupation Computerized Coding System (NIOCCS) when applied to data collected in a community survey conducted under the Affordable Care Act, and to determine the proportion of records that are autocoded using NIOCCS. Methods Standard Occupational Classification (SOC) codes are used by several federal agencies in databases that capture demographic, employment, and health information to harmonize variables related to work activities among these data sources. A total of 359 industry and occupation responses were hand coded by 2 investigators, who came to a consensus on every code. The same variables were autocoded using NIOCCS at the high and medium confidence levels. Results Kappa was .84 both for agreement between the hand coders and for the hand-coder consensus code versus the NIOCCS high-confidence codes for the first 2 digits of the SOC code. For 4 digits, NIOCCS coding versus investigator coding ranged from kappa=.56 to .70. In this study, NIOCCS achieved production rates (ie, was able to autocode) of 31%-36% of entered variables at the “high confidence” level and 49%-58% at the “medium confidence” level. Autocoding (production) rates are somewhat lower than those reported by NIOSH. Agreement between manually coded and autocoded data is “substantial” at the 2-digit level, but only “fair” to “good” at the 4-digit level. Conclusions This work serves as a baseline for performance of NIOCCS by investigators in the field. Further field testing will clarify NIOCCS effectiveness in terms of ability to assign codes and coding accuracy and will clarify its value as inclusion of these occupational variables in the EHR is promoted. PMID:26878932
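
    For readers unfamiliar with the reliability statistic used here, Cohen's kappa compares observed agreement between two coders with the agreement expected by chance. A minimal sketch with made-up 2-digit SOC prefixes (not study data):

    ```python
    from collections import Counter

    def cohens_kappa(codes_a, codes_b):
        """Cohen's kappa for two coders' categorical assignments."""
        assert len(codes_a) == len(codes_b)
        n = len(codes_a)
        p_o = sum(a == b for a, b in zip(codes_a, codes_b)) / n     # observed agreement
        freq_a, freq_b = Counter(codes_a), Counter(codes_b)
        p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2  # chance agreement
        return (p_o - p_e) / (1 - p_e)

    # Hypothetical 2-digit SOC prefixes from a hand coder vs. an autocoder
    hand = ["29", "11", "47", "29", "53", "25"]
    auto = ["29", "11", "47", "31", "53", "25"]
    print(f"kappa = {cohens_kappa(hand, auto):.2f}")
    ```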

  10. Chromatographic separation and multicollection-ICPMS analysis of iron. Investigating mass-dependent and -independent isotope effects.

    PubMed

    Dauphas, Nicolas; Janney, Philip E; Mendybaev, Ruslan A; Wadhwa, Meenakshi; Richter, Frank M; Davis, Andrew M; van Zuilen, Mark; Hines, Rebekah; Foley, C Nicole

    2004-10-01

    A procedure was developed that allows precise determination of Fe isotopic composition. Purification of Fe was achieved by ion chromatography on AG1-X8 strongly basic anion-exchange resin. No isotopic fractionation is associated with column chemistry within 0.02 per thousand/amu at 2 sigma. The isotopic composition was measured with a Micromass IsoProbe multicollection inductively coupled plasma hexapole mass spectrometer. The Fe isotopic composition of the Orgueil CI1 carbonaceous chondrite, which best approximates the solar composition, is indistinguishable from that of IRMM-014 (-0.005 +/- 0.017 per thousand/amu). The IRMM-014 reference material is therefore used for normalization of the isotopic ratios. The protocol for analyzing mass-dependent variations is validated by measuring geostandards (IF-G, DTS-2, BCR-2, AGV-2) and heavily fractionated Fe left after vacuum evaporation of molten wüstite (FeO) and solar (MgO-Al(2)O(3)-SiO(2)-CaO-FeO in chondritic proportions) compositions. It is shown that the isotopic composition of Fe during evaporation of FeO follows a Rayleigh distillation with a fractionation factor alpha equal to (m1/m2)^(1/2), where m1 and m2 are the masses of the considered isotopes. This agrees with earlier measurements and theoretical expectations. The isotopic composition of Fe left after vacuum evaporation of solar composition also follows a Rayleigh distillation but with a fractionation factor (1.01322 +/- 0.00067 for the (56)Fe/(54)Fe ratio) that is lower than the square root of the masses (1.01835). The protocol for analyzing mass-independent variations is validated by measuring terrestrial rocks that are not expected to show departure from mass-dependent fractionation. After internal normalization of the (57)Fe/(54)Fe ratio, the isotopic composition of Fe can be measured accurately with precisions of 0.2 epsilon and 0.5 epsilon at 2 sigma for the (56)Fe/(54)Fe and (58)Fe/(54)Fe ratios, respectively (epsilon refers to relative variations in parts per 10,000). For (58)Fe, this precision is an order of magnitude better than what had been achieved before. The method is applied to rocks that could potentially exhibit mass-independent effects, meteorites and Archaean terrestrial samples. The isotopic composition of a 3.8-Ga-old banded iron formation from Isua (IF-G, Greenland), and quartz-pyroxene rocks from Akilia and Innersuartuut (GR91-26 and SM/GR/171770, Greenland) are normal within uncertainties. Similarly, the Orgueil (CI1), Allende (CV3.2), Eagle Station (ESPAL), Brenham (MGPAL), and Old Woman (IIAB) meteorites do not show any mass-independent effect. PMID:15456307
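
    The Rayleigh distillation relation quoted above is straightforward to evaluate numerically. A minimal sketch, assuming the common convention R/R0 = f^(1/alpha - 1) for the residue isotope ratio, with f the fraction of melt remaining (the numbers are illustrative, not the paper's data):

    ```python
    import math

    def rayleigh_delta_permil(f: float, alpha: float) -> float:
        """Per-mil heavy-isotope enrichment of the evaporation residue,
        using the (assumed) convention R/R0 = f**(1/alpha - 1),
        where f is the fraction of melt remaining."""
        return (f ** (1.0 / alpha - 1.0) - 1.0) * 1000.0

    alpha = math.sqrt(56.0 / 54.0)  # kinetic factor for 56Fe/54Fe, ~1.01835
    for f in (0.9, 0.5, 0.1):
        print(f"f = {f:.1f}: delta(56Fe/54Fe) = {rayleigh_delta_permil(f, alpha):+.1f} per mil")
    ```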

  11. Experimental investigation of a 10-percent-thick helicopter rotor airfoil section designed with a viscous transonic analysis code

    NASA Technical Reports Server (NTRS)

    Noonan, K. W.

    1981-01-01

    An investigation was conducted in the Langley 6- by 28-Inch Transonic Tunnel to determine the two-dimensional aerodynamic characteristics of a 10-percent-thick helicopter rotor airfoil at Mach numbers from 0.33 to 0.87 and respective Reynolds numbers from 4.9 x 10 to the 6th to 9.8 x 10 to the 6th. This airfoil, designated the RC-10(N)-1, was also investigated at Reynolds numbers from 3.0 x 10 to the 6th to 7.3 x 10 to the 6th at respective Mach numbers of 0.33 to 0.83 for comparison with the SC 1095 (with tab) airfoil. The RC-10(N)-1 airfoil was designed by the use of a viscous transonic analysis code. The results of the investigation indicate that the RC-10(N)-1 airfoil met all the design goals. At a Reynolds number of about 9.4 x 10 to the 6th, the drag divergence Mach number at zero normal-force coefficient was 0.815 with a corresponding pitching-moment coefficient of zero. The drag divergence Mach number at a normal-force coefficient of 0.9 and a Reynolds number of about 8.0 x 10 to the 6th was 0.61. The drag divergence Mach number of this new airfoil was higher than that of the SC 1095 airfoil at normal-force coefficients above 0.3. Measurements in the same wind tunnel at comparable Reynolds numbers indicated that the maximum normal-force coefficient of the RC-10(N)-1 airfoil was higher than that of the NACA 0012 airfoil for Mach numbers above about 0.35 and was about the same as that of the SC 1095 airfoil for Mach numbers up to 0.5.

  12. Use of mutant mouse lines to investigate origin of gonadotropin-releasing hormone-1 neurons: lineage independent of the adenohypophysis.

    PubMed

    Metz, Hillery; Wray, Susan

    2010-02-01

    Mutant mouse lines have been used to study the development of specific neuronal populations and brain structures as well as behaviors. In this report, single- and double-mutant mice were used to examine the lineage of GnRH-1 cells. GnRH is essential for vertebrate reproduction, with either GnRH-1 or GnRH-3 controlling release of gonadotropins from the anterior pituitary, depending on the species. It is clear that the neuroendocrine GnRH cells migrate from extracentral nervous system locations into the forebrain. However, the embryonic origin of GnRH-1 and GnRH-3 cells is controversial and has been suggested to be nasal placode, adenohypophyseal (anterior pituitary) placode, or neural crest, again dependent on the species examined. We found that mutant mice with either missing or disrupted anterior pituitaries (Gli2(-/-), Gli1(-/-)Gli2(-/-), and Lhx3(-/-)) exhibit a normal GnRH-1 neuronal population and that these cells are still found associated with the developing vomeronasal organ. These results indicate that in mice, GnRH-1 cells develop independent of the adenohypophyseal placode and are associated early with the formation of the nasal placode. PMID:20008041

  13. "Sample-Independent" Item Parameters? An Investigation of the Stability of IRT Item Parameters Estimated from Small Data Sets.

    ERIC Educational Resources Information Center

    Sireci, Stephen G.

    Whether item response theory (IRT) is useful to the small-scale testing practitioner is examined. The stability of IRT item parameters is evaluated with respect to the classical item parameters (i.e., p-values, biserials) obtained from the same data set. Previous research investigating the effect of sample size on IRT parameter estimation has…

  14. The power density spectra of some nB(n+1)B block coded signals

    NASA Astrophysics Data System (ADS)

    Morgenstern, G.

    1985-03-01

    Power density spectra of nB(n+1)B block coded signals were investigated. The coding laws of eight 1B2B block codes, including the Miller code, the CMI code, the Manchester code, and the DMI code, and of a 3B4B and a 5B6B block code were compiled according to a general criterion as nB(n+1)B block codes. A 1B2B block code called the AMAZI code was proposed. The power density spectra of the signals coded according to these codes were calculated for arbitrary pulse shapes and for rectangular pulses of the entire symbol length. Explicit formulas were derived and plotted. All calculations and results refer to binary input sequences where both amplitudes are equally probable and where the amplitudes of different binary elements are statistically independent.
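
    As a concrete example of a 1B2B block code, the Manchester code maps each input bit to a two-symbol block, which removes DC content and shapes the power density spectrum. A minimal encoder sketch (the bit-to-block polarity convention varies between standards and is an assumption here):

    ```python
    def manchester_encode(bits):
        """Manchester (a 1B2B block code): map each input bit to a
        two-symbol block; 0 -> (+1, -1), 1 -> (-1, +1)."""
        table = {0: (+1, -1), 1: (-1, +1)}
        out = []
        for b in bits:
            out.extend(table[b])
        return out

    print(manchester_encode([1, 0, 1, 1]))  # [-1, 1, 1, -1, -1, 1, -1, 1]
    ```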

  15. Independent Technical Investigation of the Puna Geothermal Venture Unplanned Steam Release, June 12 and 13, 1991, Puna, Hawaii

    SciTech Connect

    Thomas, Richard; Whiting, Dick; Moore, James; Milner, Duey

    1991-07-01

    On June 24, 1991, a third-party investigation team consisting of Richard P. Thomas, Duey E. Milner, James L. Moore, and Dick Whiting began an investigation into the blowout of well KS-8, which occurred at the Puna Geothermal Venture (PGV) site on June 12, 1991, and caused the unabated release of steam for a period of 31 hours before PGV succeeded in closing in the well. The scope of the investigation was to: (a) determine the cause(s) of the incident; (b) evaluate the adequacy of PGV's drilling and blowout prevention equipment and procedures; and (c) make recommendations for any appropriate changes in equipment and/or procedures. This report finds that the blowout occurred because of inadequacies in PGV's drilling plan and procedures and not as a result of unusual or unmanageable subsurface geologic or hydrologic conditions. While the geothermal resource in the area being drilled is relatively hot, the temperatures are not excessive for modern technology and methods to control. Fluid pressures encountered are also manageable if proper procedures are followed and the appropriate equipment is utilized. A previous blowout of short duration occurred on February 21, 1991, at the KS-7 injection well being drilled by PGV, at a depth of approximately 1600'. This unexpected incident alerted PGV to the possibility of encountering a high-temperature, fractured zone at a relatively shallow depth. The experience at KS-7 prompted PGV to refine its hydrological model; however, the drilling plan used for KS-8 was not changed. Not only did PGV fail to modify its drilling program following the KS-7 blowout, but it also failed to heed numerous "red flags" (warning signals) in the five days preceding the KS-8 blowout, which included a continuous 1-inch flow of drilling mud out of the wellbore, gains in mud volume while pulling stands, and gas entries while circulating mud bottoms up, in addition to lost circulation that had occurred earlier below the shoe of the 13-3/8-inch casing.

  16. Investigation of the mode competition in an He-Ne/CH4 laser with independent variation of the mode spacing and spatial shift

    SciTech Connect

    Gubin, M.A.; Kozin, G.I.; Konovalov, I.P.; Nikitin, V.V.; Petrovskii, V.N.; Protsenko, E.D.; Rurukin, A.N.

    1982-06-01

    An experimental and theoretical investigation was made of the competition between the two axial modes in a two-mode He-Ne/CH4 laser with independent variation of the mode spacing and spatial shift. The conditions were determined for obtaining the maximum contrast of the power resonances in an oscillator of this type. It was shown that the sensitivity of spectrometers utilizing such a laser can reach 10^-9 cm^-1 for approximately 1 m lengths of absorbing medium.

  17. Investigation of ion toroidal rotation induced by Lower Hybrid waves in Alcator C-Mod using integrated numerical codes

    NASA Astrophysics Data System (ADS)

    Lee, Jungpyo; Wright, John; Bonoli, Paul; Parker, Ron; Catto, Peter; Podpaly, Yuri; Rice, John; Reinke, Matt; Parra, Felix

    2010-11-01

    Ion toroidal rotation in the counter-current direction has been measured in C-Mod during lower hybrid (LH) RF power injection. Toroidal momentum input from the LH waves determines the initial increase of the counter-current ion toroidal rotation. Due to the fast build-up time of the plateau (< 1 msec), the electron distribution function is assumed to be in steady state. We calculate the toroidal momentum input of the LH waves to electrons by iterating a full-wave code (TORIC-LH) with a Fokker-Planck code (CQL3D) to obtain a self-consistent steady-state electron distribution function. On the longer time scale, comparable to the transport time (~100 msec), the ion rotation changes due to the constant momentum transfer from electrons to ions and the radial flux of ion toroidal momentum by Reynolds stress and collisional viscosity. We suggest a way to evaluate the viscosity terms for the low-flow-level rotation by a modified electrostatic gyrokinetic code.

  18. Speech coding

    SciTech Connect

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk, and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk, and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence the end-to-end performance of a digital link essentially becomes independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible, and secure services that can carry a multitude of signal types (such as voice, data, and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the coding techniques are equally applicable to any voice signal, whether or not it carries any intelligible information, as the term speech implies. Other commonly used terms are speech compression and voice compression, since the fundamental idea behind speech coding is to reduce (compress) the transmission rate (or, equivalently, the bandwidth) and/or reduce storage requirements. In this document the terms speech and voice are used interchangeably.
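
    A small worked example from the waveform-coding family described above is mu-law companding, which compresses the dynamic range of samples before uniform quantization. This is offered only as an illustration of the general idea, not as the document's method:

    ```python
    import math

    MU = 255.0  # standard companding parameter in North America/Japan

    def mu_law_compress(x: float) -> float:
        """Map a sample x in [-1, 1] to [-1, 1] with logarithmic companding."""
        return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

    def mu_law_expand(y: float) -> float:
        """Inverse of mu_law_compress."""
        return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

    x = 0.01                # a quiet sample
    y = mu_law_compress(x)  # boosted before uniform quantization
    print(round(y, 3), round(mu_law_expand(y), 5))
    ```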

  19. Inter-Sentential Patterns of Code-Switching: A Gender-Based Investigation of Male and Female EFL Teachers

    ERIC Educational Resources Information Center

    Gulzar, Malik Ajmal; Farooq, Muhammad Umar; Umer, Muhammad

    2013-01-01

    This article has sought to contribute to discussions concerning the value of inter-sentential patterns of code-switching (henceforth ISPCS) particularly in the context of EFL classrooms. Through a detailed analysis of recorded data produced in that context, distinctive features in the discourse were discerned which were associated with males' and…

  20. Are Independent Probes Truly Independent?

    ERIC Educational Resources Information Center

    Camp, Gino; Pecher, Diane; Schmidt, Henk G.; Zeelenberg, Rene

    2009-01-01

    The independent cue technique has been developed to test traditional interference theories against inhibition theories of forgetting. In the present study, the authors tested the critical criterion for the independence of independent cues: Studied cues not presented during test (and unrelated to test cues) should not contribute to the retrieval…

  1. Binary primitive alternant codes

    NASA Technical Reports Server (NTRS)

    Helgert, H. J.

    1975-01-01

    In this note we investigate the properties of two classes of binary primitive alternant codes that are generalizations of the primitive BCH codes. For these codes we establish certain equivalence and invariance relations and obtain values of d and d*, the minimum distances of the prime and dual codes.

  2. Multigenerational Independent Colony for Extraterrestrial Habitation, Autonomy, and Behavior Health (MICEHAB): An Investigation of a Long Duration, Partial Gravity, Autonomous Rodent Colony

    NASA Technical Reports Server (NTRS)

    Simon, Matthew A.; Jones, Christopher A.; Stillwagen, Frederic H.; Williams, Phillip A.; Hernandez, Joel; Lewis, Weston; Wusk, Grace; Rodgers, Erica M.; Antol, Jeffrey; Chai, Patrick R.; Klovstad, Jordan J.; Neilan, James H.; Bednara, Michael; Guendel, Alex; Lim, Jeremy; Wilson, Logan

    2015-01-01

    The path from Earth to Mars requires exploration missions to be increasingly Earth-independent as the foundation is laid for a sustained human presence in the following decades. NASA pioneering of Mars will expand the boundaries of human exploration, as a sustainable presence on the surface requires humans to successfully reproduce in a partial gravity environment independent of Earth intervention. Before significant investment is made in capabilities leading to such pioneering efforts, the challenges of multigenerational mammalian reproduction in a partial gravity environment need to be investigated. The Multigenerational Independent Colony for Extraterrestrial Habitation, Autonomy, and Behavior Health (MICEHAB) is designed to study these challenges. The proposed concept is a long-duration, autonomous habitat designed to house rodents in a partial gravity environment, with the goal of understanding the effects of partial gravity on mammalian reproduction over multiple generations and how to effectively design such a facility to operate autonomously while keeping the rodents healthy enough to achieve multiple generations. All systems are designed to feed forward directly to full-scale human missions to Mars. This paper presents the baseline design concept formulated after considering challenges in the mission and vehicle architectures, such as vehicle automation, automated crew health management/medical care, unique automated waste disposal and hygiene, handling of deceased crew members, reliable long-duration crew support systems, and radiation protection. This concept was selected from an architectural trade space considering the balance between mission science return and robotic and autonomy capabilities. The baseline design is described in detail, including transportation and facility operation constraints, artificial gravity system design, habitat design, and a full-scale mock-up demonstration of autonomous rodent care facilities. The proposed concept has the potential to integrate into existing mission architectures in order to achieve exploration objectives, and to demonstrate and mature common capabilities that enable a range of destinations and missions.

  3. Polyphasic Study of the Spatial Distribution of Microorganisms in Mexican Pozol, a Fermented Maize Dough, Demonstrates the Need for Cultivation-Independent Methods To Investigate Traditional Fermentations

    PubMed Central

    Ampe, Frédéric; ben Omar, Nabil; Moizan, Claire; Wacher, Carmen; Guyot, Jean-Pierre

    1999-01-01

    The distribution of microorganisms in pozol balls, a fermented maize dough, was investigated by a polyphasic approach in which we used both culture-dependent and culture-independent methods, including microbial enumeration, fermentation product analysis, quantification of microbial taxa with 16S rRNA-targeted oligonucleotide probes, determination of microbial fingerprints by denaturing gradient gel electrophoresis (DGGE), and 16S ribosomal DNA gene sequencing. Our results demonstrate that DGGE fingerprinting and rRNA quantification should allow workers to precisely and rapidly characterize the microbial assemblage in a spontaneous lactic acid fermented food. Lactic acid bacteria (LAB) accounted for 90 to 97% of the total active microflora; no streptococci were isolated, although members of the genus Streptococcus accounted for 25 to 50% of the microflora. Lactobacillus plantarum and Lactobacillus fermentum, together with members of the genera Leuconostoc and Weissella, were the other dominant organisms. The overall activity was greater at the periphery of a ball, where eucaryotes, enterobacteria, and bacterial exopolysaccharide producers developed. Our results also showed that the metabolism of heterofermentative LAB was influenced in situ by the distribution of the LAB in the pozol ball, whereas homolactic fermentation was controlled primarily by sugar limitation. We propose that starch is first degraded by amylases from LAB and that the resulting sugars, together with the lactate produced, allow a secondary flora to develop in the presence of oxygen. Our results strongly suggest that cultivation-independent methods should be used to study traditional fermented foods. PMID:10584005

  4. An investigation for population maintenance mechanism in a miniature garden: genetic connectivity or independence of small islet populations of the Ryukyu five-lined skink.

    PubMed

    Kurita, Kazuki; Hikida, Tsutomu; Toda, Mamoru

    2014-01-01

    The Ryukyu five-lined skink (Plestiodon marginatus) is an island lizard that is found even on tiny islets with less than half a hectare of habitat area. We hypothesized that the island populations are maintained either under frequent gene flow among the islands or independently of each other. To test these hypotheses, we investigated the genetic structure of 21 populations from 11 land-bridge islands, which were connected during the latest glacial age, and 4 isolated islands. Analyses using mitochondrial cytochrome b gene sequences (n = 67) and 10 microsatellite loci (n = 235) revealed moderate to high levels of genetic differentiation, the existence of many private alleles/haplotypes on most islands, little contemporary migration, a positive correlation between genetic variability and island area, and a negative correlation between relatedness and island area. This evidence suggests a strong effect of independent genetic drift as opposed to gene flow, favoring the isolation hypothesis even for tiny islet populations. An isolation-by-distance effect was demonstrated, and it became more prominent when the 4 isolated islands were excluded, suggesting that the pattern is a remnant of the land-bridge age. In a few island populations, however, the possibility of occasional overwater dispersals was partially supported and therefore could not be ruled out. PMID:25189776

  5. Supporting the Cybercrime Investigation Process: Effective Discrimination of Source Code Authors Based on Byte-Level Information

    NASA Astrophysics Data System (ADS)

    Frantzeskou, Georgia; Stamatatos, Efstathios; Gritzalis, Stefanos

    Source code authorship analysis is the field that attempts to identify the author of a computer program by treating each program as a linguistically analyzable entity, usually based on other undisputed program samples from the same author. There are several cases where the application of such a method could be of major benefit, such as tracing the source of code left in the system after a cyber attack, authorship disputes, and proof of authorship in court. In this paper, we present our approach, which is based on byte-level n-gram profiles and is an extension of a method that has been successfully applied to natural language text authorship attribution. We propose a simplified profile and a new similarity measure which is less complicated than the algorithm followed in text authorship attribution and seems more suitable for source code identification, since it is better able to deal with very small training sets. Experiments were performed on two different data sets, one with programs written in C++ and the second with programs written in Java. Unlike the traditional language-dependent metrics used by previous studies, our approach can be applied to any programming language with no additional cost. The presented accuracy rates are much better than the best reported results for the same data sets.
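
    A minimal sketch of the general idea: byte-level n-gram profiles compared with a set-overlap similarity. The profile length, the Jaccard-style normalization, and the toy inputs are illustrative assumptions, not necessarily the authors' exact measure:

    ```python
    from collections import Counter

    def byte_ngram_profile(source: bytes, n: int = 3, top: int = 500) -> set:
        """The `top` most frequent byte-level n-grams of a program."""
        grams = Counter(source[i:i + n] for i in range(len(source) - n + 1))
        return {g for g, _ in grams.most_common(top)}

    def profile_similarity(p1: set, p2: set) -> float:
        """Set-overlap similarity between two simplified profiles."""
        return len(p1 & p2) / len(p1 | p2)  # Jaccard-style normalization

    # Toy stand-ins for a disputed program and a known author sample
    disputed = byte_ngram_profile(b"for (int i = 0; i < n; ++i) { sum += a[i]; }")
    known    = byte_ngram_profile(b"for (int j = 0; j < m; ++j) { acc += b[j]; }")
    print(f"similarity = {profile_similarity(disputed, known):.3f}")
    ```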

  6. Independence of Internal Auditors.

    ERIC Educational Resources Information Center

    Montondon, Lucille; Meixner, Wilda F.

    1993-01-01

    A survey of 288 college and university auditors investigated patterns in their appointment, reporting, and supervisory practices as indicators of independence and objectivity. Results indicate a weakness in the positioning of internal auditing within institutions, possibly compromising auditor independence. Because the auditing function is…

  7. Unfolding the color code

    NASA Astrophysics Data System (ADS)

    Kubica, Aleksander; Yoshida, Beni; Pastawski, Fernando

    2015-08-01

    The topological color code and the toric code are two leading candidates for realizing fault-tolerant quantum computation. Here we show that the color code on a d-dimensional closed manifold is equivalent to multiple decoupled copies of the d-dimensional toric code up to local unitary transformations and adding or removing ancilla qubits. Our result not only generalizes the proven equivalence for d = 2, but also provides an explicit recipe for how to decouple independent components of the color code, highlighting the importance of colorability in the construction of the code. Moreover, for the d-dimensional color code with d+1 boundaries of d+1 distinct colors, we find that the code is equivalent to multiple copies of the d-dimensional toric code which are attached along a (d-1)-dimensional boundary. In particular, for d = 2, we show that the (triangular) color code with boundaries is equivalent to the (folded) toric code with boundaries. We also find that the d-dimensional toric code admits logical non-Pauli gates from the dth level of the Clifford hierarchy, and thus saturates the bound by Bravyi and König. In particular, we show that the logical d-qubit control-Z gate can be fault-tolerantly implemented on the stack of d copies of the toric code by a local unitary transformation.

  8. Tokamak Systems Code

    SciTech Connect

    Reid, R.L.; Barrett, R.J.; Brown, T.G.; Gorker, G.E.; Hooper, R.J.; Kalsi, S.S.; Metzler, D.H.; Peng, Y.K.M.; Roth, K.E.; Spampinato, P.T.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  9. MORSE Monte Carlo code

    SciTech Connect

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.

  10. Independent Living.

    ERIC Educational Resources Information Center

    Nathanson, Jeanne H., Ed.

    1994-01-01

    This issue of "OSERS" addresses the subject of independent living of individuals with disabilities. The issue includes a message from Judith E. Heumann, the Assistant Secretary of the Office of Special Education and Rehabilitative Services (OSERS), and 10 papers. Papers have the following titles and authors: "Changes in the Rehabilitation Act of…

  11. Numerical investigations on pressurized AL-composite vessel response to hypervelocity impacts: Comparison between experimental works and a numerical code

    NASA Astrophysics Data System (ADS)

    Mespoulet, Jérôme; Plassard, Fabien; Hereil, Pierre-Louis

    2015-09-01

    The response of pressurized composite-Al vessels to hypervelocity impacts of aluminum spheres has been numerically investigated to evaluate the influence of initial pressure on the vulnerability of these vessels. The investigated tanks are carbon-fiber-overwrapped prestressed Al vessels. The explored internal air pressure ranges from 1 bar to 300 bar, and impact velocities are around 4400 m/s. Data obtained from experiments (X-ray radiographs, particle velocity measurements, and post-mortem vessels) have been compared to numerical results given by LS-DYNA ALE-Lagrange-SPH full coupling models. The simulations underestimate the debris cloud evolution and the shock wave propagation in pressurized air, but the main modes of damage/rupture of the vessels given by the simulations are consistent with the post-mortem vessels recovered from the experiments. The first results of this numerical work are promising, and further simulation investigations with additional experimental data will be performed to increase the reliability of the simulation model. The final aim of this combined work is to numerically explore a wide range of impact conditions (impact angle, projectile mass, impact velocity, initial pressure) that cannot be explored experimentally. These results will define a rule of thumb for the definition of an analytical vulnerability model for a given pressurized vessel.

  12. Extension of the supercritical carbon dioxide brayton cycle to low reactor power operation: investigations using the coupled anl plant dynamics code-SAS4A/SASSYS-1 liquid metal reactor code system.

    SciTech Connect

    Moisseytsev, A.; Sienicki, J. J.

    2012-05-10

    Significant progress has been made on the development of a control strategy for the supercritical carbon dioxide (S-CO{sub 2}) Brayton cycle enabling removal of power from an autonomous load following Sodium-Cooled Fast Reactor (SFR) down to decay heat levels such that the S-CO{sub 2} cycle can be used to cool the reactor until decay heat can be removed by the normal shutdown heat removal system or a passive decay heat removal system such as Direct Reactor Auxiliary Cooling System (DRACS) loops with DRACS in-vessel heat exchangers. This capability of the new control strategy eliminates the need for use of a separate shutdown heat removal system which might also use supercritical CO{sub 2}. It has been found that this capability can be achieved by introducing a new control mechanism involving shaft speed control for the common shaft joining the turbine and two compressors following reduction of the load demand from the electrical grid to zero. Following disconnection of the generator from the electrical grid, heat is removed from the intermediate sodium circuit through the sodium-to-CO{sub 2} heat exchanger, the turbine solely drives the two compressors, and heat is rejected from the cycle through the CO{sub 2}-to-water cooler. To investigate the effectiveness of shaft speed control, calculations are carried out using the coupled Plant Dynamics Code-SAS4A/SASSYS-1 code for a linear load reduction transient for a 1000 MWt metallic-fueled SFR with autonomous load following. No deliberate motion of control rods or adjustment of sodium pump speeds is assumed to take place. It is assumed that the S-CO{sub 2} turbomachinery shaft speed linearly decreases from 100% to 20% nominal following reduction of grid load to zero. The reactor power is calculated to autonomously decrease down to 3% nominal, providing a lengthy window in time for the switchover to the normal shutdown heat removal system or for a passive decay heat removal system to become effective. However, the calculations reveal that the compressor conditions approach surge, such that the need for a surge control system for each compressor is identified. Thus, it is demonstrated that the S-CO{sub 2} cycle can operate in the initial decay heat removal mode even with autonomous reactor control. Because external power is not needed to drive the compressors, the results show that the S-CO{sub 2} cycle can be used for initial decay heat removal for a lengthy interval in time in the absence of any off-site electrical power; the turbine provides sufficient power to drive the compressors. Combined with autonomous reactor control, this represents a significant safety advantage of the S-CO{sub 2} cycle by maintaining removal of the reactor power until the core decay heat falls to levels well below those for which the passive decay heat removal system is designed. The new control strategy is an alternative to a split-shaft layout involving separate power and compressor turbines which had previously been identified as a promising approach enabling heat removal from an SFR at low power levels. The current results indicate that the split-shaft configuration does not provide any significant benefits for the S-CO{sub 2} cycle over the current single-shaft layout with shaft speed control. It has been demonstrated that when connected to the grid the single-shaft cycle can effectively follow the load over the entire range. No compressor speed variation is needed while power is delivered to the grid. When the system is disconnected from the grid, the shaft speed can be changed as effectively as it would be with the split-shaft arrangement. In the split-shaft configuration, zero generator power means disconnection of the power turbine, such that the resulting system will be almost identical to the single-shaft arrangement. Without this advantage of the split-shaft configuration, the economic benefits of the single-shaft arrangement, provided by just one turbine and lower losses at the design point, are more important to the overall cycle performance. Therefore, the single-shaft configuration shall be retained as the reference arrangement for S-CO{sub 2} cycle power converter preconceptual designs. Improvements to the ANL Plant Dynamics Code have been carried out. The major code improvement is the introduction of a restart capability, which simplifies investigation of control strategies for very long transients. Another code modification is the transfer of the entire code to a new Intel Fortran compiler; the execution of the code using the new compiler was verified by demonstrating that the same results are obtained as when the previous Compaq Visual Fortran compiler was used.

  13. On the Minimum Weight of Simple Full-Length Array LDPC Codes

    NASA Astrophysics Data System (ADS)

    Sugiyama, Kenji; Kaji, Yuichi

    We investigate the minimum weights of simple full-length array LDPC codes (SFA-LDPC codes). The SFA-LDPC codes are a subclass of LDPC codes constructed algebraically according to two integer parameters p and j. Mittelholzer and Yang et al. have studied the minimum weights of SFA-LDPC codes, but the exact minimum weights of the codes are not known except for some small p and j. In this paper, we show that the minimum weights of the SFA-LDPC codes with j=4 and j=5 are upper-bounded by 10 and 12, respectively, independently of the prime p. By combining these results with Yang's lower bounds, we can conclude that the minimum weights of the SFA-LDPC codes with j=4 and p>7 are exactly 10 and those of the SFA-LDPC codes with j=5 are 10 or 12.
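
    For reference, a (p, j) array LDPC parity-check matrix is commonly built as a j x p array of p x p circulant permutation matrices, with block (i, k) equal to the i*k-th power of the single cyclic shift. The sketch below follows that standard construction, which is assumed here since the abstract does not spell it out:

    ```python
    import numpy as np

    def array_ldpc_H(p: int, j: int) -> np.ndarray:
        """Parity-check matrix of the (p, j) array code (p prime): a j x p
        array of p x p circulant permutation matrices P^(i*k)."""
        P = np.roll(np.eye(p, dtype=int), 1, axis=1)  # single cyclic shift
        blocks = [[np.linalg.matrix_power(P, (i * k) % p) for k in range(p)]
                  for i in range(j)]
        return np.block(blocks)

    H = array_ldpc_H(p=7, j=4)         # 28 x 49 parity-check matrix
    print(H.shape, H.sum(axis=0)[:5])  # every column has weight j
    ```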

  14. System and method for investigating sub-surface features of a rock formation with acoustic sources generating coded signals

    DOEpatents

    Vu, Cung Khac; Nihei, Kurt; Johnson, Paul A; Guyer, Robert; Ten Cate, James A; Le Bas, Pierre-Yves; Larmat, Carene S

    2014-12-30

    A system and a method for investigating rock formations include generating, by a first acoustic source, a first acoustic signal comprising a first plurality of pulses, each pulse including a first modulated signal at a central frequency, and generating, by a second acoustic source, a second acoustic signal comprising a second plurality of pulses. A receiver arranged within the borehole receives a detected signal including a signal generated by a non-linear mixing process from the first and second acoustic signals in a non-linear mixing zone within the intersection volume. The method also includes processing the received signal to extract the signal generated by the non-linear mixing process over noise or over signals generated by a linear interaction process, or both.

  15. Application of a multi-block CFD code to investigate the impact of geometry modeling on centrifugal compressor flow field predictions

    SciTech Connect

    Hathaway, M.D.; Wood, J.R.

    1997-10-01

    CFD codes capable of utilizing multi-block grids provide the capability to analyze the complete geometry of centrifugal compressors. Attendant with this increased capability is potentially increased grid setup time and more computational overhead, with the resultant increase in wall clock time to obtain a solution. If the increase in difficulty of obtaining a solution significantly improves the solution from that obtained by modeling the features of the tip clearance flow or the typical bluntness of a centrifugal compressor's trailing edge, then the additional burden is worthwhile. However, if the additional information obtained is of marginal use, then modeling of certain features of the geometry may provide reasonable solutions for designers to make comparative choices when pursuing a new design. In this spirit, a sequence of grids was generated to study the relative importance of modeling versus detailed gridding of the tip gap and blunt trailing edge regions of the NASA large low-speed centrifugal compressor, for which there is considerable detailed internal laser anemometry data available for comparison. The results indicate: (1) There is no significant difference in predicted tip clearance mass flow rate whether the tip gap is gridded or modeled. (2) Gridding rather than modeling the trailing edge results in better predictions of some flow details downstream of the impeller, but otherwise appears to offer no great benefits. (3) The pitchwise variation of absolute flow angle decreases rapidly up to 8% impeller radius ratio and much more slowly thereafter. Although some improvements in prediction of flow field details are realized as a result of analyzing the actual geometry, there is no clear consensus that any of the grids investigated produced superior results in every case when compared to the measurements. However, if a multi-block code is available, it should be used, as it has the propensity for enabling better predictions than a single-block code.

  16. An Investigation into Reliability of Knee Extension Muscle Strength Measurements, and into the Relationship between Muscle Strength and Means of Independent Mobility in the Ward: Examinations of Patients Who Underwent Femoral Neck Fracture Surgery.

    PubMed

    Katoh, Munenori; Kaneko, Yoshihiro

    2014-01-01

    [Purpose] The purpose of the present study was to investigate the reliability of isometric knee extension muscle strength measurement of patients who underwent femoral neck fracture surgery, as well as the relationship between independent mobility in the ward and knee muscle strength. [Subjects] The subjects were 75 patients who underwent femoral neck fracture surgery. [Methods] We used a hand-held dynamometer and a belt to measure isometric knee extension muscle strength three times, and used intraclass correlation coefficients (ICCs) to investigate the reliability of the measurements. We used a receiver operating characteristic curve to investigate the cutoff values for independent walking with walking sticks and non-independent mobility. [Results] ICCs (1, 1) were 0.9 or higher. The cutoff value for independent walking with walking sticks was 0.289 kgf/kg on the non-fractured side, 0.193 kgf/kg on the fractured side, and the average of both limbs was 0.238 kgf/kg. [Conclusion] We consider that the test-retest reliability of isometric knee extension muscle strength measurement of patients who have undergone femoral neck fracture surgery is high. We also consider that isometric knee extension muscle strength is useful for investigating means of independent mobility in the ward. PMID:24567667
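
    Cutoff values like those reported are typically read off a receiver operating characteristic curve. A minimal sketch of choosing a cutoff by maximizing Youden's J statistic, with made-up strength data (not the study's measurements):

    ```python
    import numpy as np

    def youden_cutoff(values, labels):
        """Pick the cutoff maximizing sensitivity + specificity - 1, for a
        marker where higher values indicate the positive class."""
        best_j, best_cut = -1.0, None
        for cut in np.unique(values):
            pred = values >= cut
            sens = np.mean(pred[labels == 1])   # true-positive rate
            spec = np.mean(~pred[labels == 0])  # true-negative rate
            if sens + spec - 1.0 > best_j:
                best_j, best_cut = sens + spec - 1.0, cut
        return best_cut

    # Hypothetical strengths (kgf/kg) and labels (1 = walks independently)
    strength = np.array([0.15, 0.20, 0.24, 0.28, 0.31, 0.35])
    indep    = np.array([0,    0,    1,    0,    1,    1])
    print(youden_cutoff(strength, indep))
    ```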

  17. 'Independence' Panorama

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This is the Spirit 'Independence' panorama, acquired on martian days, or sols, 536 to 543 (July 6 to 13, 2005), from a position in the 'Columbia Hills' near the summit of 'Husband Hill.' The summit of 'Husband Hill' is the peak near the right side of this panorama and is about 100 meters (328 feet) away from the rover and about 30 meters (98 feet) higher in elevation. The rocky outcrops downhill and on the left side of this mosaic include 'Larry's Lookout' and 'Cumberland Ridge,' which Spirit explored in April, May, and June of 2005.

    The panorama spans 360 degrees and consists of 108 individual images, each acquired with five filters of the rover's panoramic camera. The approximate true color of the mosaic was generated using the camera's 750-, 530-, and 480-nanometer filters. During the 8 martian days, or sols, that it took to acquire this image, the lighting varied considerably, partly because of imaging at different times of sol, and partly because of small sol-to-sol variations in the dustiness of the atmosphere. These slight changes produced some image seams and rock shadows. These seams have been eliminated from the sky portion of the mosaic to better simulate the vista a person standing on Mars would see. However, it is often not possible or practical to smooth out such seams for regions of rock, soil, rover tracks or solar panels. Such is the nature of acquiring and assembling large panoramas from the rovers.

  18. Developing Research Skills: Independent Research Projects on Animals and Plants for Building the Research Skills of Report Writing, Mind Mapping, and Investigating through Inquiries. Revised Edition.

    ERIC Educational Resources Information Center

    Banks, Janet Caudill

    This book presents a collection of motivating, independent activities that involve animals and plants for use in developing the research skills of students in grades 2-6. Projects included in the book cover various levels of difficulty and are designed to promote higher-level thinking skills. Research components included in the activities in the…

  19. Phase II Evaluation of Clinical Coding Schemes

    PubMed Central

    Campbell, James R.; Carpenter, Paul; Sneiderman, Charles; Cohn, Simon; Chute, Christopher G.; Warren, Judith

    1997-01-01

    Objective: To compare three potential sources of controlled clinical terminology (READ codes version 3.1, SNOMED International, and Unified Medical Language System (UMLS) version 1.6) relative to attributes of completeness, clinical taxonomy, administrative mapping, term definitions and clarity (duplicate coding rate). Methods: The authors assembled 1929 source concept records from a variety of clinical information taken from four medical centers across the United States. The source data included medical as well as ample nursing terminology. The source records were coded in each scheme by an investigator and checked by the coding scheme owner. The codings were then scored by an independent panel of clinicians for acceptability. Codes were checked for definitions provided with the scheme. Codes for a random sample of source records were analyzed by an investigator for “parent” and “child” codes within the scheme. Parent and child pairs were scored by an independent panel of medical informatics specialists for clinical acceptability. Administrative and billing code mappings from the published schemes were reviewed for all coded records and analyzed by independent reviewers for accuracy. The investigator for each scheme exhaustively searched a sample of coded records for duplications. Results: SNOMED was judged to be significantly more complete in coding the source material than the other schemes (SNOMED* 70%; READ 57%; UMLS 50%; *p < .00001). SNOMED also had a richer clinical taxonomy judged by the number of acceptable first-degree relatives per coded concept (SNOMED* 4.56; UMLS 3.17; READ 2.14, *p < .005). Only the UMLS provided any definitions; these were found for 49% of records which had a coding assignment. READ and UMLS had better administrative mappings (composite score: READ* 40.6%; UMLS* 36.1%; SNOMED 20.7%, *p < .00001), and SNOMED had substantially more duplications of coding assignments (duplication rate: READ 0%; UMLS 4.2%; SNOMED* 13.9%, *p < .004) associated with a loss of clarity. Conclusion: No major terminology source can lay claim to being the ideal resource for a computer-based patient record. However, based upon this analysis of releases for April 1995, SNOMED International is considerably more complete, has a compositional nature and a richer taxonomy. It suffers from less clarity, resulting from a lack of syntax and evolutionary changes in its coding scheme. READ has greater clarity and better mapping to administrative schemes (ICD-10 and OPCS-4), is rapidly changing and is less complete. UMLS is a rich lexical resource, with mappings to many source vocabularies. It provides definitions for many of its terms. However, due to the varying granularities and purposes of its source schemes, it has limitations for representation of clinical concepts within a computer-based patient record. PMID:9147343

  20. Permutation codes for sources.

    NASA Technical Reports Server (NTRS)

    Berger, T.; Jelinek, F.; Wolf, J. K.

    1972-01-01

    Source encoding techniques based on permutation codes are investigated. For a broad class of distortion measures it is shown that optimum encoding of a source permutation code is easy to instrument even for very long block lengths. Also, the nonparametric nature of permutation encoding is well suited to situations involving unknown source statistics. For the squared-error distortion measure a procedure for generating good permutation codes of a given rate and block length is described. The performance of such codes for a memoryless Gaussian source is compared both with the rate-distortion function bound and with the performance of various quantization schemes. The comparison reveals that permutation codes are asymptotically ideal for small rates and perform as well as the best entropy-coded quantizers presently known for intermediate rates. They can be made to compare favorably at high rates, too, provided the coding delay associated with extremely long block lengths is tolerable.
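
    The easily instrumented optimum encoder for a permutation source code amounts to rank matching: the smallest source samples are replaced by the smallest codeword level, and so on, which minimizes squared error within the codebook of all arrangements. A minimal sketch (the levels and multiplicities are illustrative choices):

    ```python
    import numpy as np

    def permutation_encode(x, mu, counts):
        """Encode block x with the permutation code whose codewords are all
        arrangements of mu[0] repeated counts[0] times, etc.: the counts[0]
        smallest samples map to mu[0], the next counts[1] to mu[1], and so on."""
        assert len(x) == sum(counts)
        order = np.argsort(x)           # indices of samples in ascending order
        levels = np.repeat(mu, counts)  # sorted codeword template
        y = np.empty(len(x), dtype=float)
        y[order] = levels               # smallest sample gets smallest level
        return y

    x = np.array([0.3, -1.2, 0.8, 0.1, -0.4, 2.0])
    print(permutation_encode(x, mu=[-1.0, 0.0, 1.0], counts=[2, 2, 2]))
    ```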

  1. An extended version of the SERPENT-2 code to investigate fuel burn-up and core material evolution of the Molten Salt Fast Reactor

    NASA Astrophysics Data System (ADS)

    Aufiero, M.; Cammi, A.; Fiorina, C.; Leppänen, J.; Luzzi, L.; Ricotti, M. E.

    2013-10-01

    In this work, the Monte Carlo burn-up code SERPENT-2 has been extended and employed to study the material isotopic evolution of the Molten Salt Fast Reactor (MSFR). This promising GEN-IV nuclear reactor concept features peculiar characteristics such as the on-line fuel reprocessing, which prevents the use of commonly available burn-up codes. Besides, the presence of circulating nuclear fuel and radioactive streams from the core to the reprocessing plant requires a precise knowledge of the fuel isotopic composition during the plant operation. The developed extension of SERPENT-2 directly takes into account the effects of on-line fuel reprocessing on burn-up calculations and features a reactivity control algorithm. It is here assessed against a dedicated version of the deterministic ERANOS-based EQL3D procedure (PSI-Switzerland) and adopted to analyze the MSFR fuel salt isotopic evolution. Particular attention is devoted to study the effects of reprocessing time constants and efficiencies on the conversion ratio and the molar concentration of elements relevant for solubility issues (e.g., trivalent actinides and lanthanides). Quantities of interest for fuel handling and safety issues are investigated, including decay heat and activities of hazardous isotopes (neutron and high energy gamma emitters) in the core and in the reprocessing stream. The radiotoxicity generation is also analyzed for the MSFR nominal conditions. The production of helium and the depletion in tungsten content due to nuclear reactions are calculated for the nickel-based alloy selected as reactor structural material of the MSFR. These preliminary evaluations can be helpful in studying the radiation damage of both the primary salt container and the axial reflectors.
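
    Conceptually, on-line reprocessing adds a removal (and feed) term to the usual depletion balance for each nuclide. A deliberately simplified single-nuclide sketch with placeholder constants (not MSFR data, and not the SERPENT-2 implementation):

    ```python
    # Single-nuclide depletion balance with an on-line reprocessing removal
    # term and a constant feed; all constants are illustrative placeholders.
    LAMBDA    = 1e-9          # decay constant, 1/s
    SIGMA_PHI = 1e-24 * 1e15  # absorption cross section x neutron flux, 1/s
    TAU_REPRO = 30 * 86400.0  # reprocessing removal time constant, s
    FEED      = 1e10          # feed rate from the reprocessing plant, atoms/s

    def step(n_atoms: float, dt: float) -> float:
        """One explicit-Euler step of
        dN/dt = feed - (decay + absorption + reprocessing removal) * N."""
        loss = LAMBDA + SIGMA_PHI + 1.0 / TAU_REPRO
        return n_atoms + dt * (FEED - loss * n_atoms)

    n = 0.0
    for _ in range(24 * 365):  # one year in hourly steps
        n = step(n, dt=3600.0)
    print(f"approach to equilibrium inventory: {n:.3e} atoms")
    ```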

  2. Speech coding

    NASA Astrophysics Data System (ADS)

    Gersho, Allen

    1990-05-01

    Recent advances in algorithms and techniques for speech coding now permit high quality voice reproduction at remarkably low bit rates. The advent of powerful single-chip signal processors has made it cost effective to implement these new and sophisticated speech coding algorithms for many important applications in voice communication and storage. Some of the main ideas underlying the algorithms of major interest today are reviewed. The concept of removing redundancy by linear prediction is reviewed, first in the context of predictive quantization or DPCM. Then linear predictive coding, adaptive predictive coding, and vector quantization are discussed. The concepts of excitation coding via analysis-by-synthesis, vector sum excitation codebooks, and adaptive postfiltering are explained. The main ideas of vector excitation coding (VXC), or code-excited linear prediction (CELP), are presented. Finally, low-delay VXC coding and phonetic segmentation for VXC are described.

  3. Synthesizing Certified Code

    NASA Technical Reports Server (NTRS)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  4. True uniaxial compressive strengths of rock or coal specimens are independent of diameter-to-length ratios. Report of Investigations/1990

    SciTech Connect

    Babcock, C.O.

    1990-01-01

    Part of the compressive strength of a test specimen of rock or coal in the laboratory or a pillar in a mine comes from physical property strength and, in part, from the constraint provided by the loading stresses. Much confusion in pillar design comes from assigning the total strength change to geometry, as evidenced by the many pillar design equations with width to height as the primary variable. In tests by the U.S. Bureau of Mines, compressive strengths for cylindrical specimens of limestone, marble, sandstone, and coal were independent of the specimen test geometry when the end friction was removed. A conventional uniaxial compressive strength test between two steel platens is actually a uniaxial force and not a uniaxial stress test. The biaxial or triaxial state of stress for much of the test volume changes with the geometry of the test specimen. By removing the end friction supplied by the steel platens to the specimen, a more nearly uniaxial stress state independent of the specimen geometry is produced in the specimen. Pillar design is a constraint and physical property problem rather than a geometry problem. Roof and floor constraint are major factors in pillar design and strength.

  5. Investigation of plant control strategies for the supercritical CO{sub 2} Brayton cycle for a sodium-cooled fast reactor using the plant dynamics code.

    SciTech Connect

    Moisseytsev, A.; Sienicki, J.

    2011-04-12

    The development of a control strategy for the supercritical CO{sub 2} (S-CO{sub 2}) Brayton cycle has been extended to the investigation of alternate control strategies for a Sodium-Cooled Fast Reactor (SFR) nuclear power plant incorporating a S-CO{sub 2} Brayton cycle power converter. The SFR assumed is the 400 MWe (1000 MWt) ABR-1000 preconceptual design incorporating metallic fuel. Three alternative idealized schemes for controlling the reactor side of the plant, in combination with the existing automatic control strategy for the S-CO{sub 2} Brayton cycle, are explored using the ANL Plant Dynamics Code together with the SAS4A/SASSYS-1 Liquid Metal Reactor (LMR) Analysis Code System, coupled using the iterative coupling formulation previously developed and implemented into the Plant Dynamics Code. The first option assumes that the reactor side can be ideally controlled through movement of control rods and changing the speeds of both the primary and intermediate coolant system sodium pumps, such that the intermediate sodium flow rate and inlet temperature to the sodium-to-CO{sub 2} heat exchanger (RHX) remain unvarying, while the intermediate sodium outlet temperature changes as the load demand from the electric grid changes and the S-CO{sub 2} cycle conditions adjust according to the S-CO{sub 2} cycle control strategy. For this option, the reactor plant follows an assumed change in load demand from 100 to 0% nominal at a 5% reduction per minute in a suitable fashion. The second option allows the reactor core power and primary and intermediate coolant system sodium pump flow rates to change autonomously in response to the strong reactivity feedbacks of the metallic fueled core and assumed constant pump torques representing unchanging output from the pump electric motors. The plant behavior under the assumed load demand reduction is surprisingly close to that calculated for the first option. The only negative result observed is a slight increase, by about 10°C, in the intermediate inlet sodium temperature. This temperature rise could presumably be precluded or significantly reduced through fine adjustment of the control rods and pump motors. The third option assumes that the reactor core power and primary and intermediate system flow rates are ideally reduced linearly in a programmed fashion that instantaneously matches the prescribed load demand. The calculated behavior of this idealized case reveals a number of difficulties, because the control strategy for the S-CO{sub 2} cycle overcools the reactor, potentially resulting in calculated bulk sodium freezing and the onset of sodium boiling. The results show that autonomous SFR operation may be viable for the particular assumed load change transient and deserves further investigation for other transients and postulated accidents.
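
    As an illustration only, the third (programmed) option can be reduced to a setpoint schedule: power and flow setpoints track the prescribed load demand exactly. A minimal Python sketch, assuming the 100 to 0% ramp at 5% per minute from the abstract and reproducing none of the plant dynamics:

        import numpy as np

        t = np.arange(0.0, 21.0, 1.0)                  # minutes into the transient
        load = np.clip(100.0 - 5.0 * t, 0.0, 100.0)    # % nominal load demand ramp

        power_setpoint = load.copy()                   # core power tracks load 1:1
        primary_flow = load.copy()                     # pump flows reduced in proportion
        intermediate_flow = load.copy()

        for ti, p in zip(t[:5], power_setpoint[:5]):
            print(f"t = {ti:4.1f} min: power/flow setpoints = {p:5.1f}% nominal")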

  6. Uplink Coding

    NASA Technical Reports Server (NTRS)

    Pollara, Fabrizio; Hamkins, Jon; Dolinar, Sam; Andrews, Ken; Divsalar, Dariush

    2006-01-01

    This viewgraph presentation reviews uplink coding. The purpose and goals of the briefing are to (1) show a plan for using uplink coding and describe its benefits; (2) define possible solutions and their applicability to different types of uplink, including emergency uplink; (3) concur with the conclusions so that we can embark on a plan to use the proposed uplink system; (4) identify the need for the development of appropriate technology and its infusion into the DSN; and (5) gain advocacy to implement uplink coding in flight projects. Action Item EMB04-1-14: Show a plan for using uplink coding, including showing where it is useful or not (include discussion of emergency uplink coding).

  7. An investigative study of multispectral data compression for remotely-sensed images using vector quantization and difference-mapped shift-coding

    NASA Technical Reports Server (NTRS)

    Jaggi, S.

    1993-01-01

    A study is conducted to investigate the effects and advantages of data compression techniques on multispectral imagery data acquired by NASA's airborne scanners at the Stennis Space Center. The first technique used was vector quantization. The vector is defined in the multispectral imagery context as an array of pixels from the same location in each channel. The error obtained in substituting the reconstructed images for the original set is compared for different compression ratios. Also, the eigenvalues of the covariance matrix obtained from the reconstructed data set are compared with the eigenvalues of the original set. The effects of varying the size of the vector codebook on the quality of the compression and on subsequent classification are also presented. The output data from the vector quantization algorithm were further compressed by a lossless technique called Difference-mapped Shift-extended Huffman coding. The overall compression for 7 channels of data acquired by the Calibrated Airborne Multispectral Scanner (CAMS) was 195:1 (0.41 bpp) with an RMS error of 15.8 pixels, and 18:1 (0.447 bpp) with an RMS error of 3.6 pixels. The algorithms were implemented in software and interfaced, with the help of dedicated image processing boards, to an 80386 PC compatible computer. Modules were developed for the tasks of image compression and image analysis. Also, supporting software to perform image processing for visual display and interpretation of the compressed/classified images was developed.
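
    A minimal Python sketch of the vector quantization step, assuming each vector stacks the co-located pixels from all channels (as in the study) and training the codebook with plain k-means; the random image, codebook size, and iteration count are illustrative stand-ins:

        import numpy as np

        rng = np.random.default_rng(0)
        channels, h, w, K = 7, 64, 64, 256           # 7 bands, 256-entry codebook
        image = rng.random((channels, h, w))
        vectors = image.reshape(channels, -1).T      # one 7-vector per pixel location

        codebook = vectors[rng.choice(len(vectors), K, replace=False)]
        for _ in range(10):                          # Lloyd / k-means iterations
            d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
            labels = d.argmin(1)                     # nearest codeword per location
            for k in range(K):
                members = vectors[labels == k]
                if len(members):
                    codebook[k] = members.mean(0)

        # The compressed image is the label map plus the codebook:
        # log2(256) = 8 bits per location instead of 7 channels x 8 bits.
        reconstructed = codebook[labels].T.reshape(channels, h, w)
        print("RMS error:", np.sqrt(((image - reconstructed) ** 2).mean()))

    The label map produced here is exactly the kind of low-entropy output that a lossless back end such as the difference-mapped Huffman stage can compress further.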

  8. An investigation of the potential for the use of a high resolution adaptive coded aperture system in the mid-wave infrared

    NASA Astrophysics Data System (ADS)

    Slinger, Chris; Eismann, Michael; Gordon, Neil; Lewis, Keith; McDonald, Gregor; McNie, Mark; Payne, Doug; Ridley, Kevin; Strens, Malcolm; De Villiers, Geoff; Wilson, Rebecca

    2007-09-01

    Previous applications of coded aperture imaging (CAI) have been mainly in the energetic parts of the electromagnetic spectrum, such as gamma ray astronomy, where few viable imaging alternatives exist. In addition, resolution requirements have typically been low (~ mrad). This paper investigates the prospects for and advantages of using CAI at longer wavelengths (visible, infrared) and at higher resolutions, and also considers the benefits of adaptive CAI techniques. The latter enable CAI to achieve reconfigurable modes of imaging, as well as improving system performance in other ways, such as enhanced image quality. It is shown that adaptive CAI has several potential advantages over more traditional optical systems for some applications in these wavebands. The merits include low mass, volume and moments of inertia, potentially lower costs, graceful failure modes, steerable fields of regard with no macroscopic moving parts and inherently encrypted data streams. Among the challenges associated with this new imaging approach are the effects of diffraction, interference, photon absorption at the mask and the low scene contrasts in the infrared wavebands. The paper analyzes some of these and presents the results of some of the tradeoffs in optical performance, using radiometric calculations to illustrate the consequences in a mid-infrared application. A CAI system requires a decoding algorithm in order to form an image and the paper discusses novel approaches, tailored to longer wavelength operation. The paper concludes by presenting initial experimental results.
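
    The basic (non-adaptive) CAI decode step can be sketched as a correlation, assuming an idealized geometric model in which the detector records the scene cyclically convolved with a random binary mask; diffraction, interference, and mask absorption, which the paper identifies as key challenges at these wavelengths, are ignored here:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 64
        mask = (rng.random((n, n)) < 0.5).astype(float)   # 50%-open random aperture
        decoder = 2 * mask - 1                            # balanced decoding array

        scene = np.zeros((n, n))
        scene[20, 30] = 1.0                               # two point sources
        scene[40, 12] = 0.5

        # Cyclic convolution via FFT models the overlapping mask shadows;
        # decoding correlates the detector image with the decoding array.
        F = np.fft.fft2
        detector = np.real(np.fft.ifft2(F(scene) * F(mask)))
        estimate = np.real(np.fft.ifft2(F(detector) * np.conj(F(decoder))))

        print("brightest recovered pixel:",
              np.unravel_index(estimate.argmax(), estimate.shape))

    For a random mask the mask/decoder correlation approximates a delta function on a flat background, so the point sources reappear at their true locations up to sidelobe noise.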

  9. Investigation of the thermal response of a gasdynamic heater with helical impellers. Calspan Report No. 6961-A-1. [MAZE and TACO2D codes]

    SciTech Connect

    Rae, W. J.

    1981-12-01

    A gasdynamic heater, capable of producing contamination-free gas streams at temperatures up to 9000°K, is being developed by the Vulcan project. The design of a cooling system for the case parts and the associated thermal analysis are a critical part of a successful design. The purpose of the present work was to perform a preliminary cooling passage design and complete thermal analysis for the center body liner, end plate liners, and exit nozzle. The approach envisioned for this work was the use of a set of LLNL finite-element codes, called MAZE and TACO2D. These were to be used at LLNL in a series of visits by the Calspan principal investigator. The project was cancelled shortly after the first of these visits; this report contains a summary of the work accomplished during the abbreviated contract period, and a review of the items that will need to be considered when the work is resumed at some future date.

  10. Comet assay in reconstructed 3D human epidermal skin models—investigation of intra- and inter-laboratory reproducibility with coded chemicals

    PubMed Central

    Pfuhler, Stefan

    2013-01-01

    Reconstructed 3D human epidermal skin models are being used increasingly for safety testing of chemicals. Based on EpiDerm™ tissues, an assay was developed in which the tissues were topically exposed to test chemicals for 3 h followed by cell isolation and assessment of DNA damage using the comet assay. Inter-laboratory reproducibility of the 3D skin comet assay was initially demonstrated using two model genotoxic carcinogens, methyl methanesulfonate (MMS) and 4-nitroquinoline-N-oxide, and the results showed good concordance among three different laboratories and with in vivo data. In Phase 2 of the project, intra- and inter-laboratory reproducibility was investigated with five coded compounds with different genotoxicity liability tested at three different laboratories. For the genotoxic carcinogens MMS and N-ethyl-N-nitrosourea, all laboratories reported a dose-related and statistically significant increase (P < 0.05) in DNA damage in every experiment. For the genotoxic carcinogen, 2,4-diaminotoluene, the overall result from all laboratories showed a smaller, but significant genotoxic response (P < 0.05). For cyclohexanone (CHN) (non-genotoxic in vitro and in vivo, and non-carcinogenic), an increase compared to the solvent control acetone was observed only in one laboratory. However, the response was not dose related and CHN was judged negative overall, as was p-nitrophenol (p-NP) (genotoxic in vitro but not in vivo and non-carcinogenic), which was the only compound showing clear cytotoxic effects. For p-NP, significant DNA damage generally occurred only at doses that were substantially cytotoxic (>30% cell loss), and the overall response was comparable in all laboratories despite some differences in doses tested. The results of the collaborative study for the coded compounds were generally reproducible among the laboratories involved and intra-laboratory reproducibility was also good. These data indicate that the comet assay in EpiDerm™ skin models is a promising model for the safety assessment of compounds with a dermal route of exposure. PMID:24150594

  11. Material-dependent and material-independent selection processes in the frontal and parietal lobes: an event-related fMRI investigation of response competition

    NASA Technical Reports Server (NTRS)

    Hazeltine, Eliot; Bunge, Silvia A.; Scanlon, Michael D.; Gabrieli, John D. E.

    2003-01-01

    The present study used the flanker task [Percept. Psychophys. 16 (1974) 143] to identify neural structures that support response selection processes, and to determine which of these structures respond differently depending on the type of stimulus material associated with the response. Participants performed two versions of the flanker task while undergoing event-related functional magnetic resonance imaging (fMRI). Both versions of the task required participants to respond to a central stimulus regardless of the responses associated with simultaneously presented flanking stimuli, but one used colored circle stimuli and the other used letter stimuli. Competition-related activation was identified by comparing Incongruent trials, in which the flanker stimuli indicated a different response than the central stimulus, to Neutral stimuli, in which the flanker stimuli indicated no response. A region within the right inferior frontal gyrus exhibited significantly more competition-related activation for the color stimuli, whereas regions within the middle frontal gyri of both hemispheres exhibited more competition-related activation for the letter stimuli. The border of the right middle frontal and inferior frontal gyri and the anterior cingulate cortex (ACC) were significantly activated by competition for both types of stimulus materials. Posterior foci demonstrated a similar pattern: left inferior parietal cortex showed greater competition-related activation for the letters, whereas right parietal cortex was significantly activated by competition for both materials. These findings indicate that the resolution of response competition invokes both material-dependent and material-independent processes.

  12. An Evaluation of Two Different Methods of Assessing Independent Investigations in an Operational Pre-University Level Examination in Biology in England.

    ERIC Educational Resources Information Center

    Brown, Chris

    1998-01-01

    Explored aspects of the assessment of extended investigations ("projects") as practiced in the operational examinations of The University of Cambridge Local Examinations Syndicate (UCLES), from the perspective of construct validity. Samples of the 1993 (n=333) and 1996 (n=259) biology test results are used to compare the two methods of assessing the project. (MAK)

  13. Is ADHD a Risk Factor Independent of Conduct Disorder for Illicit Substance Use? A Meta-Analysis and Meta-Regression Investigation

    ERIC Educational Resources Information Center

    Serra-Pinheiro, Maria Antonia; Coutinho, Evandro S. F.; Souza, Isabella S.; Pinna, Camilla; Fortes, Didia; Araujo, Catia; Szobot, Claudia M.; Rohde, Luis A.; Mattos, Paulo

    2013-01-01

    Objective: To investigate meta-analytically if the association between ADHD and illicit substance use (ISU) is maintained when controlling for conduct disorder/oppositional-defiant disorder (CD/ODD). Method: A systematic literature review was conducted through Medline from 1980 to 2008. Data extracted and selections made by one author were…

  15. A Dynamic Population Model to Investigate Effects of Climate and Climate-Independent Factors on the Lifecycle of Amblyomma americanum (Acari: Ixodidae).

    PubMed

    Ludwig, Antoinette; Ginsberg, Howard S; Hickling, Graham J; Ogden, Nicholas H

    2016-01-01

    The lone star tick, Amblyomma americanum, is a disease vector of significance for human and animal health throughout much of the eastern United States. To model the potential effects of climate change on this tick, a better understanding is needed of the relative roles of temperature-dependent and temperature-independent (day-length-dependent behavioral or morphogenetic diapause) processes acting on the tick lifecycle. In this study, we explored the roles of these processes by simulating seasonal activity patterns using models with site-specific temperature and day-length-dependent processes. We first modeled the transitions from engorged larvae to feeding nymphs, engorged nymphs to feeding adults, and engorged adult females to feeding larvae. The simulated seasonal patterns were compared against field observations at three locations in the United States. Simulations suggested that 1) during the larva-to-nymph transition, some larvae undergo no diapause while others undergo morphogenetic diapause of engorged larvae; 2) molted adults undergo behavioral diapause during the transition from nymph-to-adult; and 3) there is no diapause during the adult-to-larva transition. A model constructed to simulate the full lifecycle of A. americanum successfully predicted observed tick activity at the three U.S. study locations. Some differences between observed and simulated seasonality patterns were observed, however, identifying the need for research to refine some model parameters. In simulations run using temperature data for Montreal, deterministic die-out of A. americanum populations did not occur, suggesting the possibility that current climate in parts of southern Canada is suitable for survival and reproduction of this tick. PMID:26502753

  16. A dynamic population model to investigate effects of climate and climate-independent factors on the lifecycle of the tick Amblyomma americanum

    USGS Publications Warehouse

    Ludwig, Antoinette; Ginsberg, Howard; Hickling, Graham J.; Ogden, Nicholas H.

    2015-01-01

    The lone star tick, Amblyomma americanum, is a disease vector of significance for human and animal health throughout much of the eastern United States. To model the potential effects of climate change on this tick, a better understanding is needed of the relative roles of temperature-dependent and temperature-independent (day-length-dependent behavioral or morphogenetic diapause) processes acting on the tick lifecycle. In this study, we explored the roles of these processes by simulating seasonal activity patterns using models with site-specific temperature and day-length-dependent processes. We first modeled the transitions from engorged larvae to feeding nymphs, engorged nymphs to feeding adults, and engorged adult females to feeding larvae. The simulated seasonal patterns were compared against field observations at three locations in the United States. Simulations suggested that 1) during the larva-to-nymph transition, some larvae undergo no diapause while others undergo morphogenetic diapause of engorged larvae; 2) molted adults undergo behavioral diapause during the transition from nymph-to-adult; and 3) there is no diapause during the adult-to-larva transition. A model constructed to simulate the full lifecycle of A. americanum successfully predicted observed tick activity at the three U.S. study locations. Some differences between observed and simulated seasonality patterns were observed, however, identifying the need for research to refine some model parameters. In simulations run using temperature data for Montreal, deterministic die-out of A. americanum populations did not occur, suggesting the possibility that current climate in parts of southern Canada is suitable for survival and reproduction of this tick.

  17. Phonological coding during reading

    PubMed Central

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Orden, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  18. Phonological coding during reading.

    PubMed

    Leinenger, Mallorie

    2014-11-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. PMID:25150679

  19. How do we code the letters of a word when we have to write it? Investigating double letter representation in French.

    PubMed

    Kandel, Sonia; Peereman, Ronald; Ghimenton, Anna

    2014-05-01

    How do we code the letters of a word when we have to write it? We examined whether the orthographic representations that the writing system activates have a specific coding for letters when these are doubled in a word. French participants wrote words on a digitizer. The word pairs shared the initial letters and differed on the presence of a double letter (e.g., LISSER/LISTER). The results on latencies and on letter and inter-letter interval durations revealed that L and I are slower to write when followed by a doublet (SS) than when not (ST). Doublet processing constitutes a supplementary cognitive load that delays word production. This suggests that word representations code letter identity and quantity separately. The data also revealed that the central processes that are involved in spelling representation cascade into the peripheral processes that regulate movement execution. PMID:24486807

  20. Functional Investigation of a Non-coding Variant Associated with Adolescent Idiopathic Scoliosis in Zebrafish: Elevated Expression of the Ladybird Homeobox Gene Causes Body Axis Deformation

    PubMed Central

    Guo, Long; Yamashita, Hiroshi; Kou, Ikuyo; Takimoto, Aki; Meguro-Horike, Makiko; Horike, Shin-ichi; Sakuma, Tetsushi; Miura, Shigenori; Adachi, Taiji; Yamamoto, Takashi; Ikegawa, Shiro; Hiraki, Yuji; Shukunami, Chisa

    2016-01-01

    Previously, we identified an adolescent idiopathic scoliosis susceptibility locus near human ladybird homeobox 1 (LBX1) and FLJ41350 by a genome-wide association study. Here, we characterized the associated non-coding variant and investigated the function of these genes. A chromosome conformation capture assay revealed that the genome region with the most significantly associated single nucleotide polymorphism (rs11190870) physically interacted with the promoter region of LBX1-FLJ41350. The promoter in the direction of LBX1, combined with a 590-bp region including rs11190870, had higher transcriptional activity with the risk allele than that with the non-risk allele in HEK 293T cells. The ubiquitous overexpression of human LBX1 or either of the zebrafish lbx genes (lbx1a, lbx1b, and lbx2), but not FLJ41350, in zebrafish embryos caused body curvature followed by death prior to vertebral column formation. Such body axis deformation was not observed in transcription activator-like effector nucleases mediated knockout zebrafish of lbx1b or lbx2. Mosaic expression of lbx1b driven by the GATA2 minimal promoter and the lbx1b enhancer in zebrafish significantly alleviated the embryonic lethal phenotype to allow observation of the later onset of the spinal curvature with or without vertebral malformation. Deformation of the embryonic body axis by lbx1b overexpression was associated with defects in convergent extension, which is a component of the main axis-elongation machinery in gastrulating embryos. In embryos overexpressing lbx1b, wnt5b, a ligand of the non-canonical Wnt/planar cell polarity (PCP) pathway, was significantly downregulated. Injection of mRNA for wnt5b or RhoA, a key downstream effector of Wnt/PCP signaling, rescued the defective convergent extension phenotype and attenuated the lbx1b-induced curvature of the body axis. Thus, our study presents a novel pathological feature of LBX1 and its zebrafish homologs in body axis deformation at various stages of embryonic and subsequent growth in zebrafish. PMID:26820155

  1. Functional Investigation of a Non-coding Variant Associated with Adolescent Idiopathic Scoliosis in Zebrafish: Elevated Expression of the Ladybird Homeobox Gene Causes Body Axis Deformation.

    PubMed

    Guo, Long; Yamashita, Hiroshi; Kou, Ikuyo; Takimoto, Aki; Meguro-Horike, Makiko; Horike, Shin-Ichi; Sakuma, Tetsushi; Miura, Shigenori; Adachi, Taiji; Yamamoto, Takashi; Ikegawa, Shiro; Hiraki, Yuji; Shukunami, Chisa

    2016-01-01

    Previously, we identified an adolescent idiopathic scoliosis susceptibility locus near human ladybird homeobox 1 (LBX1) and FLJ41350 by a genome-wide association study. Here, we characterized the associated non-coding variant and investigated the function of these genes. A chromosome conformation capture assay revealed that the genome region with the most significantly associated single nucleotide polymorphism (rs11190870) physically interacted with the promoter region of LBX1-FLJ41350. The promoter in the direction of LBX1, combined with a 590-bp region including rs11190870, had higher transcriptional activity with the risk allele than that with the non-risk allele in HEK 293T cells. The ubiquitous overexpression of human LBX1 or either of the zebrafish lbx genes (lbx1a, lbx1b, and lbx2), but not FLJ41350, in zebrafish embryos caused body curvature followed by death prior to vertebral column formation. Such body axis deformation was not observed in transcription activator-like effector nucleases mediated knockout zebrafish of lbx1b or lbx2. Mosaic expression of lbx1b driven by the GATA2 minimal promoter and the lbx1b enhancer in zebrafish significantly alleviated the embryonic lethal phenotype to allow observation of the later onset of the spinal curvature with or without vertebral malformation. Deformation of the embryonic body axis by lbx1b overexpression was associated with defects in convergent extension, which is a component of the main axis-elongation machinery in gastrulating embryos. In embryos overexpressing lbx1b, wnt5b, a ligand of the non-canonical Wnt/planar cell polarity (PCP) pathway, was significantly downregulated. Injection of mRNA for wnt5b or RhoA, a key downstream effector of Wnt/PCP signaling, rescued the defective convergent extension phenotype and attenuated the lbx1b-induced curvature of the body axis. Thus, our study presents a novel pathological feature of LBX1 and its zebrafish homologs in body axis deformation at various stages of embryonic and subsequent growth in zebrafish. PMID:26820155

  2. Image coding.

    PubMed

    Kunt, M

    1988-01-01

    The digital representation of an image requires a very large number of bits. The goal of image coding is to reduce this number, as much as possible, and reconstruct a faithful duplicate of the original picture. Early efforts in image coding, solely guided by information theory, led to a plethora of methods. The compression ratio reached a saturation level around 10:1 a couple of years ago. Recent progress in the study of the brain mechanism of vision and scene analysis has opened new vistas in picture coding. Directional sensitivity of the neurones in the visual pathway combined with the separate processing of contours and textures has led to a new class of coding methods capable of achieving compression ratios as high as 100:1. PMID:3072645

  3. Getting Students to be Successful, Independent Investigators

    ERIC Educational Resources Information Center

    Thomas, Jeffrey D.

    2010-01-01

    Middle school students often struggle when writing testable problems, planning valid and reliable procedures, and drawing meaningful evidence-based conclusions. To address this issue, the author created a student-centered lab handout to facilitate the inquiry process for students. This handout has reduced students' frustration and helped them…

  4. Independent Peer Reviews

    SciTech Connect

    2012-03-16

    Independent Assessments: DOE's Systems Integrator convenes independent technical reviews to gauge progress toward meeting specific technical targets and to provide technical information necessary for key decisions.

  5. QR Codes

    ERIC Educational Resources Information Center

    Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien

    2013-01-01

    This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…

  6. Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the objectives, meeting goals and overall NASA goals for the NASA Data Standards Working Group. The presentation includes information on the technical progress surrounding the objective, short LDPC codes, and the general results on the Pu-Pw tradeoff.

  7. Subcortical neural synchrony and absolute thresholds predict frequency discrimination independently.

    PubMed

    Marmel, F; Linley, D; Carlyon, R P; Gockel, H E; Hopkins, K; Plack, C J

    2013-10-01

    The neural mechanisms of pitch coding have been debated for more than a century. The two main mechanisms are coding based on the profiles of neural firing rates across auditory nerve fibers with different characteristic frequencies (place-rate coding), and coding based on the phase-locked temporal pattern of neural firing (temporal coding). Phase locking precision can be partly assessed by recording the frequency-following response (FFR), a scalp-recorded electrophysiological response that reflects synchronous activity in subcortical neurons. Although features of the FFR have been widely used as indices of pitch coding acuity, only a handful of studies have directly investigated the relation between the FFR and behavioral pitch judgments. Furthermore, the contribution of degraded neural synchrony (as indexed by the FFR) to the pitch perception impairments of older listeners and those with hearing loss is not well known. Here, the relation between the FFR and pure-tone frequency discrimination was investigated in listeners with a wide range of ages and absolute thresholds, to assess the respective contributions of subcortical neural synchrony and other age-related and hearing loss-related mechanisms to frequency discrimination performance. FFR measures of neural synchrony and absolute thresholds independently contributed to frequency discrimination performance. Age alone, i.e., once the effect of subcortical neural synchrony measures or absolute thresholds had been partialed out, did not contribute to frequency discrimination. Overall, the results suggest that frequency discrimination of pure tones may depend both on phase locking precision and on separate mechanisms affected in hearing loss. PMID:23760984

  8. Constructions for finite-state codes

    NASA Technical Reports Server (NTRS)

    Pollara, F.; Mceliece, R. J.; Abdel-Ghaffar, K.

    1987-01-01

    A class of codes called finite-state (FS) codes is defined and investigated. These codes, which generalize both block and convolutional codes, are defined by their encoders, which are finite-state machines with parallel inputs and outputs. A family of upper bounds on the free distance of a given FS code is derived from known upper bounds on the minimum distance of block codes. A general construction for FS codes is then given, based on the idea of partitioning a given linear block code into cosets of one of its subcodes, and it is shown that in many cases the FS codes constructed in this way have a free distance d_free that is as large as possible. These codes are found without the need for lengthy computer searches, and have potential applications for future deep-space coding systems. The issue of catastrophic error propagation (CEP) for FS codes is also investigated.
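
    A finite-state encoder of the kind generalized by the paper can be sketched in a few lines of Python. The example below is an ordinary rate-1/2, memory-2 convolutional encoder (generators 7 and 5 octal) expressed as a finite-state machine with a parallel two-bit output; it is illustrative only and is not one of the constructions from the paper:

        def fs_encode(bits, state=0):
            """Finite-state encoder: each input bit updates a 2-bit state
            and emits 2 output bits (rate 1/2)."""
            out = []
            for b in bits:
                reg = (b << 2) | state                 # current bit + 2-bit state
                c0 = bin(reg & 0b111).count("1") & 1   # taps 111 -> generator 7
                c1 = bin(reg & 0b101).count("1") & 1   # taps 101 -> generator 5
                out += [c0, c1]
                state = (reg >> 1) & 0b11              # shift register forward
            return out

        print(fs_encode([1, 0, 1, 1]))   # [1, 1, 1, 0, 0, 0, 0, 1]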

  9. Covariance Matrix Evaluations for Independent Mass Fission Yields

    SciTech Connect

    Terranova, N.; Serot, O.; Archier, P.; De Saint Jean, C.

    2015-01-15

    Recent needs for more accurate fission product yields include covariance information to allow improved uncertainty estimations of the parameters used by design codes. The aim of this work is to investigate the possibility of generating more reliable and complete uncertainty information on independent mass fission yields. Mass yield covariances are estimated through a convolution between the multi-Gaussian empirical model based on Brosa's fission modes, which describes the pre-neutron mass yields, and the average prompt neutron multiplicity curve. The covariance generation task has been approached using the Bayesian generalized least-squares method through the CONRAD code. Preliminary results on the mass-yield variance-covariance matrix will be presented and discussed on physical grounds in the case of the {sup 235}U(n{sub th}, f) and {sup 239}Pu(n{sub th}, f) reactions.
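
    The multi-Gaussian-plus-uncertainty idea can be sketched by Monte Carlo in Python: perturb the parameters of a Brosa-mode-like pre-neutron yield model and accumulate the empirical covariance of the sampled mass-yield curves. All mode means, widths, weights, and uncertainties below are illustrative placeholders, not evaluated values, and the CONRAD generalized least-squares machinery is not reproduced:

        import numpy as np

        rng = np.random.default_rng(2)
        A = np.arange(60, 181)                       # fragment mass numbers

        def yields(mu, sigma, w):
            y = sum(wi * np.exp(-0.5 * ((A - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
                    for m, s, wi in zip(mu, sigma, w))
            return y / y.sum() * 2.0                 # normalize to 2 fragments/fission

        samples = []
        for _ in range(5000):                        # perturb the mode parameters
            mu = rng.normal([95.0, 140.0], 0.5)      # light/heavy mode positions
            sigma = rng.normal([5.0, 5.5], 0.1)      # mode widths
            w = rng.normal([1.0, 1.0], 0.02)         # mode weights
            samples.append(yields(mu, sigma, w))

        cov = np.cov(np.array(samples).T)            # mass-yield covariance matrix
        corr = cov / np.sqrt(np.outer(cov.diagonal(), cov.diagonal()))
        print(cov.shape, corr[30, 90])               # e.g. a light-heavy correlation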

  10. Covariance Matrix Evaluations for Independent Mass Fission Yields

    NASA Astrophysics Data System (ADS)

    Terranova, N.; Serot, O.; Archier, P.; De Saint Jean, C.; Sumini, M.

    2015-01-01

    Recent needs for more accurate fission product yields include covariance information to allow improved uncertainty estimations of the parameters used by design codes. The aim of this work is to investigate the possibility of generating more reliable and complete uncertainty information on independent mass fission yields. Mass yield covariances are estimated through a convolution between the multi-Gaussian empirical model based on Brosa's fission modes, which describes the pre-neutron mass yields, and the average prompt neutron multiplicity curve. The covariance generation task has been approached using the Bayesian generalized least-squares method through the CONRAD code. Preliminary results on the mass-yield variance-covariance matrix will be presented and discussed on physical grounds in the case of the 235U(nth, f) and 239Pu(nth, f) reactions.

  11. Experimental investigation of neutronic characteristics of the IR-8 reactor to confirm the results of calculations by MCU-PTR code

    NASA Astrophysics Data System (ADS)

    Surkov, A. V.; Kochkin, V. N.; Pesnya, Yu. E.; Nasonov, V. A.; Vihrov, V. I.; Erak, D. Yu.

    2015-12-01

    A comparison of measured and calculated neutronic characteristics (fast neutron flux and fission rate of 235U) in the core and reflector of the IR-8 reactor is presented. The irradiation devices equipped with neutron activation detectors were prepared. The determination of fast neutron flux was performed using the 54Fe(n, p) and 58Ni(n, p) reactions. The 235U fission rate was measured using uranium dioxide with 10% enrichment in 235U. The determination of specific activities of detectors was carried out by measuring the intensity of characteristic gamma peaks using the ORTEC gamma spectrometer. Neutron fields in the core and reflector of the IR-8 reactor were calculated using the MCU-PTR code.

  12. Experimental investigation of neutronic characteristics of the IR-8 reactor to confirm the results of calculations by MCU-PTR code

    SciTech Connect

    Surkov, A. V. Kochkin, V. N.; Pesnya, Yu. E.; Nasonov, V. A.; Vihrov, V. I.; Erak, D. Yu.

    2015-12-15

    A comparison of measured and calculated neutronic characteristics (fast neutron flux and fission rate of {sup 235}U) in the core and reflector of the IR-8 reactor is presented. The irradiation devices equipped with neutron activation detectors were prepared. The determination of fast neutron flux was performed using the {sup 54}Fe (n, p) and {sup 58}Ni (n, p) reactions. The {sup 235}U fission rate was measured using uranium dioxide with 10% enrichment in {sup 235}U. The determination of specific activities of detectors was carried out by measuring the intensity of characteristic gamma peaks using the ORTEC gamma spectrometer. Neutron fields in the core and reflector of the IR-8 reactor were calculated using the MCU-PTR code.

  13. Agent-independent planning

    NASA Technical Reports Server (NTRS)

    Davis, William S.

    1990-01-01

    Viewgraphs and discussion on agent-independent planning are presented. Topics covered include: definitions; Space Station Freedom robotics environment; transition from crewmember to robots; agent-independent planning system flow; independence between plans and agents; existing testbed; benefits of approach; and directions of future research.

  14. Independent Study in Idaho.

    ERIC Educational Resources Information Center

    Idaho Univ., Moscow.

    This guide to independent study in Idaho begins with introductory information on the following aspects of independent study: the Independent Study in Idaho consortium, student eligibility, special needs, starting dates, registration, costs, textbooks and instructional materials, e-mail and faxing, refunds, choosing a course, time limits, speed…

  15. Nevada Nuclear Waste Storage Investigations Project: Unit evaluation at Yucca Mountain, Nevada Test Site: Near-field thermal and mechanical calculations using the SANDIA-ADINA code

    SciTech Connect

    Johnson, R.L.; Bauer, S.J.

    1987-05-01

    Presented in this report are the results of a comparative study of two candidate horizons, the welded, devitrified Topopah Spring Member of the Paintbrush Tuff, and the nonwelded, zeolitized Tuffaceous Beds of Calico Hills. The mechanical and thermomechanical response of these two horizons was assessed by conducting thermal and thermomechanical calculations using a two-dimensional room-and-pillar geometry of the vertical waste emplacement option, with average and limit properties for each. A modified version of the computer code ADINA (SANDIA-ADINA) containing a material model for rock masses with ubiquitous jointing was used in the calculations. Results of the calculations are presented as the units` capacity for storage of nuclear waste and the stability of the emplacement room and pillar due to excavation and long-term heating. A comparison is made with a similar underground opening geometry sited in Grouse Canyon Tuff, using properties obtained from G-Tunnel - a horizon of known excavation characteristics. Long-term stability of the excavated rooms was predicted for all units, as determined by evaluating regions of predicted joint slip resulting from excavation and subsequent thermal loading, evaluating regions of predicted rock matrix failure resulting from excavation and subsequent thermal loading, and evaluating safety factors against rock matrix failure. These results were derived by considering a wide range of material properties and in situ stresses. 21 refs., 21 figs., 5 tabs.

  16. Film Festivals: A First Step for Independents.

    ERIC Educational Resources Information Center

    Manning, Nick

    In order for filmmaking to be a true art form, the filmmaker needs to be free both to conceive and realize a personal vision and to remain independent of rating codes, length prescriptions, the market, sterile formats, and other imposed limitations. Moreover, if noncommercial films are to succeed in the next decade, a respectful audience must be…

  17. Multiple wavelet-tree-based image coding and robust transmission

    NASA Astrophysics Data System (ADS)

    Cao, Lei; Chen, Chang Wen

    2004-10-01

    In this paper, we present techniques based on multiple wavelet-tree coding for robust image transmission. The algorithm of set partitioning in hierarchical trees (SPIHT) is a state-of-the-art technique for image compression. This variable length coding (VLC) technique, however, is extremely sensitive to channel errors. To improve the error resilience capability and in the meantime to keep the high source coding efficiency through VLC, we propose to encode each wavelet tree or a group of wavelet trees using the SPIHT algorithm independently. Instead of encoding the entire image as one bitstream, multiple bitstreams are generated. Therefore, error propagation is limited within an individual bitstream. Two methods based on subsampling and human visual sensitivity are proposed to group the wavelet trees. The multiple bitstreams are further protected by rate-compatible punctured convolutional (RCPC) codes. Unequal error protection is provided both for different bitstreams and for different bit segments inside each bitstream. We also investigate the improvement of error resilience through error resilient entropy coding (EREC) and wavelet tree coding when channel corruption is slight. A simple post-processing technique is also proposed to alleviate the effect of residual errors. We demonstrate through simulations that systems with these techniques can achieve much better performance than systems transmitting a single bitstream in noisy environments.
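
    The subsampling-based grouping can be sketched simply: spatially interleaved wavelet-tree roots are assigned to different bitstreams, so losing one stream removes every fourth tree rather than a contiguous image region. A minimal Python sketch under that assumption (the SPIHT coding of each group is not reproduced), with the 8x8 coarsest band and four streams as illustrative choices:

        import numpy as np

        def group_tree_roots(coarse_band, n_streams=4):
            """Assign each wavelet-tree root to a bitstream by 2x2 interleaving."""
            h, w = coarse_band.shape
            groups = {k: [] for k in range(n_streams)}
            for i in range(h):
                for j in range(w):
                    k = (i % 2) * 2 + (j % 2)    # neighbors land in different streams
                    groups[k].append((i, j))
            return groups

        roots = np.zeros((8, 8))                  # 64 tree roots in the LL band
        g = group_tree_roots(roots)
        print({k: len(v) for k, v in g.items()})  # 16 trees per stream

    Because each lost tree is surrounded by trees from surviving streams, its coefficients can be concealed by interpolation from its neighbors.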

  18. Seals Code Development Workshop

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C. (Compiler); Liang, Anita D. (Compiler)

    1996-01-01

    The 1995 Seals Workshop industrial code (INDSEAL) release includes ICYL, GCYLT, IFACE, GFACE, SPIRALG, SPIRALI, DYSEAL, and KTK. The scientific code (SCISEAL) release includes conjugate heat transfer and multidomain capability with rotordynamics. Several seals and bearings codes (e.g., HYDROFLEX, HYDROTRAN, HYDROB3D, FLOWCON1, FLOWCON2) are presented and their results compared. Current computational and experimental emphasis includes multiply connected cavity flows, with goals of reducing parasitic losses and gas ingestion. Labyrinth seals continue to play a significant role in sealing, with face, honeycomb, and new sealing concepts under investigation for advanced engine concepts in view of strict environmental constraints. The clean-sheet approach to engine design is advocated, with program directions and anticipated percentage SFC reductions cited. Future activities center on engine applications with coupled seal/power/secondary flow streams.

  19. Content Independence in Multimedia Databases.

    ERIC Educational Resources Information Center

    de Vries, Arjen P.

    2001-01-01

    Investigates the role of data management in multimedia digital libraries, and its implications for the design of database management systems. Introduces the notions of content abstraction and content independence. Proposes a blueprint of a new class of database technology, which supports the basic functionality for the management of both content…

  1. Implementation issues in source coding

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Chen, Yun-Chung; Hadenfeldt, A. C.

    1989-01-01

    An edge preserving image coding scheme which can be operated in both a lossy and a lossless manner was developed. The technique is an extension of the lossless encoding algorithm developed for the Mars Observer spectral data. It can also be viewed as a modification of the DPCM algorithm. A packet video simulator was also developed by modifying an existing packet network simulator. The coding scheme for this system is a modification of the mixture block coding (MBC) scheme described in the last report. Coding algorithms for packet video were also investigated.

  2. Role of long non-coding RNA HULC in cell proliferation, apoptosis and tumor metastasis of gastric cancer: a clinical and in vitro investigation.

    PubMed

    Zhao, Yan; Guo, Qinhao; Chen, Jiejing; Hu, Jun; Wang, Shuwei; Sun, Yueming

    2014-01-01

    Long non-coding RNAs (lncRNAs) are emerging as key molecules in human cancer. Highly upregulated in liver cancer (HULC), an lncRNA, has recently been revealed to be involved in hepatocellular carcinoma development and progression. It remains unclear, however, whether HULC plays an oncogenic role in human gastric cancer (GC). In the present study, we demonstrated that HULC was significantly overexpressed in GC cell lines and GC tissues compared with normal controls, and this overexpression was correlated with lymph node metastasis, distant metastasis and advanced tumor-node-metastasis stages. In addition, a receiver operating characteristic (ROC) curve was constructed to evaluate the diagnostic value; the area under the ROC curve for HULC was up to 0.769. To uncover its functional importance, gain- and loss-of-function studies were performed to evaluate the effect of HULC on cell proliferation, apoptosis and invasion in vitro. Overexpression of HULC promoted proliferation and invasion and inhibited apoptosis in SGC7901 cells, while knockdown of HULC in SGC7901 cells showed the opposite effect. Mechanistically, we discovered that overexpression of HULC could induce autophagy in SGC7901 cells; more importantly, inhibition of autophagy increased apoptosis in HULC-overexpressing cells. We also determined that silencing of HULC effectively reversed the epithelial-to-mesenchymal transition (EMT) phenotype. In summary, our results suggest that HULC may play an important role in the growth and tumorigenesis of human GC, which provides us with a new biomarker in GC and perhaps a potential target for GC prevention, diagnosis and therapeutic treatment. PMID:24247585

  3. An Approach to Keeping Independent Colleges Independent.

    ERIC Educational Resources Information Center

    Northwest Area Foundation, St. Paul, Minn.

    As a result of the financial difficulties faced by independent colleges in the northwestern United States, the Northwest Area Foundation in 1972 surveyed the administrations of 80 private colleges to get a profile of the colleges, a list of their current problems, and some indication of how the problems might be approached. The three top problems…

  4. The Integrated TIGER Series Codes

    SciTech Connect

    Kensek, Ronald P.; Franke, Brian C.; Laub, Thomas W.

    2006-01-15

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  5. The Integrated TIGER Series Codes

    Energy Science and Technology Software Center (ESTSC)

    2006-01-15

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  6. CONTAIN independent peer review

    SciTech Connect

    Boyack, B.E.; Corradini, M.L.; Denning, R.S.; Khatib-Rahbar, M.; Loyalka, S.K.; Smith, P.N.

    1995-01-01

    The CONTAIN code was developed by Sandia National Laboratories under the sponsorship of the US Nuclear Regulatory Commission (NRC) to provide integrated analyses of containment phenomena. It is used to predict nuclear reactor containment loads, radiological source terms, and associated physical phenomena for a range of accident conditions encompassing both design-basis and severe accidents. The code`s targeted applications include support for containment-related experimental programs, light water and advanced light water reactor plant analysis, and analytical support for resolution of specific technical issues such as direct containment heating. The NRC decided that a broad technical review of the code should be performed by technical experts to determine its overall technical adequacy. For this purpose, a six-member CONTAIN Peer Review Committee was organized and a peer review was conducted. While the review was in progress, the NRC issued a draft ``Revised Severe Accident Code Strategy`` that incorporated revised design objectives and targeted applications for the CONTAIN code. The Committee continued its effort to develop findings relative to the original NRC statement of design objectives and targeted applications; however, the revised design objectives and targeted applications were considered by the Committee in assigning priorities to its recommendations. The Committee determined that some improvements are warranted and provided recommendations in five code-related areas: (1) documentation, (2) user guidance, (3) modeling capability, (4) code assessment, and (5) technical assessment.

  7. A preliminary investigation of Large Eddy Simulation (LES) of the flow around a cylinder at ReD = 3900 using a commercial CFD code

    SciTech Connect

    Paschkewitz, J S

    2006-02-14

    Engineering fluid mechanics simulations at high Reynolds numbers have traditionally been performed using the Reynolds-Averaged Navier Stokes (RANS) equations and a turbulence model. The RANS methodology has well-documented shortcomings in the modeling of separated or bluff body wake flows that are characterized by unsteady vortex shedding. The resulting turbulence statistics are strongly influenced by the detailed structure and dynamics of the large eddies, which are poorly captured using RANS models (Rodi 1997; Krishnan et al. 2004). The Large Eddy Simulation (LES) methodology offers the potential to more accurately simulate these flows as it resolves the large-scale unsteady motions and entails modeling of only the smallest-scale turbulence structures. Commercial computational fluid dynamics products are beginning to offer LES capability, allowing practicing engineers an opportunity to apply this turbulence modeling technique to a much wider array of problems than in dedicated research codes. Here, we present a preliminary evaluation of the LES capability in the commercial CFD solver StarCD by simulating the flow around a cylinder at a Reynolds number based on the cylinder diameter, D, of 3900 using the constant coefficient Smagorinsky LES model. The results are compared to both the experimental and computational results provided in Kravchenko & Moin (2000). We find that StarCD provides predictions of lift and drag coefficients that are within 15% of the experimental values. Reasonable agreement is obtained between the time-averaged velocity statistics and the published data. The differences in these metrics may be due to the use of a truncated domain in the spanwise direction and the short time-averaging period used for the statistics presented here. The instantaneous flow field visualizations show a coarser, larger-scale structure than the study of Kravchenko & Moin (2000), which may be a product of the LES implementation or of the domain and resolution used. Based on this preliminary study, we conclude that StarCD's LES implementation may be useful for low Reynolds number LES computations if proper care is used in the problem and mesh definition.
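
    For reference, the constant-coefficient Smagorinsky model used in the study closes the filtered equations with an eddy viscosity nu_t = (Cs*Delta)^2*|S|, where |S| is the resolved strain-rate magnitude. A minimal Python sketch on a 2D field, with Cs = 0.1 and a Taylor-Green-like velocity chosen purely for illustration (not StarCD settings):

        import numpy as np

        def smagorinsky_nu_t(u, v, dx, Cs=0.1):
            dudy, dudx = np.gradient(u, dx)    # axis 0 is y, axis 1 is x
            dvdy, dvdx = np.gradient(v, dx)
            Sxx, Syy = dudx, dvdy              # resolved strain-rate components
            Sxy = 0.5 * (dudy + dvdx)
            S_mag = np.sqrt(2.0 * (Sxx**2 + Syy**2 + 2.0 * Sxy**2))  # sqrt(2 Sij Sij)
            return (Cs * dx) ** 2 * S_mag      # nu_t = (Cs * Delta)^2 * |S|

        x = np.linspace(0.0, 2.0 * np.pi, 65)
        X, Y = np.meshgrid(x, x)
        u = np.sin(X) * np.cos(Y)              # Taylor-Green-like test field
        v = -np.cos(X) * np.sin(Y)
        print("max subgrid viscosity:", smagorinsky_nu_t(u, v, x[1] - x[0]).max())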

  8. Codes with special correlation.

    NASA Technical Reports Server (NTRS)

    Baumert, L. D.

    1964-01-01

    Uniform binary codes with special correlation, including transorthogonality and simplex codes, Hadamard matrices, and difference sets.

  9. Investigating the Influence of the Added Mass Effect to Marine Hydrokinetic Horizontal-Axis Turbines Using a General Dynamic Wake Wind Turbine Code

    SciTech Connect

    Maniaci, D. C.; Li, Y.

    2011-10-01

    This paper describes a recent study to investigate the applicability of a horizontal-axis wind turbine (HAWT) structural dynamics and unsteady aerodynamics analysis program (FAST and AeroDyn respectively) to modeling the forces on marine hydrokinetic (MHK) turbines. This paper summarizes the added mass model that has been added to AeroDyn. The added mass model only includes flow acceleration perpendicular to the rotor disc, and ignores added mass forces caused by blade deflection. A model of the National Renewable Energy Laboratory's (NREL) Unsteady Aerodynamics Experiment (UAE) Phase VI wind turbine was analyzed using FAST and AeroDyn with sea water conditions and the new added mass model. The results of this analysis exhibited a 3.6% change in thrust for a rapid pitch case and a slight change in amplitude and phase of thrust for a case with 30 degrees of yaw.
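
    A minimal sketch (Python/NumPy) of the added mass term as characterized above, i.e. a force proportional to the flow acceleration normal to the rotor disc, F_am = rho*Ca*V*dU/dt. The coefficient, reference volume, and inflow history are illustrative assumptions, not AeroDyn's internals.

        import numpy as np

        rho = 1025.0    # sea water density, kg/m^3
        ca = 1.0        # added-mass coefficient (assumed)
        vol = 0.5       # reference displaced volume, m^3 (assumed)

        t = np.linspace(0.0, 2.0, 200)
        u_axial = 2.0 + 0.5 * np.sin(2.0 * np.pi * t)   # axial inflow, m/s (assumed)
        du_dt = np.gradient(u_axial, t)                  # flow acceleration

        f_am = rho * ca * vol * du_dt                    # added-mass force, N
        print("peak added-mass force: %.0f N" % f_am.max())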

  10. Investigating the Influence of the Added Mass Effect to Marine Hydrokinetic Horizontal-Axis Turbines Using a General Dynamic Wake Wind Turbine Code: Preprint

    SciTech Connect

    Maniaci, D. C.; Li, Y.

    2012-04-01

    This paper describes a recent study to investigate the applicability of a horizontal-axis wind turbine (HAWT) structural dynamics and unsteady aerodynamics analysis program (FAST and AeroDyn respectively) to modeling the forces on marine hydrokinetic (MHK) turbines. It summarizes the added mass model that has been added to AeroDyn. The added mass model only includes flow acceleration perpendicular to the rotor disc, and ignores added mass forces caused by blade deflection. A model of the National Renewable Energy Laboratory's (NREL) Unsteady Aerodynamics Experiment (UAE) Phase VI wind turbine was analyzed using FAST and AeroDyn with sea water conditions and the new added mass model. The results of this analysis exhibited a 3.6% change in thrust for a rapid pitch case and a slight change in amplitude and phase of thrust for a case with 30 degrees of yaw.

  11. On the error probability of general tree and trellis codes with applications to sequential decoding

    NASA Technical Reports Server (NTRS)

    Johannesson, R.

    1973-01-01

    An upper bound on the average error probability for maximum-likelihood decoding of the ensemble of random binary tree codes is derived and shown to be independent of the length of the tree. An upper bound on the average error probability for maximum-likelihood decoding of the ensemble of random L-branch binary trellis codes of rate R = 1/n is derived which separates the effects of the tail length T and the memory length M of the code. It is shown that the bound is independent of the length L of the information sequence. This implication is investigated by computer simulations of sequential decoding utilizing the stack algorithm. These simulations confirm the implication and further suggest an empirical formula for the true undetected decoding error probability with sequential decoding.

  12. American Independence. Fifth Grade.

    ERIC Educational Resources Information Center

    Crosby, Annette

    This fifth grade teaching unit covers early conflicts between the American colonies and Britain, battles of the American Revolutionary War, and the Declaration of Independence. Knowledge goals address the pre-revolutionary acts enforced by the British, the concepts of conflict and independence, and the major events and significant people from the…

  13. Background-independence

    NASA Astrophysics Data System (ADS)

    Belot, Gordon

    2011-10-01

    Intuitively speaking, a classical field theory is background-independent if the structure required to make sense of its equations is itself subject to dynamical evolution, rather than being imposed ab initio. The aim of this paper is to provide an explication of this intuitive notion. Background-independence is not a formal property of theories: the question whether a theory is background-independent depends upon how the theory is interpreted. Under the approach proposed here, a theory is fully background-independent relative to an interpretation if each physical possibility corresponds to a distinct spacetime geometry; and it falls short of full background-independence to the extent that this condition fails.

  14. Production code control system for hydrodynamics simulations

    SciTech Connect

    Slone, D.M.

    1997-08-18

    We describe how the Production Code Control System (PCCS), written in Perl, has been used to control and monitor the execution of a large hydrodynamics simulation code in a production environment. We have been able to integrate new, disparate, and often independent, applications into the PCCS framework without the need to modify any of our existing application codes. Both users and code developers see a consistent interface to the simulation code and associated applications regardless of the physical platform, whether an MPP, SMP, server, or desktop workstation. We also describe our use of Perl to develop a configuration management system for the simulation code, as well as a code usage database and report generator. We used Perl to write a backplane that allows us to plug in preprocessors, the hydrocode, postprocessors, visualization tools, persistent storage requests, and other codes. We need only teach PCCS a minimal amount about any new tool or code to essentially plug it in and make it usable to the hydrocode. PCCS has made it easier to link together disparate codes, since using Perl has removed the need to learn the idiosyncrasies of system or RPC programming. The text handling in Perl makes it easy to teach PCCS about new codes, or changes to existing codes.

  15. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    PubMed

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in a temporal discrimination task, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, the multiple-coding and the single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. PMID:25894105
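
    The two hypotheses can be written as toy response rules (Python). The durations and comparison keys follow the design described above, while the nearest-duration matching and the 3.5-s boundary are illustrative assumptions.

        def multiple_coding(sample_s):
            """One rule per trained duration: 2 s -> red; 6 s, 18 s -> green."""
            rules = {2.0: "red", 6.0: "green", 18.0: "green"}
            nearest = min(rules, key=lambda d: abs(d - sample_s))
            return rules[nearest]

        def single_code_default(sample_s, boundary=3.5):
            """One rule for 2-s samples; the 'default' key for anything else."""
            return "red" if sample_s < boundary else "green"

        for s in (2.0, 3.5, 6.0, 18.0):
            print(s, multiple_coding(s), single_code_default(s))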

  16. Performance comparison of aperture codes for multimodal, multiplex spectroscopy.

    PubMed

    Wagadarikar, Ashwin A; Gehm, Michael E; Brady, David J

    2007-08-01

    We experimentally evaluate diverse static independent column codes in a coded aperture spectrometer. The performance of each code is evaluated based on the signal-to-noise ratio (SNR), defined as the ratio of the peak value in the spectrum to the standard deviation of the background noise, as a function of subpixel vertical misalignments. Among the code families tested, an S-matrix-based code produces spectral reconstructions with the highest SNR. The SNR is least sensitive to vertical subpixel misalignments on the detector with a Hadamard-matrix-based code. Finally, the increased sensitivity of a spectrometer using a coded aperture instead of a slit is demonstrated. PMID:17676097

  17. The Independence of Reduced Subgroup-State

    NASA Astrophysics Data System (ADS)

    Luo, Ming-Xing; Deng, Yun

    2014-09-01

    The quantum hidden subgroup problem, one of the most important problems in quantum computation, has been widely investigated. Our purpose in this paper is to prove the independence, or partial independence, of the reduced state derived from the quantum query with the oracle implementation. We prove, using the group representation, that if there is no bias on the implementation functions, the subgroup state is independent of the evaluation functions. This result is also used to improve the success probability of the quantum query.

  18. The investigation of tethered satellite system dynamics

    NASA Technical Reports Server (NTRS)

    Lorenzini, E. C.

    1986-01-01

    The analysis of the rotational dynamics of the satellite was focused on the rotational amplitude increase of the satellite, with respect to the tether, during retrieval. The dependence of the rotational amplitude upon the tether tension variation to the power 1/4 was thoroughly investigated. The damping of rotational oscillations achievable by reel control was also quantified, while an alternative solution that makes use of a lever arm attached with a universal joint to the satellite was proposed. Comparison simulations of retrieval maneuvers between the Smithsonian Astrophysical Observatory and the Martin Marietta (MMA) computer codes were also carried out. The agreement between the two completely independent codes was extremely close, demonstrating the reliability of the models. The slack tether dynamics during reel jams was analytically investigated in order to identify the limits of applicability of the SLACK3 computer code to this particular case. Test runs with SLACK3 were also carried out.

  19. An Efficient Variable Length Coding Scheme for an IID Source

    NASA Technical Reports Server (NTRS)

    Cheung, K. -M.

    1995-01-01

    A scheme is examined for using two alternating Huffman codes to encode a discrete independent and identically distributed source with a dominant symbol. This combined strategy, or alternating runlength Huffman (ARH) coding, was found to be more efficient than ordinary Huffman coding in certain circumstances.
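
    A small sketch (Python) of the run-length/Huffman idea: the stream is first rewritten as alternating (run length, literal symbol) pairs around the dominant symbol, and each of the two resulting streams gets its own Huffman code. The alphabet, sample stream, and final-run handling are illustrative simplifications, not the exact ARH scheme.

        import heapq
        from collections import Counter

        def huffman_code(symbols):
            """Build a {symbol: bitstring} Huffman code for an iterable."""
            counts = Counter(symbols)
            if len(counts) == 1:                      # degenerate source
                return {next(iter(counts)): "0"}
            heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(counts.items())]
            heapq.heapify(heap)
            nxt = len(heap)
            while len(heap) > 1:
                w1, _, c1 = heapq.heappop(heap)
                w2, _, c2 = heapq.heappop(heap)
                merged = {s: "0" + b for s, b in c1.items()}
                merged.update({s: "1" + b for s, b in c2.items()})
                heapq.heappush(heap, (w1 + w2, nxt, merged))
                nxt += 1
            return heap[0][2]

        def to_runs(stream, dominant="a"):
            """Alternating (run length, literal) pairs around the dominant symbol."""
            runs, lits, n = [], [], 0
            for s in stream:
                if s == dominant:
                    n += 1
                else:
                    runs.append(n)
                    lits.append(s)
                    n = 0
            return runs, lits

        stream = "aaabaaaacaabaaaabaaaaaacab"      # dominant symbol 'a' (assumed)
        runs, lits = to_runs(stream)
        run_code, lit_code = huffman_code(runs), huffman_code(lits)
        bits = "".join(run_code[r] + lit_code[s] for r, s in zip(runs, lits))
        print(len(bits), "coded bits for", len(stream), "source symbols")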

  20. Homological stabilizer codes

    SciTech Connect

    Anderson, Jonas T.

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. Highlights: (1) We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. (2) We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. (3) We find and classify all 2D homological stabilizer codes. (4) We find optimal codes among the homological stabilizer codes.

  1. Media independent interface

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The work done on the Media Independent Interface (MII) Interface Control Document (ICD) program is described, and recommendations based on it are made. Explanations and rationale for the content of the ICD itself are presented.

  2. Data Machine Independence

    Energy Science and Technology Software Center (ESTSC)

    1994-12-30

    Data-machine independence achieved by using four technologies (ASN.1, XDR, SDS, and ZEBRA) has been evaluated by encoding two different applications in each of the above and comparing the results against the standard programming method using C.

  3. Energy efficient rateless codes for high speed data transfer over free space optical channels

    NASA Astrophysics Data System (ADS)

    Prakash, Geetha; Kulkarni, Muralidhar; Acharya, U. S.

    2015-03-01

    Terrestrial Free Space Optical (FSO) links transmit information by using the atmosphere (free space) as a medium. In this paper, we have investigated the use of Luby Transform (LT) codes as a means to mitigate the effects of data corruption induced by imperfect channel which usually takes the form of lost or corrupted packets. LT codes, which are a class of Fountain codes, can be used independent of the channel rate and as many code words as required can be generated to recover all the message bits irrespective of the channel performance. Achieving error free high data rates with limited energy resources is possible with FSO systems if error correction codes with minimal overheads on the power can be used. We also employ a combination of Binary Phase Shift Keying (BPSK) with provision for modification of threshold and optimized LT codes with belief propagation for decoding. These techniques provide additional protection even under strong turbulence regimes. Automatic Repeat Request (ARQ) is another method of improving link reliability. Performance of ARQ is limited by the number of retransmissions and the corresponding time delay. We prove through theoretical computations and simulations that LT codes consume less energy per bit. We validate the feasibility of using energy efficient LT codes over ARQ for FSO links to be used in optical wireless sensor networks within the eye safety limits.
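
    As a sketch of the rateless idea described above (Python): each LT output packet is the XOR of a randomly chosen subset of message blocks, so the transmitter can keep generating packets independently of the channel state until the receiver has decoded. The degree distribution below is a crude stand-in for the robust soliton distribution used in practice.

        import random

        def lt_encode(blocks, n_packets, seed=1):
            """Return (neighbor indices, XOR value) LT packets."""
            rng = random.Random(seed)
            k = len(blocks)
            packets = []
            for _ in range(n_packets):
                degree = rng.choice([1, 2, 2, 3, 4])       # toy degree distribution
                neighbors = rng.sample(range(k), min(degree, k))
                value = 0
                for i in neighbors:
                    value ^= blocks[i]                     # XOR the chosen blocks
                packets.append((neighbors, value))
            return packets

        message = [0x3A, 0x7F, 0x01, 0xC4]                 # four data blocks (assumed)
        for nbrs, val in lt_encode(message, 6):
            print(sorted(nbrs), hex(val))

    A belief-propagation (peeling) decoder then repeatedly resolves degree-1 packets and XORs the recovered blocks out of the remaining ones, which is what keeps the decoding overhead small.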

  4. Exceptional error minimization in putative primordial genetic codes

    PubMed Central

    2009-01-01

    Background: The standard genetic code is redundant and has a highly non-random structure. Codons for the same amino acids typically differ only by the nucleotide in the third position, whereas similar amino acids are encoded, mostly, by codon series that differ by a single base substitution in the third or the first position. As a result, the code is highly albeit not optimally robust to errors of translation, a property that has been interpreted either as a product of selection directed at the minimization of errors or as a non-adaptive by-product of evolution of the code driven by other forces. Results: We investigated the error-minimization properties of putative primordial codes that consisted of 16 supercodons, with the third base being completely redundant, using a previously derived cost function and the error minimization percentage as the measure of a code's robustness to mistranslation. It is shown that, when the 16-supercodon table is populated with 10 putative primordial amino acids, inferred from the results of abiotic synthesis experiments and other evidence independent of the code's evolution, and with minimal assumptions used to assign the remaining supercodons, the resulting 2-letter codes are nearly optimal in terms of the error minimization level. Conclusion: The results of the computational experiments with putative primordial genetic codes that contained only two meaningful letters in all codons and encoded 10 to 16 amino acids indicate that such codes are likely to have been nearly optimal with respect to the minimization of translation errors. This near-optimality could be the outcome of extensive early selection during the co-evolution of the code with the primordial, error-prone translation system, or a result of a unique, accidental event. Under this hypothesis, the subsequent expansion of the code resulted in a decrease of the error minimization level that became sustainable owing to the evolution of a high-fidelity translation system. Reviewers: This article was reviewed by Paul Higgs (nominated by Arcady Mushegian), Rob Knight, and Sandor Pongor. For the complete reports, go to the Reviewers' Reports section. PMID:19925661
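
    A toy version (Python) of the kind of cost function such studies use: the cost of a code is the average squared difference in an amino acid property between codons that differ by a single base. The 2-letter codons and property values below are placeholders, not the paper's inferred primordial assignments.

        from itertools import product

        BASES = "UCAG"
        # 16 supercodons (third base redundant); placeholder property values
        props = [5.0, 5.5, 7.0, 8.0, 9.0, 10.1, 4.8, 13.0]
        code = {}
        for i, codon in enumerate(sorted("".join(p) for p in product(BASES, repeat=2))):
            code[codon] = props[i // 2]        # neighboring codons share an assignment

        def mistranslation_cost(code):
            total, pairs = 0.0, 0
            for c1 in code:
                for pos in range(2):           # all single-base substitutions
                    for b in BASES:
                        c2 = c1[:pos] + b + c1[pos + 1:]
                        if c2 != c1:
                            total += (code[c1] - code[c2]) ** 2
                            pairs += 1
            return total / pairs

        print("mean squared mistranslation cost: %.2f" % mistranslation_cost(code))

    The error minimization percentage then ranks a code's cost against the distribution of costs over random shuffles of the same assignments.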

  5. Non-White, No More: Effect Coding as an Alternative to Dummy Coding with Implications for Higher Education Researchers

    ERIC Educational Resources Information Center

    Mayhew, Matthew J.; Simonoff, Jeffrey S.

    2015-01-01

    The purpose of this article is to describe effect coding as an alternative quantitative practice for analyzing and interpreting categorical, race-based independent variables in higher education research. Unlike indicator (dummy) codes that imply that one group will be a reference group, effect codes use average responses as a means for…

  7. The Proteomic Code: a molecular recognition code for proteins

    PubMed Central

    Biro, Jan C

    2007-01-01

    Background: The Proteomic Code is a set of rules by which information in genetic material is transferred into the physico-chemical properties of amino acids. It determines how individual amino acids interact with each other during folding and in specific protein-protein interactions. The Proteomic Code is part of the redundant Genetic Code. Review: The 25-year-old history of this concept is reviewed from the first independent suggestions by Biro and Mekler, through the works of Blalock, Root-Bernstein, Siemion, Miller and others, followed by the discovery of a Common Periodic Table of Codons and Nucleic Acids in 2003 and culminating in the recent conceptualization of partial complementary coding of interacting amino acids as well as the theory of the nucleic acid-assisted protein folding. Methods and conclusions: A novel cloning method for the design and production of specific, high-affinity-reacting proteins (SHARP) is presented. This method is based on the concept of proteomic codes and is suitable for large-scale, industrial production of specifically interacting peptides. PMID:17999762

  8. Independent Replication and Meta-Analysis for Endometriosis Risk Loci.

    PubMed

    Sapkota, Yadav; Fassbender, Amelie; Bowdler, Lisa; Fung, Jenny N; Peterse, Daniëlle; O, Dorien; Montgomery, Grant W; Nyholt, Dale R; D'Hooghe, Thomas M

    2015-10-01

    Endometriosis is a complex disease that affects 6-10% of women in their reproductive years and 20-50% of women with infertility. Genome-wide and candidate-gene association studies for endometriosis have identified 10 independent risk loci, and of these, nine (rs7521902, rs13394619, rs4141819, rs6542095, rs1519761, rs7739264, rs12700667, rs1537377, and rs10859871) are polymorphic in European populations. Here we investigate the replication of nine SNP loci in 998 laparoscopically and histologically confirmed endometriosis cases and 783 disease-free controls from Belgium. SNPs rs7521902, rs13394619, and rs6542095 show nominally significant (p < .05) associations with endometriosis, while the directions of effect for seven SNPs are consistent with the original reports. Association of rs6542095 at the IL1A locus with 'All' (p = .066) and 'Grade_B' (p = .01) endometriosis is noteworthy because this is the first successful replication in an independent population. Meta-analysis with the published results yields genome-wide significant evidence for rs7521902, rs13394619, rs6542095, rs12700667, rs7739264, and rs1537377. Notably, three coding variants in GREB1 (near rs13394619) and CDKN2B-AS1 (near rs1537377) also showed nominally significant associations with endometriosis. Overall, this study provides important replication in a uniquely characterized independent population, and indicates that the majority of the original genome-wide association findings are not due to chance alone. PMID:26337243

  9. Dual-code quantum computation model

    NASA Astrophysics Data System (ADS)

    Choi, Byung-Soo

    2015-08-01

    In this work, we propose the dual-code quantum computation model, a fault-tolerant quantum computation scheme which alternates between two different quantum error-correction codes. Since the chosen two codes have different sets of transversal gates, we can implement a universal set of gates transversally, thereby reducing the overall cost. We use code teleportation to convert between quantum states in different codes. The overall cost is decreased if code teleportation requires fewer resources than the fault-tolerant implementation of the non-transversal gate in a specific code. To analyze the cost reduction, we investigate two cases with different base codes, namely the Steane and Bacon-Shor codes. For the Steane code, neither the proposed dual-code model nor another variation of it achieves any cost reduction since the conventional approach is simple. For the Bacon-Shor code, the three proposed variations of the dual-code model reduce the overall cost. However, as the encoding level increases, the cost reduction decreases and becomes negative. Therefore, the proposed dual-code model is advantageous only when the encoding level is low and the cost of the non-transversal gate is relatively high.

  10. Pulsed Inductive Thruster (PIT): Modeling and Validation Using the MACH2 Code

    NASA Technical Reports Server (NTRS)

    Schneider, Steven (Technical Monitor); Mikellides, Pavlos G.

    2003-01-01

    Numerical modeling of the Pulsed Inductive Thruster exercising the magnetohydrodynamics code MACH2 aims to provide bilateral validation of the thruster's measured performance and of the code's capability to capture the pertinent physical processes. Computed impulse values for helium and argon propellants demonstrate excellent correlation to the experimental data for a range of energy levels and propellant-mass values. The effects of the vacuum tank wall and mass-injection scheme were investigated to show trivial changes in the overall performance. An idealized model for these energy levels and propellants deduces that the energy expended to the internal energy modes and plasma dissipation processes is independent of the propellant type, mass, and energy level.

  11. Correlated algebraic-geometric codes

    NASA Astrophysics Data System (ADS)

    Guruswami, Venkatesan; Patthak, Anindya C.

    2008-03-01

    We define a new family of error-correcting codes based on algebraic curves over finite fields, and develop efficient list decoding algorithms for them. Our codes extend the class of algebraic-geometric (AG) codes via a (nonobvious) generalization of the approach in the recent breakthrough work of Parvaresh and Vardy (2005). Our work shows that the PV framework applies to fairly general settings by elucidating the key algebraic concepts underlying it. Also, more importantly, AG codes of arbitrary block length exist over fixed alphabets Σ, thus enabling us to establish new trade-offs between the list decoding radius and rate over a bounded alphabet size. The work of Parvaresh and Vardy (2005) was extended in Guruswami and Rudra (2006) to give explicit codes that achieve the list decoding capacity (optimal trade-off between rate and fraction of errors corrected) over large alphabets. A similar extension of this work along the lines of Guruswami and Rudra could have substantial impact. Indeed, it could give better trade-offs than currently known over a fixed alphabet (say, GF(2^12)), which in turn, upon concatenation with a fixed, well-understood binary code, could take us closer to the list decoding capacity for binary codes. This may also be a promising way to address the significant complexity drawback of the result of Guruswami and Rudra, and to enable approaching capacity with bounded list size independent of the block length (the list size and decoding complexity in their work are both n^(Ω(1/ε)), where ε is the distance to capacity). Similar to algorithms for AG codes from Guruswami and Sudan (1999) and (2001), our encoding/decoding algorithms run in polynomial time assuming a natural polynomial-size representation of the code. For codes based on a specific "optimal" algebraic curve, we also present an expected polynomial time algorithm to construct the requisite representation. This in turn fills an important void in the literature by presenting an efficient construction of the representation often assumed in the list decoding algorithms for AG codes.

  12. Minimizing correlation effect using zero cross correlation code in spectral amplitude coding optical code division multiple access

    NASA Astrophysics Data System (ADS)

    Safar, Anuar Mat; Aljunid, Syed Alwee; Arief, Amir Razif; Nordin, Junita; Saad, Naufal

    2012-01-01

    The use of minimal multiple access interference (MAI) in code design is investigated. Applying projection and mapping techniques, a code that has a zero cross correlation (ZCC) between users in optical code division multiple access (OCDMA) is presented in this paper. The system is based on an incoherent light source (LED), spectral amplitude coding (SAC), and direct detection techniques at the receiver. Using the power spectral density (PSD) function and a Gaussian approximation, we obtain the signal-to-noise ratio (SNR) and the bit-error rate (BER) to measure the code performance. Making a comparison with other existing codes, e.g., Hadamard, MFH and MDW codes, we show that our code performs better at a BER of 10^-9 in terms of the number of simultaneous users. We also demonstrate the comparison between the theoretical and simulation analyses, where the results are close to one another.
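
    The defining property is easy to check numerically (Python/NumPy): with a zero cross correlation code, no two users' code words share a chip, so every off-diagonal entry of the correlation matrix is zero. The 3-user, 6-chip matrix below is a toy example, not the mapping construction from the paper.

        import numpy as np

        zcc = np.array([
            [1, 1, 0, 0, 0, 0],    # user 1
            [0, 0, 1, 1, 0, 0],    # user 2
            [0, 0, 0, 0, 1, 1],    # user 3
        ])

        gram = zcc @ zcc.T         # entry (i, j) = correlation of users i and j
        print(gram)                # zero off-diagonal => no MAI between users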

  13. AEST: Adaptive Eigenvalue Stability Code

    NASA Astrophysics Data System (ADS)

    Zheng, L.-J.; Kotschenreuther, M.; Waelbroeck, F.; van Dam, J. W.; Berk, H.

    2002-11-01

    An adaptive eigenvalue linear stability code is developed. The aim is, on one hand, to include the non-ideal MHD effects in the global MHD stability calculation for both low and high n modes and, on the other hand, to resolve the numerical difficulty involving the MHD singularity on the rational surfaces at marginal stability. Our code follows parts of the philosophy of DCON by abandoning relaxation methods based on radial finite element expansion in favor of an efficient shooting procedure with adaptive gridding. The δW criterion is replaced by the shooting procedure and a subsequent matrix eigenvalue problem. Since the technique of expanding a general solution into a summation of independent solutions is employed, the rank of the matrices involved is only a few hundred. This makes it easier to solve the eigenvalue problem with non-ideal MHD effects, such as FLR or even full kinetic effects, as well as the plasma rotation effect, taken into account. To include kinetic effects, the approach of solving for the distribution function as a local eigenvalue ω problem, as in the GS2 code, will be employed in the future. Comparison of the ideal MHD version of the code with DCON, PEST, and GATO will be discussed. The non-ideal MHD version of the code will be employed to study, as an application, the transport barrier physics in tokamak discharges.

  14. Independent NOAA considered

    NASA Astrophysics Data System (ADS)

    Richman, Barbara T.

    A proposal to pull the National Oceanic and Atmospheric Administration (NOAA) out of the Department of Commerce and make it an independent agency was the subject of a recent congressional hearing. Supporters within the science community and in Congress said that an independent NOAA would benefit by being more visible and by not being tied to a cabinet-level department whose main concerns lie elsewhere. The proposal's critics, however, cautioned that making NOAA independent could make it even more vulnerable to the budget axe and would sever the agency's direct access to the President. The separation of NOAA from Commerce was contained in a June 1 proposal by President Ronald Reagan that also called for all federal trade functions under the Department of Commerce to be reorganized into a new Department of International Trade and Industry (DITI).

  15. Independent technical review, handbook

    SciTech Connect

    Not Available

    1994-02-01

    Purpose: Provide an independent engineering review of the major projects being funded by the Department of Energy, Office of Environmental Restoration and Waste Management. The independent engineering review will address questions of whether the engineering practice is sufficiently developed to a point where a major project can be executed without significant technical problems. The independent review will focus on questions related to: (1) Adequacy of development of the technical base of understanding; (2) Status of development and availability of technology among the various alternatives; (3) Status and availability of the industrial infrastructure to support project design, equipment fabrication, facility construction, and process and program/project operation; (4) Adequacy of the design effort to provide a sound foundation to support execution of the project; (5) Ability of the organization to fully integrate the system, and direct, manage, and control the execution of a complex major project.

  16. V(D)J recombination coding junction formation without DNA homology: processing of coding termini.

    PubMed Central

    Boubnov, N V; Wills, Z P; Weaver, D T

    1993-01-01

    Coding junction formation in V(D)J recombination generates diversity in the antigen recognition structures of immunoglobulin and T-cell receptor molecules by combining processes of deletion of terminal coding sequences and addition of nucleotides prior to joining. We have examined the role of coding end DNA composition in junction formation with plasmid substrates containing defined homopolymers flanking the recombination signal sequence elements. We found that coding junctions formed efficiently with or without terminal DNA homology. The extent of junctional deletion was conserved independent of coding ends with increased, partial, or no DNA homology. Interestingly, G/C homopolymer coding ends showed reduced deletion regardless of DNA homology. Therefore, DNA homology cannot be the primary determinant that stabilizes coding end structures for processing and joining. PMID:8413286

  17. Bitplane Image Coding With Parallel Coefficient Processing.

    PubMed

    Auli-Llinas, Francesc; Enfedaque, Pablo; Moure, Juan C; Sanchez, Victor

    2016-01-01

    Image coding systems have been traditionally tailored for multiple instruction, multiple data (MIMD) computing. In general, they partition the (transformed) image into codeblocks that can be coded in the cores of MIMD-based processors. Each core executes a sequential flow of instructions to process the coefficients in the codeblock, independently and asynchronously from the other cores. Bitplane coding is a common strategy to code such data. Most of its mechanisms require sequential processing of the coefficients. Recent years have seen the rise of processing accelerators with enhanced computational performance and power efficiency whose architecture is mainly based on the single instruction, multiple data (SIMD) principle. SIMD computing refers to the execution of the same instruction on multiple data in a lockstep synchronous way. Unfortunately, current bitplane coding strategies cannot fully profit from such processors due to the inherently sequential coding task. This paper presents bitplane image coding with parallel coefficient (BPC-PaCo) processing, a coding method that can process many coefficients within a codeblock in parallel and synchronously. To this end, the scanning order, the context formation, the probability model, and the arithmetic coder of the coding engine have been re-formulated. The experimental results suggest that the penalization in coding performance of BPC-PaCo with respect to the traditional strategies is almost negligible. PMID:26441420

  18. An introduction to QR Codes: linking libraries and mobile patrons.

    PubMed

    Hoy, Matthew B

    2011-01-01

    QR codes, or "Quick Response" codes, are two-dimensional barcodes that can be scanned by mobile smartphone cameras. These codes can be used to provide fast access to URLs, telephone numbers, and short passages of text. With the rapid adoption of smartphones, librarians are able to use QR codes to promote services and help library users find materials quickly and independently. This article will explain what QR codes are, discuss how they can be used in the library, and describe issues surrounding their use. A list of resources for generating and scanning QR codes is also provided. PMID:21800986
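
    For illustration, generating such a code takes only a couple of lines with the open-source Python package qrcode (one of many generators of the kind the article's resource list covers); the URL below is a placeholder.

        import qrcode

        img = qrcode.make("https://library.example.org/record/12345")
        img.save("record-12345.png")    # scannable with any smartphone QR reader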

  19. Accumulate repeat accumulate codes

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative channel coding scheme called 'Accumulate Repeat Accumulate' (ARA) codes. This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, thus belief propagation can be used for iterative decoding of ARA codes on a graph. The structure of the encoder for this class can be viewed as a precoded Repeat Accumulate (RA) code or as a precoded Irregular Repeat Accumulate (IRA) code, where simply an accumulator is chosen as the precoder. Thus ARA codes have a simple and very fast encoder structure when they represent LDPC codes. Based on density evolution for LDPC codes, we show through some examples of ARA codes that, for maximum variable node degree 5, a minimum bit SNR as low as 0.08 dB from channel capacity can be achieved for rate 1/2 as the block size goes to infinity. Thus, based on a fixed low maximum variable node degree, its threshold outperforms not only the RA and IRA codes but also the best known LDPC codes with the same maximum node degree. Furthermore, by puncturing the accumulators, any desired high-rate codes close to code rate 1 can be obtained with thresholds that stay uniformly close to the channel capacity thresholds. Iterative decoding simulation results are provided. The ARA codes also have a projected graph or protograph representation that allows for high-speed decoder implementation.

  20. Touchstones of Independence.

    ERIC Educational Resources Information Center

    Roha, Thomas Arden

    1999-01-01

    Foundations affiliated with public higher education institutions can avoid having to open records for public scrutiny, by having independent boards of directors, occupying leased office space or paying market value for university space, using only foundation personnel, retaining legal counsel, being forthcoming with information and use of public…

  1. Independent Living Course

    ERIC Educational Resources Information Center

    Tipping, Joyce

    1978-01-01

    Designed to help handicapped persons who have been living a sheltered existence develop independent living skills, this course is divided into two parts. The first part consists of a five-day apartment live-in experience, and the second concentrates on developing the learners' awareness of community resources and consumer skills. (BM)

  2. Caring about Independent Lives

    ERIC Educational Resources Information Center

    Christensen, Karen

    2010-01-01

    With the rhetoric of independence, new cash for care systems were introduced in many developed welfare states at the end of the 20th century. These systems allow local authorities to pay people who are eligible for community care services directly, to enable them to employ their own careworkers. Despite the obvious importance of the careworker's…

  3. Independence, Disengagement, and Discipline

    ERIC Educational Resources Information Center

    Rubin, Ron

    2012-01-01

    School disengagement is linked to a lack of opportunities for students to fulfill their needs for independence and self-determination. Young people have little say about what, when, where, and how they will learn, the criteria used to assess their success, and the content of school and classroom rules. Traditional behavior management discourages…

  4. Independent Human Studies.

    ERIC Educational Resources Information Center

    Kaplan, Suzanne; Wilson, Gordon

    1978-01-01

    The Independent Human Studies program at Schoolcraft College offers an alternative method of earning academic credits. Students delineate an area of study, pose research questions, gather resources, synthesize the information, state the thesis, choose the method of presentation, set schedules, and take responsibility for meeting deadlines. (MB)

  5. Concatenated Coding Using Trellis-Coded Modulation

    NASA Technical Reports Server (NTRS)

    Thompson, Michael W.

    1997-01-01

    In the late seventies and early eighties a technique known as Trellis Coded Modulation (TCM) was developed for providing spectrally efficient error correction coding. Instead of adding redundant information in the form of parity bits, redundancy is added at the modulation stage, thereby increasing bandwidth efficiency. A digital communications system can be designed to use bandwidth-efficient multilevel/phase modulation such as Amplitude Shift Keying (ASK), Phase Shift Keying (PSK), Differential Phase Shift Keying (DPSK) or Quadrature Amplitude Modulation (QAM). Performance gain can be achieved by increasing the number of signals over the corresponding uncoded system to compensate for the redundancy introduced by the code. A considerable amount of research and development has been devoted toward developing good TCM codes for severely bandlimited applications. More recently, the use of TCM for satellite and deep space communications applications has received increased attention. This report describes the general approach of using a concatenated coding scheme that features TCM and Reed-Solomon (RS) coding. Results have indicated that substantial (6-10 dB) performance gains can be achieved with this approach with comparatively little bandwidth expansion. Since all of the bandwidth expansion is due to the RS code, we see that TCM-based concatenated coding results in roughly 10-50% bandwidth expansion, compared to 70-150% expansion for a similar concatenated scheme that uses a convolutional code. We stress that combined coding and modulation optimization is important for achieving performance gains while maintaining spectral efficiency.

  6. Coset Codes Viewed as Terminated Convolutional Codes

    NASA Technical Reports Server (NTRS)

    Fossorier, Marc P. C.; Lin, Shu

    1996-01-01

    In this paper, coset codes are considered as terminated convolutional codes. Based on this approach, three new general results are presented. First, it is shown that the iterative squaring construction can equivalently be defined from a convolutional code whose trellis terminates. This convolutional code determines a simple encoder for the coset code considered, and the state and branch labelings of the associated trellis diagram become straightforward. Also, from the generator matrix of the code in its convolutional code form, much information about the trade-off between the state connectivity and complexity at each section, and the parallel structure of the trellis, is directly available. Based on this generator matrix, it is shown that the parallel branches in the trellis diagram of the convolutional code represent the same coset code C1, of smaller dimension and shorter length. Utilizing this fact, a two-stage optimum trellis decoding method is devised. The first stage decodes C1, while the second stage decodes the associated convolutional code, using the branch metrics delivered by stage 1. Finally, a bidirectional decoding of each received block starting at both ends is presented. If about the same number of computations is required, this approach remains very attractive from a practical point of view as it roughly doubles the decoding speed. This fact is particularly interesting whenever the second half of the trellis is the mirror image of the first half, since the same decoder can be implemented for both parts.

  7. Efficient codes and balanced networks.

    PubMed

    Denève, Sophie; Machens, Christian K

    2016-02-23

    Recent years have seen a growing interest in inhibitory interneurons and their circuits. A striking property of cortical inhibition is how tightly it balances excitation. Inhibitory currents not only match excitatory currents on average, but track them on a millisecond time scale, whether they are caused by external stimuli or spontaneous fluctuations. We review, together with experimental evidence, recent theoretical approaches that investigate the advantages of such tight balance for coding and computation. These studies suggest a possible revision of the dominant view that neurons represent information with firing rates corrupted by Poisson noise. Instead, tight excitatory/inhibitory balance may be a signature of a highly cooperative code, orders of magnitude more precise than a Poisson rate code. Moreover, tight balance may provide a template that allows cortical neurons to construct high-dimensional population codes and learn complex functions of their inputs. PMID:26906504

  8. Molecular cloning of canine co-chaperone small glutamine-rich tetratricopeptide repeat-containing protein α (SGTA) and investigation of its ability to suppress androgen receptor signalling in androgen-independent prostate cancer.

    PubMed

    Kato, Yuiko; Ochiai, Kazuhiko; Michishita, Masaki; Azakami, Daigo; Nakahira, Rei; Morimatsu, Masami; Ishiguro-Oonuma, Toshina; Yoshikawa, Yasunaga; Kobayashi, Masato; Bonkobara, Makoto; Kobayashi, Masanori; Takahashi, Kimimasa; Watanabe, Masami; Omi, Toshinori

    2015-11-01

    Although the morbidity of canine prostate cancer is low, the majority of cases present with resistance to androgen therapy and poor clinical outcomes. These pathological conditions are similar to the signs of the terminal stage of human androgen-independent prostate cancer. The co-chaperone small glutamine-rich tetratricopeptide repeat-containing protein α (SGTA) is known to be overexpressed in human androgen-independent prostate cancer. However, there is little information about the structure and function of canine SGTA. In this study, canine SGTA was cloned and analysed for its ability to suppress androgen receptor signalling. The full-length open reading frame (ORF) of the canine SGTA gene was amplified by RT-PCR using primers designed from canine-expressed sequence tags that were homologous to human SGTA. The canine SGTA ORF has high homology with the corresponding human (89%) and mouse (81%) sequences. SGTA dimerisation region and tetratricopeptide repeat (TPR) domains are conserved across the three species. The ability of canine SGTA to undergo homodimerisation was demonstrated by a mammalian two-hybrid system and a pull-down assay. The negative impact of canine SGTA on androgen receptor (AR) signalling was demonstrated using a reporter assay in androgen-independent human prostate cancer cell lines. Pathological analysis showed overexpression of SGTA in canine prostate cancer, but not in hyperplasia. A reporter assay in prostate cells demonstrated suppression of AR signalling by canine SGTA. Altogether, these results suggest that canine SGTA may play an important role in the acquisition of androgen independence by canine prostate cancer cells. PMID:26346258

  9. Medical imaging with coded apertures

    SciTech Connect

    Keto, E.; Libby, S.

    1995-06-16

    New algorithms were investigated for image reconstruction in emission tomography which could incorporate complex instrumental effects such as might be obtained with a coded aperture system. The investigation focused on possible uses of the wavelet transform to handle non-stationary instrumental effects and analytic continuation of the Radon transform to handle self-absorption. Neither investigation was completed during the funding period, and whether such algorithms will be useful remains an open question.

  10. Transionospheric Propagation Code (TIPC)

    SciTech Connect

    Roussel-Dupre, R.; Kelley, T.A.

    1990-10-01

    The Transionospheric Propagation Code is a computer program developed at Los Alamos National Laboratory to perform certain tasks related to the detection of VHF signals following propagation through the ionosphere. The code is written in Fortran 77, runs interactively, and was designed to be as machine independent as possible. A menu format, in which the user is prompted to supply appropriate parameters for a given task, has been adopted for the input, while the output is primarily in the form of graphics. The user has the option of selecting from five basic tasks, namely transionospheric propagation, signal filtering, signal processing, DTOA study, and DTOA uncertainty study. For the first task a specified signal is convolved against the impulse response function of the ionosphere to obtain the transionospheric signal. The user is given a choice of four analytic forms for the input pulse or of supplying a tabular form. The option of adding Gaussian-distributed white noise or spectral noise to the input signal is also provided. The deterministic ionosphere is characterized to first order in terms of a total electron content (TEC) along the propagation path. In addition, a scattering model parameterized in terms of a frequency coherence bandwidth is also available. In the second task, detection is simulated by convolving a given filter response against the transionospheric signal. The user is given a choice of a wideband filter or a narrowband Gaussian filter. It is also possible to input a filter response. The third task provides for quadrature detection, envelope detection, and three different techniques for time-tagging the arrival of the transionospheric signal at specified receivers. The latter algorithms can be used to determine a TEC and thus take out the effects of the ionosphere to first order. Task four allows the user to construct a table of delta-times-of-arrival (DTOAs) vs TECs for a specified pair of receivers.
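
    The first two tasks amount to convolutions, as the following sketch (Python/NumPy) shows; the pulse shape, toy ionospheric impulse response, and Gaussian filter parameters are illustrative assumptions, not TIPC's models.

        import numpy as np

        t = np.linspace(0.0, 1e-3, 2000)                        # 1 ms window
        pulse = np.exp(-((t - 1e-4) / 2e-5) ** 2)               # analytic input pulse (assumed)
        iono_ir = np.exp(-t / 1e-4) * np.cos(2e5 * np.pi * t)   # toy impulse response
        received = np.convolve(pulse, iono_ir, mode="full")[: t.size]

        filt = np.exp(-((t - 5e-5) / 1e-5) ** 2)                # narrowband Gaussian filter
        detected = np.convolve(received, filt, mode="full")[: t.size]
        print("crude time tag:", t[np.argmax(np.abs(detected))])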

  11. Discussion on LDPC Codes and Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  12. Manually operated coded switch

    DOEpatents

    Barnette, Jon H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made.

  13. Cyclic unequal error protection codes constructed from cyclic codes of composite length

    NASA Technical Reports Server (NTRS)

    Lin, Mao-Chao; Lin, Shu

    1988-01-01

    The unequal error correction capabilities of binary cyclic codes of composite length are investigated. Under certain conditions, direct sums of concatenated codes have unequal error correction capabilities. By a modified Hartmann and Tzeng algorithm, it is shown that a binary cyclic code of composite length is equivalent to the direct sum of concatenated codes. With this, some binary cyclic unequal error protection (UEP) codes are constructed. Finally, two-level UEP cyclic direct-sum codes are presented which provide error correction capabilities higher than those guaranteed by the Blokh-Zyablov constructions.

  14. Code-Switching: L1-Coded Mediation in a Kindergarten Foreign Language Classroom

    ERIC Educational Resources Information Center

    Lin, Zheng

    2012-01-01

    This paper is based on a qualitative inquiry that investigated the role of teachers' mediation in three different modes of coding in a kindergarten foreign language classroom in China (i.e. L2-coded intralinguistic mediation, L1-coded cross-lingual mediation, and L2-and-L1-mixed mediation). Through an exploratory examination of the varying effects…

  15. QR Codes 101

    ERIC Educational Resources Information Center

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  16. ARA type protograph codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)

    2008-01-01

    An apparatus and method for encoding low-density parity check codes. Together with a repeater, an interleaver and an accumulator, the apparatus comprises a precoder, thus forming accumulate-repeat-accumulate (ARA) codes. Protographs representing various types of ARA codes, including AR3A, AR4A and ARJA codes, are described. High performance is obtained when compared to the performance of current repeat-accumulate (RA) or irregular-repeat-accumulate (IRA) codes.
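
    A minimal sketch (Python/NumPy) of that encoder chain, i.e. precode (accumulate), repeat, interleave, accumulate. The block length, repetition factor, and random permutation are illustrative assumptions; real designs fix the interleaver and puncture the accumulators to set the rate.

        import numpy as np

        rng = np.random.default_rng(0)

        def accumulate(bits):
            """Running XOR (mod-2 prefix sum): the 1/(1+D) accumulator."""
            return np.cumsum(bits) % 2

        def ara_encode(info_bits, repeat=3):
            pre = accumulate(info_bits)             # precoder (accumulator)
            rep = np.repeat(pre, repeat)            # repeater
            perm = rng.permutation(rep.size)        # interleaver (fixed in practice)
            return accumulate(rep[perm])            # outer accumulator

        info = rng.integers(0, 2, size=8)
        print(info, "->", ara_encode(info))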

  17. Groundwater flow code verification "benchmarking" activity (COVE-2A): Analysis of participants' work

    SciTech Connect

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project.

  18. Certifying Auto-Generated Flight Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen

    2008-01-01

    Model-based design and automated code generation are being used increasingly at NASA. Many NASA projects now use MathWorks Simulink and Real-Time Workshop for at least some of their modeling and code development. However, there are substantial obstacles to more widespread adoption of code generators in safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. Moreover, the regeneration of code can require complete recertification, which offsets many of the advantages of using a generator. Indeed, manual review of autocode can be more challenging than for hand-written code. Since the direct V&V of code generators is too laborious and complicated due to their complex (and often proprietary) nature, we have developed a generator plug-in to support the certification of the auto-generated code. Specifically, the AutoCert tool supports certification by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews. The generated documentation also contains substantial tracing information, allowing users to trace between model, code, documentation, and V&V artifacts. This enables missions to obtain assurance about the safety and reliability of the code without excessive manual V&V effort and, as a consequence, eases the acceptance of code generators in safety-critical contexts. The generation of explicit certificates and textual reports is particularly well-suited to supporting independent V&V. The primary contribution of this approach is the combination of human-friendly documentation with formal analysis. The key technical idea is to exploit the idiomatic nature of auto-generated code in order to automatically infer logical annotations. The annotation inference algorithm itself is generic, and parametrized with respect to a library of coding patterns that depend on the safety policies and the code generator. The patterns characterize the notions of definitions and uses that are specific to the given safety property. For example, for initialization safety, definitions correspond to variable initializations while uses are statements which read a variable, whereas for array bounds safety, definitions are the array declarations, while uses are statements which access an array variable. The inferred annotations are thus highly dependent on the actual program and the properties being proven. The annotations, themselves, need not be trusted, but are crucial to obtain the automatic formal verification of the safety properties without requiring access to the internals of the code generator. The approach has been applied to both in-house and commercial code generators, but is independent of the particular generator used. It is currently being adapted to flight code generated using MathWorks Real-Time Workshop, an automatic code generator that translates from Simulink/Stateflow models into embedded C code.

  19. Subsystem codes with spatially local generators

    SciTech Connect

    Bravyi, Sergey

    2011-01-15

    We study subsystem codes whose gauge group has local generators in two-dimensional (2D) geometry. It is shown that there exists a family of such codes defined on lattices of size L x L with the number of logical qubits k and the minimum distance d both proportional to L. The gauge group of these codes involves only two-qubit generators of type XX and ZZ coupling nearest-neighbor qubits (and some auxiliary one-qubit generators). Our proof is not constructive as it relies on a certain version of the Gilbert-Varshamov bound for classical codes. Along the way, we introduce and study properties of generalized Bacon-Shor codes that might be of independent interest. Secondly, we prove that any 2D subsystem [n,k,d] code with spatially local generators obeys upper bounds kd = O(n) and d^2 = O(n). The analogous upper bound proved recently for 2D stabilizer codes is kd^2 = O(n). Our results thus demonstrate that subsystem codes can be more powerful than stabilizer codes under the spatial locality constraint.

  20. Coded continuous wave meteor radar

    NASA Astrophysics Data System (ADS)

    Vierinen, Juha; Chau, Jorge L.; Pfeffer, Nico; Clahsen, Matthias; Stober, Gunter

    2016-03-01

    The concept of a coded continuous wave specular meteor radar (SMR) is described. The radar uses a continuously transmitted pseudorandom phase-modulated waveform, which has several advantages compared to conventional pulsed SMRs. The coding avoids range and Doppler aliasing, which are in some cases problematic with pulsed radars. Continuous transmissions maximize pulse compression gain, allowing operation at lower peak power than a pulsed system. With continuous coding, the temporal and spectral resolution are not dependent on the transmit waveform and they can be fairly flexibly changed after performing a measurement. The low signal-to-noise ratio before pulse compression, combined with independent pseudorandom transmit waveforms, allows multiple geographically separated transmitters to be used in the same frequency band simultaneously without significantly interfering with each other. Because the same frequency band can be used by multiple transmitters, the same interferometric receiver antennas can be used to receive multiple transmitters at the same time. The principles of the signal processing are discussed, in addition to discussion of several practical ways to increase computation speed, and how to optimally detect meteor echoes. Measurements from a campaign performed with a coded continuous wave SMR are shown and compared with two standard pulsed SMR measurements. The type of meteor radar described in this paper would be suited for use in a large-scale multi-static network of meteor radar transmitters and receivers. Such a system would be useful for increasing the number of meteor detections to obtain improved meteor radar data products.
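
    The pulse-compression idea behind the coded continuous wave scheme can be illustrated with a short simulation; this is a hedged sketch (random binary phase code, single static echo, brute-force correlation), not the authors' actual processing chain.

      import numpy as np

      rng = np.random.default_rng(0)

      # Pseudorandom binary phase code (+1/-1), transmitted continuously.
      N = 8192
      code = rng.choice([-1.0, 1.0], size=N)

      # Simulated meteor echo: the transmit waveform delayed by 137 samples,
      # attenuated to roughly -14 dB SNR before compression.
      delay, amp = 137, 0.2
      rx = amp * np.roll(code, delay) + rng.normal(0.0, 1.0, size=N)

      # Pulse compression: correlate the received signal with the known code.
      # The pseudorandom code's sharp autocorrelation concentrates the echo
      # power at a single lag, recovering the delay despite the low input SNR.
      # (O(N^2) for clarity; FFT-based correlation would be used in practice.)
      compressed = np.array([np.dot(rx, np.roll(code, k)) for k in range(N)]) / N
      print("estimated delay:", int(np.argmax(compressed)))  # -> 137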

  1. Decoder for 3-D color codes

    NASA Astrophysics Data System (ADS)

    Hsu, Kung-Chuan; Brun, Todd

    Transversal circuits are important components of fault-tolerant quantum computation. Several classes of quantum error-correcting codes are known to have transversal implementations of any logical Clifford operation. However, to achieve universal quantum computation, it would be helpful to have high-performance error-correcting codes that have a transversal implementation of some logical non-Clifford operation. The 3-D color codes are a class of topological codes that permit transversal implementation of the logical π/8 gate. The decoding problem of a 3-D color code can be understood as a graph-matching problem on a three-dimensional lattice. Whether this class of codes will be useful in terms of performance is still an open question. We investigate the decoding problem of 3-D color codes and analyze the performance of some possible decoders.

  2. Genetic code, hamming distance and stochastic matrices.

    PubMed

    He, Matthew X; Petoukhov, Sergei V; Ricci, Paolo E

    2004-09-01

    In this paper we use the Gray code representation of the genetic code C=00, U=10, G=11 and A=01 (C pairs with G, A pairs with U) to generate a sequence of genetic code-based matrices. In connection with these code-based matrices, we use the Hamming distance to generate a sequence of numerical matrices. We then further investigate the properties of the numerical matrices and show that they are doubly stochastic and symmetric. We determine the frequency distributions of the Hamming distances, building blocks of the matrices, decomposition and iterations of matrices. We present an explicit decomposition formula for the genetic code-based matrix in terms of permutation matrices, which provides a hypercube representation of the genetic code. It is also observed that there is a Hamiltonian cycle in a genetic code-based hypercube. PMID:15294430
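
    The doubly stochastic structure is easy to reproduce numerically; the sketch below (an illustration, not the authors' code) builds the Hamming distance matrix for the 64 codons under the stated Gray encoding and checks that, after dividing by the constant row sum, it is symmetric and doubly stochastic.

      from itertools import product
      import numpy as np

      # Gray-code representation of the four bases, as in the paper.
      BITS = {"C": "00", "U": "10", "G": "11", "A": "01"}

      def hamming(a, b):
          return sum(x != y for x, y in zip(a, b))

      # All n-letter words over {C, U, G, A}; n = 3 gives the 64 codons.
      n = 3
      words = ["".join(w) for w in product("CUGA", repeat=n)]
      enc = {w: "".join(BITS[c] for c in w) for w in words}

      # Pairwise Hamming distances between the 2n-bit encodings. Each bit
      # position differs for exactly half of all words, so every row and
      # column sums to the same constant (2n * 2**(2n-1)).
      H = np.array([[hamming(enc[u], enc[v]) for v in words] for u in words])
      rowsum = H.sum(axis=1)
      assert (H == H.T).all() and (rowsum == rowsum[0]).all()

      S = H / rowsum[0]   # doubly stochastic: all rows and columns sum to 1
      print(S.sum(axis=0).round(6).min(), S.sum(axis=1).round(6).max())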

  3. Code of Practice and Competencies for ISAs

    NASA Astrophysics Data System (ADS)

    Kinnersly, Steve; Spalding, Ian

    Independent safety assessment is widely used as a means of obtaining assurance of safety for safety related systems. Experience of both Independent Safety Assessors (ISAs) and users of ISAs, together with growing appreciation of the responsibilities and potential liabilities of ISAs, suggested that there would be safety assurance and other benefits from identifying good practice for ISAs. A voluntary Code of Practice for Independent Safety Assessors (ISAs), together with a supporting Competency Framework for ISAs, has therefore been developed by the ISA Working Group of the Institution of Engineering and Technology (IET) and the British Computer Society (BCS).

  4. Asymmetric quantum convolutional codes

    NASA Astrophysics Data System (ADS)

    La Guardia, Giuliano G.

    2016-01-01

    In this paper, we construct the first families of asymmetric quantum convolutional codes (AQCCs). These new AQCCs are constructed by means of the CSS-type construction applied to suitable families of classical convolutional codes, which are also constructed here. The new codes have non-catastrophic generator matrices, and they exhibit a high degree of asymmetry. Since our constructions are performed algebraically, i.e., we develop general algebraic methods and properties to perform the constructions, it is possible to derive several families of such codes and not only codes with specific parameters. Additionally, several different types of such codes are obtained.

  5. Independent component analysis of parameterized ECG signals.

    PubMed

    Tanskanen, Jarno M A; Viik, Jari J; Hyttinen, Jari A K

    2006-01-01

    Independent component analysis (ICA) of measured signals yields the independent sources, given certain fulfilled requirements. Properly parameterized signals provide a clearer view of the system aspects under consideration, while reducing the amount of data. It is little acknowledged that appropriately parameterized signals may be subjected to ICA, yielding independent components (ICs) that display more clearly the investigated properties of the sources. In this paper, we propose ICA of parameterized signals, and demonstrate the concept with ICA of ST and R parameterizations of electrocardiogram (ECG) signals from ECG exercise test measurements of two coronary artery disease (CAD) patients. PMID:17945912
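
    As a toy illustration of ICA applied to parameter series rather than raw waveforms, the sketch below separates two independent parameter processes from their linear mixtures; the Laplacian "ST" and "R" series and the mixing matrix are invented stand-ins, not the patients' data.

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(1)

      # Two independent, non-Gaussian source processes, standing in for
      # beat-by-beat ST-level and R-amplitude parameter series.
      n_beats = 2000
      sources = np.column_stack([rng.laplace(size=n_beats),
                                 rng.laplace(size=n_beats)])

      # Observed only through an unknown linear mixture (e.g. lead geometry).
      A = np.array([[1.0, 0.6],
                    [0.4, 1.0]])
      observed = sources @ A.T

      # ICA on the parameterized signals recovers components that display
      # the underlying source properties more directly.
      ica = FastICA(n_components=2, random_state=0)
      components = ica.fit_transform(observed)
      print(components.shape)   # (2000, 2): one IC per parameter source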

  6. Myth or Truth: Independence Day.

    ERIC Educational Resources Information Center

    Gardner, Traci

    Most Americans think of the Fourth of July as Independence Day, but is it really the day the U.S. declared and celebrated independence? By exploring myths and truths surrounding Independence Day, this lesson asks students to think critically about commonly believed stories regarding the beginning of the Revolutionary War and the Independence Day…

  7. Cary Potter on Independent Education

    ERIC Educational Resources Information Center

    Potter, Cary

    1978-01-01

    Cary Potter was President of the National Association of Independent Schools from 1964-1978. As he leaves NAIS he gives his views on education, on independence, on the independent school, on public responsibility, on choice in a free society, on educational change, and on the need for collective action by independent schools. (Author/RK)

  8. Effect of Color Coding on Cognitive Style.

    ERIC Educational Resources Information Center

    Dwyer, Francis M.; Moore, David M.

    The purpose of this study was to examine the effect that coding (black and white or color) has on the achievement of students categorized as field dependent (FD) and field independent (FI) learners and to determine if there was any interaction between these variables (field dependency and color) across both visually and verbally oriented tests…

  9. Reviewing the Challenges and Opportunities Presented by Code Switching and Mixing in Bangla

    ERIC Educational Resources Information Center

    Hasan, Md. Kamrul; Akhand, Mohd. Moniruzzaman

    2015-01-01

    This paper investigates the issues related to code-switching/code-mixing in an ESL context. Some preliminary data on Bangla-English code-switching/code-mixing has been analyzed in order to determine which structural pattern of code-switching/code-mixing is predominant in different social strata. This study also explores the relationship of…

  11. Independent task Fourier filters

    NASA Astrophysics Data System (ADS)

    Caulfield, H. John

    2001-11-01

    Since the early 1960s, a major part of optical computing systems has been Fourier pattern recognition, which takes advantage of high speed filter changes to enable powerful nonlinear discrimination in 'real time.' Because each filter has a task quite independent of the tasks of the other filters, they can be applied and evaluated in parallel or, in a simple approach I describe, in sequence very rapidly. Thus I use the name ITFF (independent task Fourier filter). These filters can also break very complex discrimination tasks into easily handled parts, so the wonderful space invariance properties of Fourier filtering need not be sacrificed to achieve high discrimination and good generalizability even for ultracomplex discrimination problems. The training procedure proceeds sequentially, as the task for a given filter is defined a posteriori by declaring it to be the discrimination of particular members of set A from all members of set B with sufficient margin. That is, we set the threshold to achieve the desired margin and note the A members discriminated by that threshold. Discriminating those A members from all members of B becomes the task of that filter. Those A members are then removed from the set A, so no other filter will be asked to perform that already accomplished task.

  12. Optimal superdense coding over memory channels

    SciTech Connect

    Shadman, Z.; Kampermann, H.; Bruss, D.; Macchiavello, C.

    2011-10-15

    We study the superdense coding capacity in the presence of quantum channels with correlated noise. We investigate both the cases of unitary and nonunitary encoding. Pauli channels for arbitrary dimensions are treated explicitly. The superdense coding capacity for some special channels and resource states is derived for unitary encoding. We also provide an example of a memory channel where nonunitary encoding leads to an improvement in the superdense coding capacity.

  13. Maximal dinucleotide comma-free codes.

    PubMed

    Fimmel, Elena; Strüngmann, Lutz

    2016-01-21

    The problem of retrieval and maintenance of the correct reading frame plays a significant role in RNA transcription. Circular codes, and especially comma-free codes, can help to understand the underlying mechanisms of error-detection in this process. In recent years much attention has been paid to the investigation of trinucleotide circular codes (see, for instance, Fimmel et al., 2014; Fimmel and Strüngmann, 2015a; Michel and Pirillo, 2012; Michel et al., 2012, 2008), while dinucleotide codes had been touched on only marginally, even though dinucleotides are associated to important biological functions. Recently, all maximal dinucleotide circular codes were classified (Fimmel et al., 2015; Michel and Pirillo, 2013). The present paper studies maximal dinucleotide comma-free codes and their close connection to maximal dinucleotide circular codes. We give a construction principle for such codes and provide a graphical representation that allows them to be visualized geometrically. Moreover, we compare the results for dinucleotide codes with the corresponding situation for trinucleotide maximal self-complementary C(3)-codes. Finally, the results obtained are discussed with respect to Crick's hypothesis about frame-shift-detecting codes without commas. PMID:26562635
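
    The defining property of a comma-free dinucleotide code is directly checkable: for any two codewords x1x2 and y1y2 in the code, the straddling word x2y1 read in a shifted frame must not itself be a codeword. A small sketch of such a check (illustrative, not the authors' construction):

      from itertools import combinations, product

      def is_comma_free(code):
          """True iff no concatenation of two codewords contains another
          codeword in the shifted reading frame."""
          code = set(code)
          return all(x[1] + y[0] not in code for x in code for y in code)

      print(is_comma_free({"AC", "GC"}))  # True: neither CA nor CG is a codeword
      print(is_comma_free({"AG", "GA"}))  # False: AG|AG straddles to GA

      # Brute-force count of comma-free 2-codeword dinucleotide codes:
      dinucs = ["".join(p) for p in product("ACGU", repeat=2)]
      print(sum(is_comma_free(set(c)) for c in combinations(dinucs, 2)))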

  14. Cellulases and coding sequences

    SciTech Connect

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-02-20

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  16. QR Code Mania!

    ERIC Educational Resources Information Center

    Shumack, Kellie A.; Reilly, Erin; Chamberlain, Nik

    2013-01-01

    space, has error-correction capacity, and can be read from any direction. These codes are used in manufacturing, shipping, and marketing, as well as in education. QR codes can be created to produce…

  17. DIANE multiparticle transport code

    NASA Astrophysics Data System (ADS)

    Caillaud, M.; Lemaire, S.; Ménard, S.; Rathouit, P.; Ribes, J. C.; Riz, D.

    2014-06-01

    DIANE is the general Monte Carlo code developed at CEA-DAM. DIANE is a 3D multiparticle multigroup code. DIANE includes automated biasing techniques and is optimized for massive parallel calculations.

  18. Investigation of the Performance of Various CVD Diamond Crystal Qualities for the Measurement of Radiation Doses from a Low Energy Mammography X-Ray Beam, Compared with MC Code (PENELOPE) Calculations

    NASA Astrophysics Data System (ADS)

    Zakari, Y. I.; Mavunda, R. D.; Nam, T. L.; Keddy, R. J.

    The tissue equivalence of diamond allows for accurate radiation dose determination without large corrections for different attenuation values in biological tissue, but its low Z value limits this advantage to lower-energy photons, such as those in mammography X-ray beams. This paper assesses the performance of nine chemical vapour deposition (CVD) diamonds for use as radiation-sensing material. The specimens, fabricated in wafer form, are classified as detector grade, optical grade, and single crystal. It is well known that the presence of defects in diamonds, including CVD specimens, affects the response of diamond to radiation in different ways. In this investigation, tools such as electron spin resonance (ESR), thermoluminescence (TL), Raman spectroscopy, and ultraviolet (UV) spectroscopy were used to probe each of the samples. The linearity, sensitivity, and other characteristics of the detector response to photon interaction were analyzed from the I-V characteristics. The diamonds, categorized into four each of the so-called detector and optical grades plus a single-crystal CVD specimen, were exposed to a low X-ray peak voltage range (22 to 27 kVp) with trans-crystal polarizing fields of 0.4 kV.cm-1, 0.66 kV.cm-1, and 0.8 kV.cm-1. The presentation discusses the defects identifiable by the techniques used and correlates the radiation performance of the three types of crystals with the presence of these defects. The choice of a wafer as either a spectrometer or an X-ray dosimeter within the selected energy range was made. The analyses were validated with the Monte Carlo code PENELOPE.

  19. Software for universal noiseless coding

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Schlutsmeyer, A. P.

    1981-01-01

    An overview is provided of the universal noiseless coding algorithms as well as their relationship to the now available FORTRAN implementations. It is suggested that readers considering investigating the utility of these algorithms for actual applications should consult both NASA's Computer Software Management and Information Center (COSMIC) and descriptions of coding techniques provided by Rice (1979). Examples of applying these techniques have also been given by Rice (1975, 1979, 1980). Attention is given to reversible preprocessing, general implementation instructions, naming conventions, and calling arguments. The algorithms are broadly applicable to practical problems because most real data sources can be simply transformed into the required form by appropriate preprocessing.
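
    As a flavor of this algorithm family, the sketch below gives a minimal Rice-style coder: split each nonnegative integer into a unary-coded quotient and a k-bit remainder, and pick the k that minimizes total code length. This is a simplified illustration of the idea, not the FORTRAN implementations described above.

      def rice_encode(values, k):
          """Rice code: unary quotient, '0' separator, k-bit remainder."""
          out = []
          for v in values:
              q, r = v >> k, v & ((1 << k) - 1)
              rem = format(r, f"0{k}b") if k > 0 else ""
              out.append("1" * q + "0" + rem)
          return "".join(out)

      def rice_encode_best(values, ks=range(8)):
          """Crude stand-in for the option selection in Rice's scheme: try
          several k and keep the shortest encoding."""
          coded = {k: rice_encode(values, k) for k in ks}
          best = min(coded, key=lambda k: len(coded[k]))
          return best, coded[best]

      residuals = [0, 1, 3, 2, 0, 5, 1, 0, 2, 1]   # e.g. after preprocessing
      k, bitstream = rice_encode_best(residuals)
      print(k, len(bitstream), bitstream)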

  20. STEEP32 computer code

    NASA Technical Reports Server (NTRS)

    Goerke, W. S.

    1972-01-01

    A manual is presented as an aid in using the STEEP32 code. The code is the EXEC 8 version of the STEEP code (STEEP is an acronym for shock two-dimensional Eulerian elastic plastic). The major steps in a STEEP32 run are illustrated in a sample problem. There is a detailed discussion of the internal organization of the code, including a description of each subroutine.

  1. Progress in The Semantic Analysis of Scientific Code

    NASA Technical Reports Server (NTRS)

    Stewart, Mark

    2000-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.

  2. Morse Code Activity Packet.

    ERIC Educational Resources Information Center

    Clinton, Janeen S.

    This activity packet offers simple directions for setting up a Morse Code system appropriate to interfacing with any of several personal computer systems. Worksheets are also included to facilitate teaching Morse Code to persons with visual or other disabilities including blindness, as it is argued that the code is best learned auditorily. (PB)

  3. Frame independent cosmological perturbations

    SciTech Connect

    Prokopec, Tomislav; Weenink, Jan E-mail: j.g.weenink@uu.nl

    2013-09-01

    We compute the third order gauge invariant action for scalar-graviton interactions in the Jordan frame. We demonstrate that the gauge invariant action for scalar and tensor perturbations on one physical hypersurface only differs from that on another physical hypersurface via terms proportional to the equation of motion and boundary terms, such that the evolution of non-Gaussianity may be called unique. Moreover, we demonstrate that the gauge invariant curvature perturbation and graviton on uniform field hypersurfaces in the Jordan frame are equal to their counterparts in the Einstein frame. These frame independent perturbations are therefore particularly useful in relating results in different frames at the perturbative level. On the other hand, the field perturbation and graviton on uniform curvature hypersurfaces in the Jordan and Einstein frame are non-linearly related, as are their corresponding actions and n-point functions.

  4. Diagnosis code assignment: models and evaluation metrics

    PubMed Central

    Perotte, Adler; Pivovarov, Rimma; Natarajan, Karthik; Weiskopf, Nicole; Wood, Frank; Elhadad, Noémie

    2014-01-01

    Background and objective The volume of healthcare data is growing rapidly with the adoption of health information technology. We focus on automated ICD9 code assignment from discharge summary content and methods for evaluating such assignments. Methods We study ICD9 diagnosis codes and discharge summaries from the publicly available Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC II) repository. We experiment with two coding approaches: one that treats each ICD9 code independently of each other (flat classifier), and one that leverages the hierarchical nature of ICD9 codes into its modeling (hierarchy-based classifier). We propose novel evaluation metrics, which reflect the distances among gold-standard and predicted codes and their locations in the ICD9 tree. Experimental setup, code for modeling, and evaluation scripts are made available to the research community. Results The hierarchy-based classifier outperforms the flat classifier with F-measures of 39.5% and 27.6%, respectively, when trained on 20 533 documents and tested on 2282 documents. While recall is improved at the expense of precision, our novel evaluation metrics show a more refined assessment: for instance, the hierarchy-based classifier identifies the correct sub-tree of gold-standard codes more often than the flat classifier. Error analysis reveals that gold-standard codes are not perfect, and as such the recall and precision are likely underestimated. Conclusions Hierarchy-based classification yields better ICD9 coding than flat classification for MIMIC patients. Automated ICD9 coding is an example of a task for which data and tools can be shared and for which the research community can work together to build on shared models and advance the state of the art. PMID:24296907
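
    The idea of distance-aware evaluation can be sketched with a toy tree metric; here the ICD9 hierarchy is approximated by code prefixes, which is only a rough stand-in for the real chapter/category structure and for the paper's actual metrics.

      def ancestors(code):
          code = code.replace(".", "")
          return [code[:i] for i in range(len(code), 0, -1)] + [""]  # "" = root

      def tree_distance(pred, gold):
          """Steps from pred up to the lowest common ancestor plus steps
          down to gold; 0 iff the two codes match exactly."""
          up, down = ancestors(pred), ancestors(gold)
          common = next(x for x in up if x in set(down))
          return up.index(common) + down.index(common)

      print(tree_distance("428.0", "428.0"))  # 0: exact match
      print(tree_distance("428.1", "428.0"))  # 2: siblings under 428
      print(tree_distance("401.9", "428.0"))  # 6: different categories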

  5. Performance comparison of combined ECC/RLL codes

    NASA Technical Reports Server (NTRS)

    French, C.; Lin, Y.

    1990-01-01

    In this paper, we present a performance comparison of several combined error correcting/run-length limited (ECC/RLL) codes created by concatenating a convolutional code with a run-length limited code. In each case, encoding and decoding are accomplished using a single trellis based on the combined code. Half of the codes under investigation use conventional (d,k) run-length limited codes, where d is the minimum and k is the maximum allowable run of 0's between 1's. The other half of the combined codes use a special class of (d,k) codes known as distance preserving codes. These codes have the property that pairwise Hamming distances out of the (d,k) encoder are at least as large as the corresponding distances into the encoder (i.e., the codes preserve distance). Thus a combined code, created using a convolutional code concatenated with a distance preserving (d,k) code, will have a free distance (dfree) no smaller than the free distance of the original convolutional code. It should be noted that this does not hold if the (d,k) code was not distance preserving. A computer simulation is used to compare the performance of these two types of codes over the binary symmetric channel for various (d,k) constraints, rates, free distances, and numbers of states. Of particular interest for magnetic recording applications are codes with run-length constraints (1,3), (1,7), and (2,7).
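
    A (d,k) constraint is easy to state in code: between any two consecutive 1s there must be at least d and at most k zeros. The checker below is a minimal sketch (it ignores the runs before the first and after the last 1, which real modulation codes also constrain).

      def satisfies_dk(bits, d, k):
          """Check the (d,k) run-length constraint on a 0/1 sequence."""
          ones = [i for i, b in enumerate(bits) if b == 1]
          gaps = [j - i - 1 for i, j in zip(ones, ones[1:])]
          return all(d <= g <= k for g in gaps)

      print(satisfies_dk([1, 0, 0, 1, 0, 1, 0, 0, 0, 1], 1, 3))  # True
      print(satisfies_dk([1, 1, 0, 0, 1], 1, 3))                 # False: adjacent 1s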

  6. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  7. To code, or not to code?

    PubMed

    Parman, Cindy C

    2003-01-01

    In summary, it is also important to remember the hidden rules: 1) Just because there is a code in the manual, it doesn't mean it can be billed to insurance, or that once billed, it will be reimbursed. 2) Just because a code was paid once, doesn't mean it will ever be paid again--or that you get to keep the money! 3) The healthcare provider is responsible for knowing all the rules, but then it is impossible to know all the rules! And not knowing all the rules can lead to fines, penalties or worse! New codes are added annually (quarterly for OPPS), definitions of existing codes are changed, and it is the responsibility of healthcare providers to keep abreast of all coding updates and changes. In addition, the federal regulations are constantly updated and changed, making compliant billing a moving target. All healthcare entities should focus on complete documentation, the adherence to authoritative coding guidance and the provision of detailed explanations and specialty education to the payor, as necessary. PMID:14619987

  8. Source Code Plagiarism--A Student Perspective

    ERIC Educational Resources Information Center

    Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.

    2011-01-01

    This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…

  9. Applications of Coding in Network Communications

    ERIC Educational Resources Information Center

    Chang, Christopher SungWook

    2012-01-01

    This thesis uses the tool of network coding to investigate fast peer-to-peer file distribution, anonymous communication, robust network construction under uncertainty, and prioritized transmission. In a peer-to-peer file distribution system, we use a linear optimization approach to show that the network coding framework significantly simplifies…

  10. Coding for Electronic Mail

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.

  11. Bit-Wise Arithmetic Coding For Compression Of Data

    NASA Technical Reports Server (NTRS)

    Kiely, Aaron

    1996-01-01

    Bit-wise arithmetic coding is data-compression scheme intended especially for use with uniformly quantized data from source with Gaussian, Laplacian, or similar probability distribution function. Code words of fixed length, and bits treated as being independent. Scheme serves as means of progressive transmission or of overcoming buffer-overflow or rate constraint limitations sometimes arising when data compression used.

  12. The Syntax and Psycholinguistics of Bilingual Code Mixing.

    ERIC Educational Resources Information Center

    Sridhar, S. N.; Sridhar, Kamal K.

    1980-01-01

    This paper challenges the characterization of bilingual behavior derived from the code-switching model, and especially the notion of linguistic independence on which psychological studies of bilingualism have focused almost exclusively. While linguists have concentrated on the situational determinants of code-switching, psychologists have focused…

  13. XSOR codes users manual

    SciTech Connect

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named ``XSOR``. The purpose of XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  14. DLLExternalCode

    SciTech Connect

    Greg Flach, Frank Smith

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
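
    The coupling pattern described here (write inputs, run the external application, read outputs back) is shown below as a hedged Python sketch; the file names, formats, and executable are placeholders rather than the actual GoldSim/DLL interface, which is implemented as a C-callable DLL.

      import subprocess
      from pathlib import Path

      def run_external_code(inputs, workdir="run", exe="./external_app"):
          work = Path(workdir)
          work.mkdir(exist_ok=True)

          # 1. Create the input file expected by the external application.
          (work / "input.txt").write_text(
              "\n".join(f"{name} = {value}" for name, value in inputs.items())
          )

          # 2. Run the external code and wait for it to finish.
          subprocess.run([exe], cwd=work, check=True)

          # 3. Read back the outputs ("name value" per line, by assumption).
          outputs = {}
          for line in (work / "output.txt").read_text().splitlines():
              name, value = line.split()
              outputs[name] = float(value)
          return outputs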

  16. On the parallelization of molecular dynamics codes

    NASA Astrophysics Data System (ADS)

    Trabado, G. P.; Plata, O.; Zapata, E. L.

    2002-08-01

    Molecular dynamics (MD) codes present a high degree of spatial data locality and a significant amount of independent computations. However, most of the parallelization strategies are usually based on the manual transformation of sequential programs, either by completely rewriting the code with message passing routines or by using specific libraries intended for writing new MD programs. In this paper we propose a new library-based approach (DDLY) which supports parallelization of existing short-range MD sequential codes. The novelty of this approach is that it can directly handle the distribution of common data structures used in MD codes to represent data (arrays, Verlet lists, link cells), using domain decomposition. Thus, the insertion of run-time support for distribution and communication in a MD program does not imply significant changes to its structure. The method is simple, efficient and portable. It may also be used to extend existing parallel programming languages, such as HPF.

  17. Astronomy education and the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Nemiroff, Robert J.

    2016-01-01

    The Astrophysics Source Code Library (ASCL) is an online registry of source codes used in refereed astrophysics research. It currently lists nearly 1,200 codes and covers all aspects of computational astrophysics. How can this resource be of use to educators and to the graduate students they mentor? The ASCL serves as a discovery tool for codes that can be used for one's own research. Graduate students can also investigate existing codes to see how common astronomical problems are approached numerically in practice, and use these codes as benchmarks for their own solutions to these problems. Further, they can deepen their knowledge of software practices and techniques through examination of others' codes.

  18. Wear Independent Similarity.

    PubMed

    Steele, Adam; Davis, Alexander; Kim, Joohyung; Loth, Eric; Bayer, Ilker S

    2015-06-17

    This study presents a new factor that can be used to design materials where desired surface properties must be retained under in-system wear and abrasion. To demonstrate this factor, a synthetic nonwetting coating is presented that retains chemical and geometric performance as material is removed under multiple wear conditions: a coarse vitrified abradant (similar to sanding), a smooth abradant (similar to rubbing), and a mild abradant (a blend of sanding and rubbing). With this approach, such a nonwetting material displays unprecedented mechanical durability while maintaining desired performance under a range of demanding conditions. This performance, herein termed wear independent similarity performance (WISP), is critical because multiple mechanisms and/or modes of wear can be expected to occur in many typical applications, e.g., combinations of abrasion, rubbing, contact fatigue, weathering, particle impact, etc. Furthermore, these multiple wear mechanisms tend to quickly degrade a novel surface's unique performance, and thus many promising surfaces and materials never scale out of research laboratories. Dynamic goniometry and scanning electron microscopy results presented herein provide insight into these underlying mechanisms, which may also be applied to other coatings and materials. PMID:26018058

  19. cncRNAs: Bi-functional RNAs with protein coding and non-coding functions

    PubMed Central

    Kumari, Pooja; Sampath, Karuna

    2015-01-01

    For many decades, the major function of mRNA was thought to be to provide protein-coding information embedded in the genome. The advent of high-throughput sequencing has led to the discovery of pervasive transcription of eukaryotic genomes and opened the world of RNA-mediated gene regulation. Many regulatory RNAs have been found to be incapable of protein coding and are hence termed as non-coding RNAs (ncRNAs). However, studies in recent years have shown that several previously annotated non-coding RNAs have the potential to encode proteins, and conversely, some coding RNAs have regulatory functions independent of the protein they encode. Such bi-functional RNAs, with both protein coding and non-coding functions, which we term as ‘cncRNAs’, have emerged as new players in cellular systems. Here, we describe the functions of some cncRNAs identified from bacteria to humans. Because the functions of many RNAs across genomes remains unclear, we propose that RNAs be classified as coding, non-coding or both only after careful analysis of their functions. PMID:26498036

  20. Studying the Independent School Library

    ERIC Educational Resources Information Center

    Cahoy, Ellysa Stern; Williamson, Susan G.

    2008-01-01

    In 2005, the American Association of School Librarians' Independent Schools Section conducted a national survey of independent school libraries. This article analyzes the results of the survey, reporting specialized data and information regarding independent school library budgets, collections, services, facilities, and staffing. Additionally, the…

  1. The National Transport Code Collaboration Module Library

    NASA Astrophysics Data System (ADS)

    Kritz, A. H.; Bateman, G.; Kinsey, J.; Pankin, A.; Onjun, T.; Redd, A.; McCune, D.; Ludescher, C.; Pletzer, A.; Andre, R.; Zakharov, L.; Lodestro, L.; Pearlstein, L. D.; Jong, R.; Houlberg, W.; Strand, P.; Wiley, J.; Valanju, P.; John, H. St.; Waltz, R.; Mandrekas, J.; Mau, T. K.; Carlsson, J.; Braams, B.

    2004-12-01

    This paper reports on the progress in developing a library of code modules under the auspices of the National Transport Code Collaboration (NTCC). Code modules are high quality, fully documented software packages with a clearly defined interface. The modules provide a variety of functions, such as implementing numerical physics models; performing ancillary functions such as I/O or graphics; or providing tools for dealing with common issues in scientific programming such as portability of Fortran codes. Researchers in the plasma community submit code modules, and a review procedure is followed to ensure adherence to programming and documentation standards. The review process is designed to provide added confidence with regard to the use of the modules and to allow users and independent reviewers to validate the claims of the modules' authors. All modules include source code; clear instructions for compilation of binaries on a variety of target architectures; and test cases with well-documented input and output. All the NTCC modules and ancillary information, such as current standards and documentation, are available from the NTCC Module Library Website http://w3.pppl.gov/NTCC. The goal of the project is to develop a resource of value to builders of integrated modeling codes and to plasma physics researchers generally. Currently, there are more than 40 modules in the module library.

  2. [Quality of coding in acute inpatient care].

    PubMed

    Stausberg, J

    2007-08-01

    Routine data in the electronic patient record are frequently used for secondary purposes. Core elements of the electronic patient record are diagnoses and procedures, coded with the mandatory classifications. Despite the important role of routine data for reimbursement, quality management, and health care statistics, there is currently no systematic analysis of coding quality in Germany. Respective concepts and investigations share the difficulty of deciding what is right and what is wrong at the end of the long process of medical decision making. Therefore, a relevant amount of disagreement has to be accepted; in the case of the principal diagnosis, this could affect half of the patients. The plausibility of coding looks much better. After a period of optimization in hospitals, regular and complete coding can be expected. Whether coding matches reality, as a prerequisite for further use of the data in medicine and health politics, should be investigated in controlled trials in the future. PMID:17676418

  3. Adaptive entropy coded subband coding of images.

    PubMed

    Kim, Y H; Modestino, J W

    1992-01-01

    The authors describe a design approach, called 2-D entropy-constrained subband coding (ECSBC), based upon recently developed 2-D entropy-constrained vector quantization (ECVQ) schemes. The output indexes of the embedded quantizers are further compressed by use of noiseless entropy coding schemes, such as Huffman or arithmetic codes, resulting in variable-rate outputs. Depending upon the specific configurations of the ECVQ and the ECPVQ over the subbands, many different types of SBC schemes can be derived within the generic 2-D ECSBC framework. Among these, the authors concentrate on three representative types of 2-D ECSBC schemes and provide relative performance evaluations. They also describe an adaptive buffer instrumented version of 2-D ECSBC, called 2-D ECSBC/AEC, for use with fixed-rate channels which completely eliminates buffer overflow/underflow problems. This adaptive scheme achieves performance quite close to the corresponding ideal 2-D ECSBC system. PMID:18296138

  4. Bit-wise arithmetic coding for data compression

    NASA Technical Reports Server (NTRS)

    Kiely, A. B.

    1994-01-01

    This article examines the problem of compressing a uniformly quantized independent and identically distributed (IID) source. We present a new compression technique, bit-wise arithmetic coding, that assigns fixed-length codewords to the quantizer output and uses arithmetic coding to compress the codewords, treating the codeword bits as independent. We examine the performance of this method and evaluate the overhead required when used block-adaptively. Simulation results are presented for Gaussian and Laplacian sources. This new technique could be used as the entropy coder in a transform or subband coding system.
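
    The "treat the codeword bits as independent" idea can be quantified without a full arithmetic coder: model each bit position of the fixed-length codewords separately and sum the per-position binary entropies, which is the ideal rate of an arithmetic coder driven by those models. The sketch below (synthetic Laplacian-magnitude data) illustrates that bound; it is not the article's coder.

      import math
      import numpy as np

      rng = np.random.default_rng(2)

      # Quantized magnitudes of a Laplacian source, as 6-bit codewords.
      B = 6
      levels = np.clip(np.round(rng.exponential(scale=3.0, size=50_000)),
                       0, 2**B - 1).astype(int)

      # Per-position probability of a 1, treating the B bits as independent.
      bits = (levels[:, None] >> np.arange(B - 1, -1, -1)) & 1
      p1 = bits.mean(axis=0)

      def h(p):  # binary entropy in bits
          return 0.0 if p in (0.0, 1.0) else -p*math.log2(p) - (1-p)*math.log2(1-p)

      rate = sum(h(p) for p in p1)
      print(f"fixed length: {B} bits; bit-wise model: {rate:.2f} bits/sample")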

  5. Overview of Code Verification

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The verified code for the SIFT Executive is not the code that executes on the SIFT system as delivered. The running versions of the SIFT Executive contain optimizations and special code relating to the messy interface to the hardware broadcast interface and to packing of data to conserve space in the store of the BDX930 processors. The running code was in fact developed prior to and without consideration of any mechanical verification. This was regarded as necessary experimentation with the SIFT hardware and special purpose Pascal compiler. The Pascal code sections cover: the selection of a schedule from the global executive broadcast, scheduling, dispatching, three way voting, and error reporting actions of the SIFT Executive. Not included in these sections of Pascal code are: the global executive, five way voting, clock synchronization, interactive consistency, low level broadcasting, and program loading, initialization, and schedule construction.

  6. Mechanical code comparator

    DOEpatents

    Peter, Frank J.; Dalton, Larry J.; Plummer, David W.

    2002-01-01

    A new class of mechanical code comparators is described which have broad potential for application in safety, surety, and security applications. These devices can be implemented as micro-scale electromechanical systems that isolate a secure or otherwise controlled device until an access code is entered. This access code is converted into a series of mechanical inputs to the mechanical code comparator, which compares the access code to a pre-input combination, entered previously into the mechanical code comparator by an operator at the system security control point. These devices provide extremely high levels of robust security. Being totally mechanical in operation, an access control system properly based on such devices cannot be circumvented by software attack alone.

  7. Industrial Code Development

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1991-01-01

    The industrial codes will consist of modules of 2-D and simplified 2-D or 1-D codes, intended for expeditious parametric studies, analysis, and design of a wide variety of seals. Integration into a unified system is accomplished by the industrial Knowledge Based System (KBS), which will also provide user friendly interaction, context-sensitive and hypertext help, design guidance, and an expandable database. The types of analysis to be included with the industrial codes are interfacial performance (leakage, load, stiffness, friction losses, etc.), thermoelastic distortions, and dynamic response to rotor excursions. The first three codes to be completed and which are presently being incorporated into the KBS are the incompressible cylindrical code, ICYL, and the compressible cylindrical code, GCYL.

  8. Generating code adapted for interlinking legacy scalar code and extended vector code

    SciTech Connect

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  9. Narrative compression coding for a channel with errors

    NASA Astrophysics Data System (ADS)

    Bond, James W.

    1988-01-01

    Data compression codes offer the possibility of improving the throughput of existing communication systems in the near term. This study was undertaken to determine if data compression codes could be utilized to provide message compression in a channel with up to a 0.10 bit error rate. The data compression capabilities of codes were investigated by estimating the average number of bits-per-character required to transmit narrative files. The performance of the codes in a channel with errors (a noisy channel) was investigated in terms of the average number of characters decoded in error and of characters printed in error per bit error. Results were obtained by encoding four narrative files, which were resident on an IBM-PC and used a 58-character set. The study focused on Huffman codes and suffix/prefix comma-free codes. Other data compression codes, in particular, block codes and some simple variants of block codes, are briefly discussed to place the study results in context. Comma-free codes were found to have the most promising data compression because error propagation due to bit errors is limited to a few characters for these codes. A technique was found to identify a suffix/prefix comma-free code giving nearly the same data compression as a Huffman code with much less error propagation than the Huffman codes. Greater data compression can be achieved through the use of comma-free code word assignments based on conditional probabilities of character occurrence.

  10. Doubled Color Codes

    NASA Astrophysics Data System (ADS)

    Bravyi, Sergey

    Combining protection from noise and computational universality is one of the biggest challenges in fault-tolerant quantum computing. Topological stabilizer codes such as the 2D surface code can tolerate a high level of noise but implementing logical gates, especially non-Clifford ones, requires a prohibitively large overhead due to the need of state distillation. In this talk I will describe a new family of 2D quantum error correcting codes that enable a transversal implementation of all logical gates required for the universal quantum computing. Transversal logical gates (TLG) are encoded operations that can be realized by applying some single-qubit rotation to each physical qubit. TLG are highly desirable since they introduce no overhead and do not spread errors. It has been known before that a quantum code can have only a finite number of TLGs, which rules out computational universality. Our scheme circumvents this no-go result by combining TLGs of two different quantum codes using the gauge-fixing method pioneered by Paetznick and Reichardt. The first code, closely related to the 2D color code, enables a transversal implementation of all single-qubit Clifford gates such as the Hadamard gate and the π/2 phase shift. The second code that we call a doubled color code provides a transversal T-gate, where T is the π/4 phase shift. The Clifford+T gate set is known to be computationally universal. The two codes can be laid out on the honeycomb lattice with two qubits per site such that the code conversion requires parity measurements for six-qubit Pauli operators supported on faces of the lattice. I will also describe numerical simulations of logical Clifford+T circuits encoded by the distance-3 doubled color code. Based on a joint work with Andrew Cross.

  11. Industrial Computer Codes

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1996-01-01

    This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), KTK (knife to knife) Labyrinth Seal Code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia, but maintains the narrow groove theory. KTK labyrinth seal code handles straight or stepped seals. And DYSEAL provides dynamics for the seal geometry.

  12. Melanism in Peromyscus Is Caused by Independent Mutations in Agouti

    PubMed Central

    Kingsley, Evan P.; Manceau, Marie; Wiley, Christopher D.; Hoekstra, Hopi E.

    2009-01-01

    Identifying the molecular basis of phenotypes that have evolved independently can provide insight into the ways genetic and developmental constraints influence the maintenance of phenotypic diversity. Melanic (darkly pigmented) phenotypes in mammals provide a potent system in which to study the genetic basis of naturally occurring mutant phenotypes because melanism occurs in many mammals, and the mammalian pigmentation pathway is well understood. Spontaneous alleles of a few key pigmentation loci are known to cause melanism in domestic or laboratory populations of mammals, but in natural populations, mutations at one gene, the melanocortin-1 receptor (Mc1r), have been implicated in the vast majority of cases, possibly due to its minimal pleiotropic effects. To investigate whether mutations in this or other genes cause melanism in the wild, we investigated the genetic basis of melanism in the rodent genus Peromyscus, in which melanic mice have been reported in several populations. We focused on two genes known to cause melanism in other taxa, Mc1r and its antagonist, the agouti signaling protein (Agouti). While variation in the Mc1r coding region does not correlate with melanism in any population, in a New Hampshire population, we find that a 125-kb deletion, which includes the upstream regulatory region and exons 1 and 2 of Agouti, results in a loss of Agouti expression and is perfectly associated with melanic color. In a second population from Alaska, we find that a premature stop codon in exon 3 of Agouti is associated with a similar melanic phenotype. These results show that melanism has evolved independently in these populations through mutations in the same gene, and suggest that melanism produced by mutations in genes other than Mc1r may be more common than previously thought. PMID:19649329

  13. The Comparative Performance of Conditional Independence Indices

    ERIC Educational Resources Information Center

    Kim, Doyoung; De Ayala, R. J.; Ferdous, Abdullah A.; Nering, Michael L.

    2011-01-01

    To realize the benefits of item response theory (IRT), one must have model-data fit. One facet of a model-data fit investigation involves assessing the tenability of the conditional item independence (CII) assumption. In this Monte Carlo study, the comparative performance of 10 indices for identifying conditional item dependence is assessed. The…

  15. Experimental evaluation of photoacoustic coded excitation using unipolar golay codes.

    PubMed

    Mienkina, Martin P; Friedrich, Claus-Stefan; Gerhardt, Nils C; Wilkening, Wilko G; Hofmann, Martin R; Schmitz, Georg

    2010-07-01

    Q-switched Nd:YAG lasers are commonly used as light sources for photoacoustic imaging. However, laser diodes are attractive as an alternative to Nd:YAG lasers because they are less expensive and more compact. Although laser diodes deliver about three orders of magnitude less light pulse energy than Nd:YAG lasers (tens of microjoules compared with tens of millijoules), their pulse repetition frequency (PRF) is four to five orders of magnitude higher (up to 1 MHz compared with tens of hertz); this enables the use of averaging to improve SNR without compromising the image acquisition rate. In photoacoustic imaging, the PRF is limited by the maximum acoustic time-of-flight. This limit can be overcome by using coded excitation schemes in which the coding eliminates ambiguities between echoes induced by subsequent pulses. To evaluate the benefits of photoacoustic coded excitation (PACE), the performance of unipolar Golay codes is investigated analytically and validated experimentally. PACE imaging of a copper slab using laser diodes at a PRF of 1 MHz and a modified clinical ultrasound scanner is successfully demonstrated. Considering laser safety regulations and taking into account a comparison between a laser diode system and Nd:YAG systems with respect to SNR, we conclude that PACE is feasible for small animal imaging. PMID:20639152
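
    The key property exploited by Golay coded excitation is that a complementary pair's autocorrelations sum to a delta function, so range sidelobes cancel exactly; for a source that can only emit nonnegative power, each bipolar sequence is split into two unipolar transmissions. The sketch below demonstrates the property using a standard recursive construction, not the authors' exact codes.

      import numpy as np

      def golay_pair(n_iters):
          """Recursive Golay pair: (a, b) -> (a|b, a|-b) doubles the length."""
          a, b = np.array([1.0]), np.array([1.0])
          for _ in range(n_iters):
              a, b = np.concatenate([a, b]), np.concatenate([a, -b])
          return a, b

      def acorr(x):
          return np.correlate(x, x, mode="full")

      a, b = golay_pair(5)          # length-32 complementary pair
      s = acorr(a) + acorr(b)
      mid = len(s) // 2
      print(s[mid], np.abs(np.delete(s, mid)).max())   # -> 64.0 0.0

      # Unipolar splitting for intensity-only sources: a = a_plus - a_minus,
      # so four firings per measurement, as in the unipolar Golay scheme.
      a_plus, a_minus = (a > 0).astype(float), (a < 0).astype(float)
      assert np.array_equal(a_plus - a_minus, a)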

  16. Topological subsystem codes

    SciTech Connect

    Bombin, H.

    2010-03-15

    We introduce a family of two-dimensional (2D) topological subsystem quantum error-correcting codes. The gauge group is generated by two-local Pauli operators, so that two-local measurements are enough to recover the error syndrome. We study the computational power of code deformation in these codes and show that boundaries cannot be introduced in the usual way. In addition, we give a general mapping connecting suitable classical statistical mechanical models to optimal error correction in subsystem stabilizer codes that suffer from depolarizing noise.

  17. Expander chunked codes

    NASA Astrophysics Data System (ADS)

    Tang, Bin; Yang, Shenghao; Ye, Baoliu; Yin, Yitong; Lu, Sanglu

    2015-12-01

    Chunked codes are efficient random linear network coding (RLNC) schemes with low computational cost, where the input packets are encoded into small chunks (i.e., subsets of the coded packets). During the network transmission, RLNC is performed within each chunk. In this paper, we first introduce a simple transfer matrix model to characterize the transmission of chunks and derive some basic properties of the model to facilitate the performance analysis. We then focus on the design of overlapped chunked codes, a class of chunked codes whose chunks are non-disjoint subsets of input packets, which are of special interest since they can be encoded with negligible computational cost and in a causal fashion. We propose expander chunked (EC) codes, the first class of overlapped chunked codes that have an analyzable performance, where the construction of the chunks makes use of regular graphs. Numerical and simulation results show that in some practical settings, EC codes can achieve rates within 91 to 97% of the optimum and outperform the state-of-the-art overlapped chunked codes significantly.
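
    The chunked-coding idea itself is compact: random linear network coding is applied only within small (here, overlapping) subsets of the input packets, so encoding cost scales with the chunk size rather than the whole file. The sketch below works over GF(2) with a fixed overlap of one packet; the graph-based chunk construction of EC codes is not reproduced.

      import numpy as np

      rng = np.random.default_rng(3)

      n_packets, packet_len, chunk_size = 10, 16, 4
      packets = rng.integers(0, 2, size=(n_packets, packet_len), dtype=np.uint8)

      # Overlapping chunks: consecutive chunks share one packet.
      chunks = [list(range(s, s + chunk_size))
                for s in range(0, n_packets - chunk_size + 1, chunk_size - 1)]

      def encode(chunk):
          """One coded packet: a random GF(2) combination of the chunk's
          packets, sent along with its coefficient vector."""
          coeff = rng.integers(0, 2, size=len(chunk), dtype=np.uint8)
          return coeff, (coeff @ packets[chunk]) % 2

      coeff, payload = encode(chunks[0])
      print(chunks)
      print(coeff, payload)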

  18. FAA Smoke Transport Code

    SciTech Connect

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  19. GALPROP: New Developments in CR Propagation Code

    NASA Technical Reports Server (NTRS)

    Moskalenko, I. V.; Jones, F. C.; Mashnik, S. G.; Strong, A. W.; Ptuskin, V. S.

    2003-01-01

    The numerical Galactic CR propagation code GALPROP has been shown to reproduce simultaneously observational data of many kinds related to CR origin and propagation. It has been validated on direct measurements of nuclei, antiprotons, electrons and positrons, as well as on astronomical measurements of gamma rays and synchrotron radiation. Such data provide many independent constraints on model parameters while revealing some contradictions in the conventional view of Galactic CR propagation. Using a new version of GALPROP we study new effects such as processes of wave-particle interactions in the interstellar medium. We also report on other developments in the CR propagation code.

  20. P-code enhanced method for processing encrypted GPS signals without knowledge of the encryption code

    NASA Technical Reports Server (NTRS)

    Meehan, Thomas K. (Inventor); Thomas, Jr., Jess Brooks (Inventor); Young, Lawrence E. (Inventor)

    2000-01-01

    In the preferred embodiment, an encrypted GPS signal is down-converted from RF to baseband to generate two quadrature components for each RF signal (L1 and L2). Separately and independently for each RF signal and each quadrature component, the four down-converted signals are counter-rotated with a respective model phase, correlated with a respective model P code, and then successively summed and dumped over presum intervals substantially coincident with chips of the respective encryption code. Without knowledge of the encryption-code signs, the effect of encryption-code sign flips is then substantially reduced by selected combinations of the resulting presums between associated quadrature components for each RF signal, separately and independently for the L1 and L2 signals. The resulting combined presums are then summed and dumped over longer intervals and further processed to extract amplitude, phase and delay for each RF signal. Precision of the resulting phase and delay values is approximately four times better than that obtained from straight cross-correlation of L1 and L2. This improved method provides the following options: separate and independent tracking of the L1-Y and L2-Y channels; separate and independent measurement of amplitude, phase and delay for the L1-Y channel; and removal of the half-cycle ambiguity in L1-Y and L2-Y carrier phase.
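
    The patented combination scheme itself is not reproduced here, but the core observation can be illustrated with a toy model: both quadrature presums over an encryption chip inherit the same unknown sign, so any combination that is even in that sign is immune to the flips. All values below are hypothetical, and the simple squaring-type combination shown incurs the half-cycle ambiguity that the patent's L1/L2 cross-combinations are designed to remove.

        import numpy as np

        rng = np.random.default_rng(1)

        A, phi = 1.0, 0.7          # unknown amplitude and carrier phase (toy values)
        n_chips = 4000             # presum intervals aligned with encryption chips
        w = rng.choice([-1, 1], n_chips)   # unknown encryption-code signs

        # Presums over one chip: both quadrature components carry the same w_k.
        noise = 0.5
        I = w * A * np.cos(phi) + noise * rng.standard_normal(n_chips)
        Q = w * A * np.sin(phi) + noise * rng.standard_normal(n_chips)

        # Combinations even in w_k survive averaging despite the sign flips:
        s2 = 2 * np.mean(I * Q)          # -> A^2 sin(2*phi)
        c2 = np.mean(I * I - Q * Q)      # -> A^2 cos(2*phi)
        phase = 0.5 * np.arctan2(s2, c2) # phase modulo pi (half-cycle ambiguity)
        print(f"true phi = {phi:.3f}, recovered (mod pi) = {phase:.3f}")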

  1. Azerbaijani-Russian Code-Switching and Code-Mixing: Form, Function, and Identity

    ERIC Educational Resources Information Center

    Zuercher, Kenneth

    2009-01-01

    From incorporation into the Russian Empire in 1828, through the collapse of the U.S.S.R. in 1991 governmental language policies and other socio/political forces influenced the Turkic population of the Republic of Azerbaijan to speak Russian. Even with changes since independence Russian use--including various kinds of code-switching and…

  2. Code-Switching or Code-Mixing?

    ERIC Educational Resources Information Center

    Thelander, Mats

    1976-01-01

    An attempt to apply Blom's and Gumperz' model of code-switching to a small Swedish community in northern Sweden, Burtrask. The informants spoke standard Swedish, the Burtrask dialect, and a third variety which was a combination of the two. (CFM)

  3. Utilizing GPUs to Accelerate Turbomachinery CFD Codes

    NASA Technical Reports Server (NTRS)

    MacCalla, Weylin; Kulkarni, Sameer

    2016-01-01

    GPU computing has established itself as a way to accelerate parallel codes in the high performance computing world. This work focuses on speeding up APNASA, a legacy CFD code used at NASA Glenn Research Center, while also drawing conclusions about the nature of GPU computing and the requirements to make GPGPU worthwhile on legacy codes. Rewriting and restructuring of the source code was avoided to limit the introduction of new bugs. The code was profiled and investigated for parallelization potential, then OpenACC directives were used to indicate parallel parts of the code. The use of OpenACC directives was not able to reduce the runtime of APNASA on either the NVIDIA Tesla discrete graphics card, or the AMD accelerated processing unit. Additionally, it was found that in order to justify the use of GPGPU, the amount of parallel work being done within a kernel would have to greatly exceed the work being done by any one portion of the APNASA code. It was determined that in order for an application like APNASA to be accelerated on the GPU, it should not be modular in nature, and the parallel portions of the code must contain a large portion of the code's computation time.

  4. Fighting for independence.

    PubMed

    Saxon, Emma

    2016-01-01

    Male crickets (Gryllus bimaculatus) establish dominance hierarchies within a population by fighting with one another. Larger males win fights more frequently than their smaller counterparts, and a previous study found that males recognise one another primarily through sensory input from the antennae. This study therefore investigated whether the success of larger crickets is influenced by sensory input from the antennae, in part by assessing the number of fights that large 'antennectomized' crickets won against small crickets, compared with the number that large, intact crickets won. The success rate was significantly lower in antennectomized males, though they still won the majority of fights (intact 73/100 versus antennectomized 58/100, Fisher's exact test P < 0.05); the authors thus conclude that sensory input from the antennae affects the fighting success of large males, but that other size-related factors also play a part. PMID:26787539
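
    The reported comparison is a 2x2 contingency test and can be checked directly; a minimal scipy snippet with the win/loss splits implied by the abstract:

        from scipy.stats import fisher_exact

        # Wins/losses for large males against small opponents, from the abstract:
        table = [[73, 27],   # intact antennae: 73 wins, 27 losses
                 [58, 42]]   # antennectomized: 58 wins, 42 losses

        odds_ratio, p = fisher_exact(table, alternative='two-sided')
        print(f"odds ratio = {odds_ratio:.2f}, p = {p:.3f}")   # p < 0.05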

  5. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    PubMed Central

    Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree also brings extremely high computational complexity. This paper presents innovative work on improving the coding tree to further reduce encoding time. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content change (CC). Secondly, a CU coding tree probability model is proposed for modeling and predicting CU distribution. Eventually, a CU coding tree probability update is proposed, aiming to address probabilistic model distortion problems caused by CC. Experimental results show that the proposed low complexity CU coding tree mechanism significantly reduces encoding time by 27% for lossy coding and 42% for visually lossless coding and lossless coding. The proposed mechanism improves coding performance under various application conditions. PMID:26999741
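
    As a rough schematic of how such a probability model can short-circuit the coding tree (not the authors' exact formulation), the sketch below keeps a running split probability per (depth, QP) bucket, skips the costly rate-distortion evaluation when the model is confident, and resets the model when content change is detected; the threshold and update rate are hypothetical.

        from collections import defaultdict

        split_prob = defaultdict(lambda: 0.5)    # per-(depth, QP) split probability
        ALPHA = 0.05                             # exponential update rate

        def should_try_split(depth, qp, threshold=0.1):
            """Only evaluate the split branch of the coding tree when the
            predicted split probability exceeds the threshold."""
            return split_prob[(depth, qp)] > threshold

        def update_model(depth, qp, was_split, content_changed):
            key = (depth, qp)
            if content_changed:
                split_prob[key] = 0.5            # reset a distorted model after CC
            split_prob[key] += ALPHA * (float(was_split) - split_prob[key])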

  6. Quality assurance and verification of the MACCS (MELCOR Accident Consequence Code System) code, Version 1. 5

    SciTech Connect

    Dobbe, C.A.; Carlson, E.R.; Marshall, N.H.; Marwil, E.S.; Tolli, J.E. )

    1990-02-01

    An independent quality assurance (QA) and verification of Version 1.5 of the MELCOR Accident Consequence Code System (MACCS) was performed. The QA and verification involved examination of the code and associated documentation for consistent and correct implementation of the models in an error-free FORTRAN computer code. The QA and verification was not intended to determine either the adequacy or appropriateness of the models that are used in MACCS 1.5. The reviews uncovered errors which were fixed by the SNL MACCS code development staff prior to the release of MACCS 1.5. Some difficulties related to documentation improvement and code restructuring are also presented. The QA and verification process concluded that Version 1.5 of the MACCS code, within the scope and limitations of the models implemented in the code, is essentially error free and ready for widespread use. 15 refs., 11 tabs.

  7. Dress Codes for Teachers?

    ERIC Educational Resources Information Center

    Million, June

    2004-01-01

    In this article, the author discusses an e-mail survey of principals from across the country regarding whether or not their school had a formal staff dress code. The results indicate that most did not have a formal dress code, but agreed that professional dress for teachers was not only necessary, but showed respect for the school and had a…

  8. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts. 3. Development of a code generator for performance prediction. 4. Automated partitioning. 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  9. Lichenase and coding sequences

    SciTech Connect

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2000-08-15

    The present invention provides a fungal lichenase, i.e., an endo-1,3-1,4-β-D-glucanohydrolase, its coding sequence, recombinant DNA molecules comprising the lichenase coding sequences, recombinant host cells and methods for producing same. The present lichenase is from Orpinomyces PC-2.

  10. Insurance billing and coding.

    PubMed

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, common dental terminology (CDT) codes are most commonly used by dentists to submit claims, whereas current procedural terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD.9.CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD.9.CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD.9.CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms. PMID:18501731

  11. Code blue: seizures.

    PubMed

    Hoerth, Matthew T; Drazkowski, Joseph F; Noe, Katherine H; Sirven, Joseph I

    2011-06-01

    Eyewitnesses frequently perceive seizures as life threatening. If an event occurs on the hospital premises, a "code blue" can be called which consumes considerable resources. The purpose of this study was to determine the frequency and characteristics of code blue calls for seizures and seizure mimickers. A retrospective review of a code blue log from 2001 through 2008 identified 50 seizure-like events, representing 5.3% of all codes. Twenty-eight (56%) occurred in inpatients; the other 22 (44%) events involved visitors or employees on the hospital premises. Eighty-six percent of the events were epileptic seizures. Seizure mimickers, particularly psychogenic nonepileptic seizures, were more common in the nonhospitalized group. Only five (17.9%) inpatients had a known diagnosis of epilepsy, compared with 17 (77.3%) of the nonhospitalized patients. This retrospective survey provides insights into how code blues are called on hospitalized versus nonhospitalized patients for seizure-like events. PMID:21546315

  12. Combustion chamber analysis code

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.

    1993-01-01

    A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.

  13. Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; DuPrie, K.; Berriman, B.; Hanisch, R. J.; Mink, J.; Teuben, P. J.

    2013-10-01

    The Astrophysics Source Code Library (ASCL), founded in 1999, is a free on-line registry for source codes of interest to astronomers and astrophysicists. The library is housed on the discussion forum for Astronomy Picture of the Day (APOD) and can be accessed at http://ascl.net. The ASCL has a comprehensive listing that covers a significant number of the astrophysics source codes used to generate results published in or submitted to refereed journals and continues to grow. The ASCL currently has entries for over 500 codes; its records are citable and are indexed by ADS. The editors of the ASCL and members of its Advisory Committee were on hand at a demonstration table in the ADASS poster room to present the ASCL, accept code submissions, show how the ASCL is starting to be used by the astrophysics community, and take questions on and suggestions for improving the resource.

  14. Narrative compression coding for a channel with errors

    NASA Astrophysics Data System (ADS)

    Bond, James W.

    1988-12-01

    Data compression codes offer the possibility of improving the throughput of existing communication systems in the near term. This study was undertaken to determine if data compression codes could be utilized to provide message compression in a channel with up to a 0.10 bit error rate. The data compression capabilities of codes were investigated by estimating the average number of bits-per-character required to transmit narrative files. The performance of the codes in a channel with errors (a noisy channel) was investigated in terms of the average number of characters decoded in error per bit error and of characters printed in error per bit error. Results were obtained by encoding four narrative files, which were resident on an IBM PC and used a 58-character set. The study focused on Huffman codes and suffix/prefix comma-free codes. Other data compression codes, in particular, block codes and some simple variants of block codes, are briefly discussed to place the study results in context. Comma-free codes were found to have the most promising data compression because error propagation due to bit errors is limited to a few characters for these codes. A technique was found to identify a suffix/prefix comma-free code giving nearly the same data compression as a Huffman code with much less error propagation than the Huffman codes.
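
    The error-propagation effect that favors comma-free codes is easy to demonstrate: in the sketch below (illustrative text, not the study's narrative files), a single flipped bit desynchronizes a Huffman decoder, and positional comparison counts the damage in the spirit of "characters printed in error". A suffix/prefix comma-free decoder, not implemented here, would bound the damage to a few characters.

        import heapq
        from collections import Counter

        def huffman_code(freqs):
            """Build {symbol: bitstring}; heap entries carry a tiebreak id."""
            heap = [(f, i, s) for i, (s, f) in enumerate(freqs.items())]
            heapq.heapify(heap)
            uid = len(heap)
            while len(heap) > 1:
                f1, _, left = heapq.heappop(heap)
                f2, _, right = heapq.heappop(heap)
                heapq.heappush(heap, (f1 + f2, uid, (left, right)))
                uid += 1
            code = {}
            def walk(node, prefix):
                if isinstance(node, tuple):
                    walk(node[0], prefix + '0')
                    walk(node[1], prefix + '1')
                else:
                    code[node] = prefix or '0'
            walk(heap[0][2], '')
            return code

        def decode(bits, code):
            """Greedy prefix decoding (Huffman codes are prefix-free)."""
            inv, out, cur = {v: k for k, v in code.items()}, [], ''
            for b in bits:
                cur += b
                if cur in inv:
                    out.append(inv[cur])
                    cur = ''
            return ''.join(out)

        text = "narrative files have a highly skewed character distribution " * 4
        code = huffman_code(Counter(text))
        bits = ''.join(code[c] for c in text)

        # Flip one bit mid-stream: the decoder can lose codeword synchronization
        # and garble several characters before locking back on by chance.
        k = len(bits) // 2
        corrupted = bits[:k] + ('1' if bits[k] == '0' else '0') + bits[k + 1:]
        garbled = decode(corrupted, code)
        errors = sum(a != b for a, b in zip(garbled, text)) \
                 + abs(len(garbled) - len(text))
        print(f"1 bit error -> {errors} character(s) printed in error")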

  15. Performance of concatenated Reed-Solomon trellis-coded modulation over Rician fading channels

    NASA Technical Reports Server (NTRS)

    Moher, Michael L.; Lodge, John H.

    1990-01-01

    A concatenated coding scheme for providing very reliable data over mobile-satellite channels at power levels similar to those used for vocoded speech is described. The outer code is a shortened Reed-Solomon code which provides error detection as well as error correction capabilities. The inner code is a 1-D 8-state trellis code applied independently to both the inphase and quadrature channels. To achieve the full error correction potential of this inner code, the code symbols are multiplexed with a pilot sequence which is used to provide dynamic channel estimation and coherent detection. The implementation structure of this scheme is discussed and its performance is estimated.

  16. Character coding of secondary chemical variation for use in phylogenetic analyses.

    PubMed

    Barkman

    2001-01-01

    A coding procedure is presented for secondary chemical data whereby putative biogenetic pathways are coded as phylogenetic characters with enzymatic conversions between compounds representing the corresponding character states. A character state tree or stepmatrix allows direct representation of the secondary chemical biogenetic pathway and avoids problems of non-independence associated with coding schemes that score presence/absence of individual compounds. Stepmatrices are the most biosynthetically realistic character definitions because individual and population level polymorphisms can be scored, reticulate enzymatic conversions within pathways may be represented, and down-weighting of pathway loss versus gain is possible. The stepmatrix approach unifies analyses of secondary chemicals, allozymes, and developmental characters because the biological unity of the pathway, locus, or character ontogeny is preserved. Empirical investigation of the stepmatrix and character state tree coding methods using floral fragrance data in Cypripedium (Orchidaceae) resulted in cladistic relationships which were largely congruent with those suggested from recent DNA and allozyme studies. This character coding methodology provides an effective means for including secondary compound data in total evidence studies. Furthermore, ancestral state reconstructions provide a phylogenetic context within which biochemical pathway evolution may be studied. PMID:11068120
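
    A stepmatrix of this kind is straightforward to represent; the sketch below uses hypothetical compounds and costs, with pathway loss deliberately cheaper than gain, and derives the full all-pairs stepmatrix from the single-step enzymatic conversions.

        # Hypothetical pathway: absent -> precursor -> compound_A -> compound_B.
        # Each entry is the cost of one character-state change; loss is
        # down-weighted relative to gain, and conversions may be reticulate.
        INF = float('inf')
        states = ['absent', 'precursor', 'compound_A', 'compound_B']
        step = {
            ('absent', 'precursor'): 2,       # pathway gain (up-weighted)
            ('precursor', 'absent'): 1,       # pathway loss (down-weighted)
            ('precursor', 'compound_A'): 1,   # enzymatic conversion
            ('compound_A', 'precursor'): 1,   # reverse conversion allowed
            ('compound_A', 'compound_B'): 1,
            ('compound_B', 'compound_A'): 1,
        }

        def one_step_cost(a, b):
            return 0 if a == b else step.get((a, b), INF)

        # Full stepmatrix for a parsimony analysis = all-pairs minimum
        # conversion cost (Floyd-Warshall over the conversion graph).
        n = len(states)
        M = [[one_step_cost(a, b) for b in states] for a in states]
        for k in range(n):
            for i in range(n):
                for j in range(n):
                    M[i][j] = min(M[i][j], M[i][k] + M[k][j])

        print(M[states.index('absent')][states.index('compound_B')])  # 2+1+1 = 4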

  17. A distributed code for color in natural scenes derived from center-surround filtered cone signals

    PubMed Central

    Kellner, Christian J.; Wachtler, Thomas

    2013-01-01

    In the retina of trichromatic primates, chromatic information is encoded in an opponent fashion and transmitted to the lateral geniculate nucleus (LGN) and visual cortex via parallel pathways. Chromatic selectivities of neurons in the LGN form two separate clusters, corresponding to two classes of cone opponency. In the visual cortex, however, the chromatic selectivities are more distributed, which is in accordance with a population code for color. Previous studies of cone signals in natural scenes typically found opponent codes with chromatic selectivities corresponding to two directions in color space. Here we investigated how the non-linear spatio-chromatic filtering in the retina influences the encoding of color signals. Cone signals were derived from hyper-spectral images of natural scenes and preprocessed by center-surround filtering and rectification, resulting in parallel ON and OFF channels. Independent Component Analysis (ICA) on these signals yielded a highly sparse code with basis functions that showed spatio-chromatic selectivities. In contrast to previous analyses of linear transformations of cone signals, chromatic selectivities were not restricted to two main chromatic axes, but were more continuously distributed in color space, similar to the population code of color in the early visual cortex. Our results indicate that spatio-chromatic processing in the retina leads to a more distributed and more efficient code for natural scenes. PMID:24098289
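
    The preprocessing-plus-ICA pipeline can be sketched with scikit-learn. Here random data stands in for the cone signals and a plain half-wave rectification replaces the paper's center-surround filtering, so the learned basis is meaningless on this input; with real hyperspectral patches, the columns of the mixing matrix are the spatio-chromatic basis functions the paper analyzes.

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)

        # Stand-in for cone responses: (n_patches, 3 cone types * n_pixels).
        X = rng.standard_normal((5000, 3 * 64))

        # Stand-in preprocessing: mean-subtract and rectify into parallel
        # ON and OFF channels (the paper applies center-surround filters first).
        Xc = X - X.mean(axis=0)
        on, off = np.maximum(Xc, 0), np.maximum(-Xc, 0)
        Z = np.hstack([on, off])

        ica = FastICA(n_components=32, random_state=0, max_iter=500)
        S = ica.fit_transform(Z)      # sparse source activations
        basis = ica.mixing_           # columns: spatio-chromatic basis functions
        print(S.shape, basis.shape)   # (5000, 32), (384, 32)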

  18. Distributed transform coding via source-splitting

    NASA Astrophysics Data System (ADS)

    Yahampath, Pradeepa

    2012-12-01

    Transform coding (TC) is one of the best known practical methods for quantizing high-dimensional vectors. In this article, a practical approach to distributed TC of jointly Gaussian vectors is presented. This approach, referred to as source-split distributed transform coding (SP-DTC), can be used to easily implement two terminal transform codes for any given rate-pair. The main idea is to apply source-splitting using orthogonal-transforms, so that only Wyner-Ziv (WZ) quantizers are required for compression of transform coefficients. This approach however requires optimizing the bit allocation among dependent sets of WZ quantizers. In order to solve this problem, a low-complexity tree-search algorithm based on analytical models for transform coefficient quantization is developed. A rate-distortion (RD) analysis of SP-DTCs for jointly Gaussian sources is presented, which indicates that these codes can significantly outperform the practical alternative of independent TC of each source, whenever there is a strong correlation between the sources. For practical implementation of SP-DTCs, the idea of using conditional entropy constrained (CEC) quantizers followed by Slepian-Wolf coding is explored. Experimental results obtained with SP-DTC designs based on both CEC scalar quantizers and CEC trellis-coded quantizers demonstrate that actual implementations of SP-DTCs can achieve RD performance close to the analytically predicted limits.

  19. Value of Laboratory Experiments for Code Validations

    SciTech Connect

    Wawersik, W.R.

    1998-12-14

    Numerical codes have become indispensable for designing underground structures and interpreting the behavior of geologic systems. Because of the complexities of geologic systems, however, code calculations often are associated with large quantitative uncertainties. This paper presents three examples to demonstrate the value of laboratory (or bench-scale) experiments in evaluating the predictive capabilities of such codes, with five major conclusions: Laboratory or bench-scale experiments are a very cost-effective, controlled means of evaluating and validating numerical codes, not instead of but before or at least concurrent with the implementation of in situ studies. The design of good laboratory validation tests must identify what aspects of a code are to be scrutinized in order to optimize the size, geometry, boundary conditions, and duration of the experiments. The design of good laboratory validation tests may itself require difficult numerical analyses and sensitivity studies. Good validation experiments will generate independent data sets that identify the combined effect of constitutive models, model generalizations, material parameters, and numerical algorithms. Successful validations of numerical codes mandate a close collaboration between experimentalists and analysts, drawing from the full gamut of observations, measurements, and mathematical results.

  20. A robust low-rate coding scheme for packet video

    NASA Technical Reports Server (NTRS)

    Chen, Y. C.; Sayood, Khalid; Nelson, D. J.; Arikan, E. (Editor)

    1991-01-01

    Due to the rapidly evolving field of image processing and networking, video information promises to be an important part of telecommunication systems. Although up to now video transmission has been transported mainly over circuit-switched networks, it is likely that packet-switched networks will dominate the communication world in the near future. Asynchronous transfer mode (ATM) techniques in broadband-ISDN can provide a flexible, independent and high performance environment for video communication. For this paper, the network simulator was used only as a channel. Mixture block coding with progressive transmission (MBCPT) has been investigated for use over packet networks and has been found to provide a high compression rate with good visual performance, robustness to packet loss, tractable integration with network mechanics and simplicity in parallel implementation.

  1. Pyramid image codes

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1990-01-01

    All vision systems, both human and machine, transform the spatial image into a coded representation. Particular codes may be optimized for efficiency or to extract useful image features. Researchers explored image codes based on primary visual cortex in man and other primates. Understanding these codes will advance the art in image coding, autonomous vision, and computational human factors. In cortex, imagery is coded by features that vary in size, orientation, and position. Researchers have devised a mathematical model of this transformation, called the Hexagonal oriented Orthogonal quadrature Pyramid (HOP). In a pyramid code, features are segregated by size into layers, with fewer features in the layers devoted to large features. Pyramid schemes provide scale invariance, and are useful for coarse-to-fine searching and for progressive transmission of images. The HOP Pyramid is novel in three respects: (1) it uses a hexagonal pixel lattice, (2) it uses oriented features, and (3) it accurately models most of the prominent aspects of primary visual cortex. The transform uses seven basic features (kernels), which may be regarded as three oriented edges, three oriented bars, and one non-oriented blob. Application of these kernels to non-overlapping seven-pixel neighborhoods yields six oriented, high-pass pyramid layers, and one low-pass (blob) layer.

  2. Report number codes

    SciTech Connect

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  3. Local intensity adaptive image coding

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.

    1989-01-01

    The objective of preprocessing for machine vision is to extract intrinsic target properties. The most important properties ordinarily are structure and reflectance. Illumination in space, however, is a significant problem as the extreme range of light intensity, stretching from deep shadow to highly reflective surfaces in direct sunlight, impairs the effectiveness of standard approaches to machine vision. To overcome this critical constraint, an image coding scheme is being investigated which combines local intensity adaptivity, image enhancement, and data compression. It is very effective under the highly variant illumination that can exist within a single frame or field of view, and it is very robust to noise at low illuminations. Some of the theory and salient features of the coding scheme are reviewed, its performance is characterized in a simulated space application, and the related research and development activities are described.

  4. Marketing Handbook for Independent Schools.

    ERIC Educational Resources Information Center

    Boarding Schools, Boston, MA.

    This publication is a resource to help independent schools attract more families to their institutions and to increase the voluntary support by the larger community surrounding the school. The first chapter attempts to dispel misconceptions, define pertinent marketing terms, and relate their importance to independent schools. The rest of the book…

  5. Independent Learning Models: A Comparison.

    ERIC Educational Resources Information Center

    Wickett, R. E. Y.

    Five models of independent learning are suitable for use in adult education programs. The common factor is a facilitator who works in some way with the student in the learning process. They display different characteristics, including the extent of independence in relation to content and/or process. Nondirective tutorial instruction and learning…

  6. Honor Codes and Other Contextual Influences on Academic Integrity: A Replication and Extension to Modified Honor Code Settings.

    ERIC Educational Resources Information Center

    McCabe, Donald L.; Trevino, Linda Klebe; Butterfield, Kenneth D.

    2002-01-01

    Investigated the influence of modified honor codes, an alternative to traditional codes that is gaining popularity on larger campuses. Also tested the model of student academic dishonesty previously suggested by McCabe and Trevino. Found that modified honor codes are associated with lower levels of student dishonesty and that the McCabe Trevino…

  7. Applications of numerical codes to space plasma problems

    NASA Technical Reports Server (NTRS)

    Northrop, T. G.; Birmingham, T. J.; Jones, F. C.; Wu, C. S.

    1975-01-01

    Solar wind, earth's bowshock, and magnetospheric convection and substorms were investigated. Topics discussed include computational physics, multifluid codes, ionospheric irregularities, and modeling laser plasmas.

  8. Parallel CARLOS-3D code development

    SciTech Connect

    Putnam, J.M.; Kotulski, J.D.

    1996-02-01

    CARLOS-3D is a three-dimensional scattering code which was developed under the sponsorship of the Electromagnetic Code Consortium, and is currently used by over 80 aerospace companies and government agencies. The code has been extensively validated and runs on both serial workstations and parallel supercomputers such as the Intel Paragon. CARLOS-3D is a three-dimensional surface integral equation scattering code based on a Galerkin method of moments formulation employing Rao-Wilton-Glisson roof-top basis functions for triangular faceted surfaces. Fully arbitrary 3D geometries composed of multiple conducting and homogeneous bulk dielectric materials can be modeled. This presentation describes some of the extensions to the CARLOS-3D code, and how the operator structure of the code facilitated these improvements. Body of revolution (BOR) and two-dimensional geometries were incorporated by simply including new input routines, and the appropriate Galerkin matrix operator routines. Some additional modifications were required in the combined field integral equation matrix generation routine due to the symmetric nature of the BOR and 2D operators. Quadrilateral patched surfaces with linear roof-top basis functions were also implemented in the same manner. Quadrilateral facets and triangular facets can be used in combination to more efficiently model geometries with both large smooth surfaces and surfaces with fine detail such as gaps and cracks. Since the parallel implementation in CARLOS-3D is at a high level, these changes were independent of the computer platform being used. This approach minimizes code maintenance, while providing capabilities with little additional effort. Results are presented showing the performance and accuracy of the code for some large scattering problems. Comparisons between triangular faceted and quadrilateral faceted geometry representations will be shown for some complex scatterers.

  9. Strong independent association between obesity and essential hypertension.

    PubMed

    Movahed, M R; Lee, J Z; Lim, W Y; Hashemzadeh, M; Hashemzadeh, M

    2016-06-01

    Obesity and hypertension (HTN) are major risk factors for cardiovascular disease. The association between obesity and HTN has not been studied in large populations following adjustment for comorbidities. The goal of this study was to evaluate any association between obesity and HTN after adjusting for baseline characteristics. We used ICD-9 codes for obesity and HTN from the Nationwide Inpatient Sample (NIS) databases. Two randomly selected years, 1992 and 2002, were chosen from the databases as two independent samples. We used uni- and multivariable analysis to study any correlation between obesity and HTN. The 1992 database contained a total of 6,195,744 patients. HTN was present in 37.2% of patients with obesity versus 12% of the control group (OR: 4.36, CI 4.30-4.42, P < 0.001). The 2002 database contained a total of 7,153,982 patients. HTN was present in 50.7% of patients with obesity versus 25.6% of the control group (OR: 2.98, CI 2.96-3.00, P < 0.001). Using multivariable analysis adjusting for gender, hyperlipidaemia, age, smoking, type 2 diabetes and chronic renal failure, obesity remained correlated with HTN in both years (1992: OR 2.69, CI 2.67-2.72, P < 0.001; 2002: OR 2.98, CI 2.96-3.00, P < 0.001). The presence of obesity was found to be strongly and independently associated with HTN. The cause of this correlation is not known, warranting further investigation. PMID:27166134
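
    The unadjusted odds ratios can be recovered from the quoted prevalences alone; the small discrepancies against the reported values reflect rounding of the percentages.

        # OR = [p1/(1-p1)] / [p0/(1-p0)] for prevalence p1 in the obese group
        # and p0 in the control group.
        def odds_ratio(p1, p0):
            return (p1 / (1 - p1)) / (p0 / (1 - p0))

        print(round(odds_ratio(0.372, 0.120), 2))   # 1992: 4.34 (reported 4.36)
        print(round(odds_ratio(0.507, 0.256), 2))   # 2002: 2.99 (reported 2.98)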

  10. Ideal Binocular Disparity Detectors Learned Using Independent Subspace Analysis on Binocular Natural Image Pairs

    PubMed Central

    Hunter, David W.; Hibbard, Paul B.

    2016-01-01

    An influential theory of mammalian vision, known as the efficient coding hypothesis, holds that early stages in the visual cortex attempt to form an efficient coding of ecologically valid stimuli. Although numerous authors have successfully modelled some aspects of early vision mathematically, closer inspection has found substantial discrepancies between the predictions of some of these models and observations of neurons in the visual cortex. In particular, analysis of linear-non-linear models of simple cells using Independent Component Analysis has found a strong bias towards features on the horopter. In order to investigate the link between the information content of binocular images, mathematical models of complex cells and physiological recordings, we applied Independent Subspace Analysis to binocular image patches in order to learn a set of complex-cell-like models. We found that these complex-cell-like models exhibited a wide range of binocular disparity-discriminability, although only a minority exhibited high binocular discrimination scores. However, in common with the linear-non-linear model case we found that feature detection was limited to the horopter, suggesting that current mathematical models are limited in their ability to explain the functionality of the visual cortex. PMID:26982184

  11. Trellis coding with multidimensional QAM signal sets

    NASA Technical Reports Server (NTRS)

    Pietrobon, Steven S.; Costello, Daniel J.

    1993-01-01

    Trellis coding using multidimensional QAM signal sets is investigated. Finite-size 2D signal sets are presented that have minimum average energy, are 90-deg rotationally symmetric, and have from 16 to 1024 points. The best trellis codes using the finite 16-QAM signal set with two, four, six, and eight dimensions are found by computer search (the multidimensional signal set is constructed from the 2D signal set). The best moderate complexity trellis codes for infinite lattices with two, four, six, and eight dimensions are also found. The minimum free squared Euclidean distance and number of nearest neighbors for these codes were used as the selection criteria. Many of the multidimensional codes are fully rotationally invariant and give asymptotic coding gains up to 6.0 dB. From the infinite lattice codes, the best codes for transmitting J, J + 1/4, J + 1/3, J + 1/2, J + 2/3, and J + 3/4 bit/sym (J an integer) are presented.
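
    Both selection criteria used in the search, minimum squared Euclidean distance and nearest-neighbor multiplicity, are direct computations on the constellation; a numpy sketch for the ordinary 2D 16-QAM set (odd-integer lattice scaling assumed):

        import numpy as np
        from itertools import product

        # Square 16-QAM on the odd-integer lattice (zero mean by symmetry).
        pts = np.array([complex(x, y)
                        for x, y in product([-3, -1, 1, 3], repeat=2)])

        d2 = np.abs(pts[:, None] - pts[None, :]) ** 2
        np.fill_diagonal(d2, np.inf)
        dmin2 = d2.min()                                  # 4 at this scaling
        n_nearest = np.isclose(d2, dmin2).sum(axis=1).mean()

        print(f"min squared Euclidean distance = {dmin2:.0f}")
        print(f"average nearest neighbors per point = {n_nearest:.2f}")  # 3.00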

  12. INVESTIGATION OF FISCALLY INDEPENDENT AND DEPENDENT CITY SCHOOL DISTRICTS.

    ERIC Educational Resources Information Center

    GITTELL, MARILYN; AND OTHERS

    A two-part comparative analysis is made of large and small city school systems. Part I analyzes a wide range of fiscal and non-fiscal variables associated with the fiscal status of city school systems. It covers the 2,788 city school districts in the United States with enrollments over 3,000. Complex interrelationships surrounding fiscal status in…

  13. FORTRAN code-evaluation system

    NASA Technical Reports Server (NTRS)

    Capps, J. D.; Kleir, R.

    1977-01-01

    An automated code-evaluation system can be used to detect coding errors and unsound coding practices in any ANSI FORTRAN IV source code before they can cause execution-time malfunctions. The system concentrates on FORTRAN code features that, while acceptable to the compiler, are likely to produce undesirable results.

  14. Compressible Astrophysics Simulation Code

    Energy Science and Technology Software Center (ESTSC)

    2007-07-18

    This is an astrophysics simulation code involving a radiation diffusion module developed at LLNL coupled to compressible hydrodynamics and adaptive mesh infrastructure developed at LBNL. One intended application is to neutrino diffusion in core collapse supernovae.

  15. Multi-level bandwidth efficient block modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    1989-01-01

    The multilevel technique is investigated for combining block coding and modulation. There are four parts. In the first part, a formulation is presented for signal sets on which modulation codes are to be constructed. Distance measures on a signal set are defined and their properties are developed. In the second part, a general formulation is presented for multilevel modulation codes in terms of component codes with appropriate Euclidean distances. The distance properties, Euclidean weight distribution and linear structure of multilevel modulation codes are investigated. In the third part, several specific methods for constructing multilevel block modulation codes with interdependency among component codes are proposed. Given a multilevel block modulation code C with no interdependency among the binary component codes, the proposed methods give a multilevel block modulation code C' which has the same rate as C, a minimum squared Euclidean distance not less than that of C, a trellis diagram with the same number of states as that of C, and a smaller number of nearest-neighbor codewords than that of C. In the last part, error performance of block modulation codes is analyzed for an AWGN channel based on soft-decision maximum likelihood decoding. Error probabilities of some specific codes are evaluated based on their Euclidean weight distributions and simulation results.

  16. Fire investigation

    NASA Astrophysics Data System (ADS)

    Gomberg, A.

    There was considerable progress made on several fronts of fire investigation in the United States in recent years. Progress was made in increasing the quantity of fire investigation and reporting, through efforts to develop the National Fire Incident Reporting System. Improving overall quality of fire investigation is the objective of efforts such as the Fire Investigation Handbook, which was developed and published by the National Bureau of Standards, and the upgrading and expanding of the "dictionary" of fire investigation and reporting, the NFPA 901, Uniform Coding for Fire Protection, system. The science of fire investigation was furthered also by new approaches to post-fire interviews being developed at the University of Washington, and by in-depth research into factors involved in several large-loss fires, including the MGM Grand Hotel in Las Vegas. Finally, the use of special study fire investigations - in-depth investigations concentrating on specific fire problems - is producing new glimpses into the nature of the national fire problem. A brief description of the status of efforts in each of these areas is given.

  17. ORECA CODE ASSESSMENT.

    SciTech Connect

    KROEGER,P.G.

    1980-07-01

    Results of an assessment of the ORECA code are presented. In particular, it was found that in the case of loss of forced flow circulation the predicted peak core temperatures are very sensitive to the mean gas temperatures used in the evaluation of the pressure drop terms. Some potential shortcomings of the conduction algorithm for some specific applications are discussed. The results of these efforts have been taken into consideration in the current version of the ORECA code.

  18. Seals Flow Code Development

    NASA Technical Reports Server (NTRS)

    1991-01-01

    In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.

  19. Knowledge and Performance about Nursing Ethic Codes from Nurses' and Patients' Perspective in Tabriz Teaching Hospitals, Iran

    PubMed Central

    Mohajjel-Aghdam, Alireza; Hassankhani, Hadi; Zamanzadeh, Vahid; Khameneh, Saied; Moghaddam, Sara

    2013-01-01

    Introduction: The nursing profession requires knowledge of ethics to guide performance. The nature of this profession necessitates ethical care more than routine care. Today, professional ethic codes are defined worldwide based on human and ethical issues in the communication between nurse and patient. To improve all dimensions of nursing, we need to respect ethic codes. The aim of this study is to assess knowledge and performance about nursing ethic codes from nurses' and patients' perspectives. Methods: A descriptive study was conducted on 345 nurses and 500 inpatients in six teaching hospitals of Tabriz in 2012. To investigate nurses' knowledge and performance, data were collected using structured questionnaires. Statistical analysis was done using descriptive and analytic statistics, independent t-test, ANOVA and Pearson correlation coefficient, in SPSS 13. Results: Most of the nurses were female, married and educated at the BS degree level; 86.4% of them were aware of ethic codes; 91.9% of nurses and 41.8% of patients reported that nurses respect ethic codes. Nurses' and patients' perspectives about ethic codes differed significantly. A significant relationship was found between nurses' knowledge of ethic codes and job satisfaction and complaints of ethical performance. Conclusion: According to the results, attention to teaching ethic codes in the nursing curriculum for students and continuous education for staff is proposed. On the other hand, recognizing failures of the health system, optimizing nursing care, attempting to inform patients about nursing ethic codes, promoting patient rights and achieving patient satisfaction can minimize the differences between the two perspectives. PMID:25276730

  20. Population coding of affect across stimuli, modalities and individuals

    PubMed Central

    Chikazoe, Junichi; Lee, Daniel H.; Kriegeskorte, Nikolaus; Anderson, Adam K.

    2014-01-01

    It remains unclear how the brain represents external objective sensory events alongside our internal subjective impressions of them—affect. Representational mapping of population level activity evoked by complex scenes and basic tastes uncovered a neural code supporting a continuous axis of pleasant-to-unpleasant valence. This valence code was distinct from low-level physical and high-level object properties. While ventral temporal and anterior insular cortices supported valence codes specific to vision and taste, both the medial and lateral orbitofrontal cortices (OFC) maintained a valence code independent of sensory origin. Further, only the OFC code could classify experienced affect across participants. The entire valence spectrum is represented as a collective pattern in regional neural activity as sensory-specific and abstract codes, whereby the subjective quality of affect can be objectively quantified across stimuli, modalities, and people. PMID:24952643

  1. Robust Nonlinear Neural Codes

    NASA Astrophysics Data System (ADS)

    Yang, Qianli; Pitkow, Xaq

    2015-03-01

    Most interesting natural sensory stimuli are encoded in the brain in a form that can only be decoded nonlinearly. But despite being a core function of the brain, nonlinear population codes are rarely studied and poorly understood. Interestingly, the few existing models of nonlinear codes are inconsistent with known architectural features of the brain. In particular, these codes have information content that scales with the size of the cortical population, even if that violates the data processing inequality by exceeding the amount of information entering the sensory system. Here we provide a valid theory of nonlinear population codes by generalizing recent work on information-limiting correlations in linear population codes. Although these generalized, nonlinear information-limiting correlations bound the performance of any decoder, they also make decoding more robust to suboptimal computation, allowing many suboptimal decoders to achieve nearly the same efficiency as an optimal decoder. Although these correlations are extremely difficult to measure directly, particularly for nonlinear codes, we provide a simple, practical test by which one can use choice-related activity in small populations of neurons to determine whether decoding is suboptimal or optimal and limited by correlated noise. We conclude by describing an example computation in the vestibular system where this theory applies. QY and XP were supported by a grant from the McNair Foundation.

  2. Sequence independent amplification of DNA

    DOEpatents

    Bohlander, Stefan K.

    1998-01-01

    The present invention is a rapid sequence-independent amplification procedure (SIA). Even minute amounts of DNA from various sources can be amplified independent of any sequence requirements of the DNA or any a priori knowledge of any sequence characteristics of the DNA to be amplified. This method allows, for example, the sequence independent amplification of microdissected chromosomal material and the reliable construction of high quality fluorescent in situ hybridization (FISH) probes from YACs or from other sources. These probes can be used to localize YACs on metaphase chromosomes but also--with high efficiency--in interphase nuclei.

  3. Sequence independent amplification of DNA

    DOEpatents

    Bohlander, S.K.

    1998-03-24

    The present invention is a rapid sequence-independent amplification procedure (SIA). Even minute amounts of DNA from various sources can be amplified independent of any sequence requirements of the DNA or any a priori knowledge of any sequence characteristics of the DNA to be amplified. This method allows, for example, the sequence independent amplification of microdissected chromosomal material and the reliable construction of high quality fluorescent in situ hybridization (FISH) probes from YACs or from other sources. These probes can be used to localize YACs on metaphase chromosomes but also--with high efficiency--in interphase nuclei. 25 figs.

  4. Biographical factors of occupational independence.

    PubMed

    Müller, G F

    2001-10-01

    The present study examined biographical factors of occupational independence, covering any kind of non-employed profession. Participants were 59 occupationally independent and 58 employed persons of different age (M = 36.3 yr.), sex, and profession. They were interviewed on variables like family influence, educational background, occupational role models, and critical events for choosing a particular type of occupational career. The obtained results show that occupationally independent people reported stronger family ties, experienced fewer restrictions of formal education, and remembered fewer negative role models than the employed people. Implications of these results are discussed. PMID:11783553

  5. Coded aperture compressive temporal imaging.

    PubMed

    Llull, Patrick; Liao, Xuejun; Yuan, Xin; Yang, Jianbo; Kittle, David; Carin, Lawrence; Sapiro, Guillermo; Brady, David J

    2013-05-01

    We use mechanical translation of a coded aperture for code division multiple access compression of video. We discuss the compressed video's temporal resolution and present experimental results for reconstructions of > 10 frames of temporal data per coded snapshot. PMID:23669910
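
    The measurement model behind this approach compresses many frames into one snapshot by modulating each frame with a shifted copy of a single binary mask; a toy forward model follows (sizes and the 50% mask duty cycle are illustrative assumptions, and the inverse reconstruction solved in the paper is omitted).

        import numpy as np

        rng = np.random.default_rng(0)

        T, H, W = 8, 64, 64
        video = rng.random((T, H, W))            # scene frames (unknown in practice)
        mask = (rng.random((H, W)) < 0.5).astype(float)

        # Mechanical translation of the coded aperture: shift one row per frame.
        codes = np.stack([np.roll(mask, t, axis=0) for t in range(T)])
        snapshot = (codes * video).sum(axis=0)   # one measurement for T frames

        print(snapshot.shape)  # (64, 64): T-fold temporal compression; recovering
        # the T frames from this single snapshot is the paper's inverse problem.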

  6. Parental Beliefs about Emotions Are Associated with Early Adolescents' Independent and Interdependent Self-Construals

    ERIC Educational Resources Information Center

    Her, Pa; Dunsmore, Julie C.

    2011-01-01

    We assessed linkages between parents' beliefs and their children's self-construals with 60 7th and 8th graders. Early adolescents completed an open-ended, Self-Guide Questionnaire and an independent and interdependent reaction-time measure. The self-guide responses were coded for independent and interdependent traits. Parents reported beliefs…

  7. Reversibility and efficiency in coding protein information.

    PubMed

    Tamir, Boaz; Priel, Avner

    2010-12-21

    Why does the genetic code have a fixed length? Protein information is transferred by coding each amino acid using codons whose length equals 3 for all amino acids. Hence the most probable and the least probable amino acid get a codeword of equal length. Moreover, the distributions of amino acids found in nature are not uniform and therefore the efficiency of such codes is sub-optimal. The origins of these apparently non-efficient codes are yet unclear. In this paper we propose an a priori argument for the energy efficiency of such codes resulting from their reversibility, in contrast to their time inefficiency. Such codes are reversible in the sense that a primitive processor, reading three letters in each step, can always reverse its operation, undoing its process. We examine the codes for the distributions of amino acids that exist in nature and show that they could not be both time efficient and reversible. We investigate a family of Zipf-type distributions and present their efficient (non-fixed length) prefix code, their graphs, and the condition for their reversibility. We prove that for a large family of such distributions, if the code is time efficient, it could not be reversible. In other words, if pre-biotic processes demand reversibility, the protein code could not be time efficient. The benefits of reversibility are clear: reversible processes are adiabatic, namely, they dissipate a very small amount of energy. Such processes must be done slowly enough; therefore time efficiency is unimportant. It is reasonable to assume that early biochemical complexes were more prone towards energy efficiency, where forward and backward processes were almost symmetrical. PMID:20868696
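
    The time-inefficiency claim can be made concrete by comparing the fixed 3-base codeword length against the entropy lower bound for a skewed amino-acid distribution; the Zipf exponent below is a hypothetical member of the family the paper analyzes.

        import math

        # Zipf-type distribution over the 20 amino acids (exponent assumed).
        n, s = 20, 1.0
        w = [1 / k**s for k in range(1, n + 1)]
        p = [x / sum(w) for x in w]

        # Entropy in base-4 digits: the lower bound on average codeword length
        # over the 4-letter nucleotide alphabet for a variable-length prefix
        # code. The fixed-length genetic code spends 3 bases on every amino
        # acid regardless of its frequency.
        H4 = -sum(q * math.log(q, 4) for q in p)
        print(f"entropy bound ~ {H4:.2f} bases/amino acid vs fixed length 3")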

  8. Spherical hashing: binary code embedding with hyperspheres.

    PubMed

    Heo, Jae-Pil; Lee, Youngwoon; He, Junfeng; Chang, Shih-Fu; Yoon, Sung-Eui

    2015-11-01

    Many binary code embedding schemes have been actively studied recently, since they can provide efficient similarity search, and compact data representations suitable for handling large scale image databases. Existing binary code embedding techniques encode high-dimensional data by using hyperplane-based hashing functions. In this paper we propose a novel hypersphere-based hashing function, spherical hashing, to map more spatially coherent data points into a binary code compared to hyperplane-based hashing functions. We also propose a new binary code distance function, spherical Hamming distance, tailored for our hypersphere-based binary coding scheme, and design an efficient iterative optimization process to achieve both balanced partitioning for each hash function and independence between hashing functions. Furthermore, we generalize spherical hashing to support various similarity measures defined by kernel functions. Our extensive experiments show that our spherical hashing technique significantly outperforms state-of-the-art techniques based on hyperplanes across various benchmarks with sizes ranging from one to 75 million GIST, BoW and VLAD descriptors. The performance gains are consistent and large, up to 100 percent improvements over the second best among the tested methods. These results confirm the unique merits of using hyperspheres to encode proximity regions in high-dimensional spaces. Finally, our method is intuitive and easy to implement. PMID:26440269
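
    A minimal sketch of the hypersphere idea, with random pivots and median-distance radii standing in for the paper's iterative balance/independence optimization: each bit records containment in one ball, and the spherical Hamming distance normalizes differing bits by the number of balls containing both points.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy descriptor database; pivots are random samples, and each radius
        # is the median distance to its pivot so each bit is on for ~half the
        # data (the paper optimizes pivots and radii iteratively instead).
        X = rng.standard_normal((2000, 32))
        n_bits = 16
        pivots = X[rng.choice(len(X), n_bits, replace=False)]
        radii = np.median(
            np.linalg.norm(X[:, None, :] - pivots[None, :, :], axis=2), axis=0)

        def encode(Y):
            """Bit j = 1 iff the point lies inside hypersphere j."""
            d = np.linalg.norm(Y[:, None, :] - pivots[None, :, :], axis=2)
            return (d <= radii).astype(np.uint8)

        def spherical_hamming(b1, b2):
            """Differing bits normalized by common 1s (balls holding both)."""
            return (b1 ^ b2).sum() / max(int((b1 & b2).sum()), 1)

        B = encode(X)
        q = B[0]
        scores = np.array([spherical_hamming(b, q) for b in B])
        print(scores.argsort()[:5])   # the query's nearest items under SHD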

  9. Code Seal v 1.0

    Energy Science and Technology Software Center (ESTSC)

    2009-12-11

    CodeSeal is a Sandia National Laboratories developed technology that provides a means of securely obfuscating finite state machines in a mathematically provable way. The technology was developed in order to provide a solution for anti-reverse engineering, assured execution, and integrity of execution. CodeSeal accomplishes these goals with the addition of the concept of a trust anchor, a small piece of trust integrated into the system, to the model of code obfuscation. Code obfuscation is an active area of academic research, but most findings have merely demonstrated that general obfuscation is impossible. By modifying the security model such that we may rely on the presence of a small, tamper-protected device, however, Sandia has developed an effective method for obfuscating code. An open publication describing the technology in more detail can be found at http://eprint.iacr.org/2008/184.pdf. Keywords: Independent Software/Hardware monitors, Use control, Supervisory Control And Data Acquisition (SCADA), Algorithm obfuscation.

  10. Code Seal v 1.0

    SciTech Connect

    Chavez, Adrian; & Anderson, William

    2009-12-11

    CodeSeal is a Sandia National Laboratories developed technology that provides a means of securely obfuscating finite state machines in a mathematically provable way. The technology was developed in order to provide a solution for anti-reverse engineering, assured execution, and integrity of execution. CodeSeal accomplishes these goals with the addition of the concept of a trust anchor, a small piece of trust integrated into the system, to the model of code obfuscation. Code obfuscation is an active area of academic research, but most findings have merely demonstrated that general obfuscation is impossible. By modifying the security model such that we may rely on the presence of a small, tamper-protected device, however, Sandia has developed an effective method for obfuscating code. An open publication describing the technology in more detail can be found at http://eprint.iacr.org/2008/184.pdf. Keywords: Independent Software/Hardware monitors, Use control, Supervisory Control And Data Acquisition (SCADA), Algorithm obfuscation.

  11. Independent Schools: Landscape and Learnings.

    ERIC Educational Resources Information Center

    Oates, William A.

    1981-01-01

    Examines American independent schools (parochial, southern segregated, and private institutions) in terms of their funding, expenditures, changing enrollment patterns, teacher-student ratios, and societal functions. Journal available from Daedalus Subscription Department, 1172 Commonwealth Ave., Boston, MA 02132. (AM)

  12. Technology for Independent Living: Sourcebook.

    ERIC Educational Resources Information Center

    Enders, Alexandra, Ed.

    This sourcebook provides information for the practical implementation of independent living technology in the everyday rehabilitation process. "Information Services and Resources" lists databases, clearinghouses, networks, research and development programs, toll-free telephone numbers, consumer protection caveats, selected publications, and…

  13. Minimum description length (MDL)-based arithmetic coding for correlated Markov states

    NASA Astrophysics Data System (ADS)

    Matsushiro, Nobuhito; Asada, Osamu

    1995-03-01

    In describing data with arithmetic coding, coding parameters are usually estimated for each Markov state independently. In many cases, however, the Markov states are not completely uncorrelated. By utilizing this correlation, we can decrease the estimation errors of the coding parameters and improve compression performance. The utilization of the correlation can be included in the MDL (Minimum Description Length) framework. We have developed an MDL-based arithmetic coding for correlated Markov states.
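
    The two-part MDL criterion the abstract appeals to is easy to illustrate. The sketch below (a textbook two-part form with a (k/2) log2 n parameter cost, not a detail taken from the paper) decides whether two correlated Markov states should share one set of coding parameters by comparing total description lengths:

    ```python
    import math
    from collections import Counter

    def code_length(counts):
        """Ideal data code length (bits) under ML symbol probabilities,
        plus a (k/2) * log2(n) parameter cost: a standard two-part MDL form."""
        n = sum(counts.values())
        if n == 0:
            return 0.0
        data_bits = -sum(c * math.log2(c / n) for c in counts.values())
        param_bits = (len(counts) - 1) / 2 * math.log2(n)
        return data_bits + param_bits

    def should_pool(counts_a, counts_b):
        """Pool the parameter estimates of two (correlated) Markov states
        when one shared model describes their data more cheaply."""
        separate = code_length(counts_a) + code_length(counts_b)
        pooled = code_length(counts_a + counts_b)
        return pooled < separate

    a = Counter({'0': 40, '1': 10})   # symbol counts observed in state A
    b = Counter({'0': 37, '1': 13})   # similar statistics in state B
    print(should_pool(a, b))          # True: share one parameter set
    ```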

  14. Prioritized LT Codes

    NASA Technical Reports Server (NTRS)

    Woo, Simon S.; Cheng, Michael K.

    2011-01-01

    The original Luby Transform (LT) coding scheme is extended to account for data transmissions where some information symbols in a message block are more important than others. Prioritized LT codes provide unequal error protection (UEP) of data on an erasure channel by modifying the original LT encoder. The prioritized algorithm improves high-priority data protection without penalizing low-priority data recovery. Moreover, low-latency decoding is also obtained for high-priority data due to fast encoding. Prioritized LT codes only require a slight change in the original encoding algorithm, and no changes at all at the decoder. Hence, with a small complexity increase in the LT encoder, an improved UEP and low-decoding-latency performance for high-priority data can be achieved. LT encoding partitions a data stream into fixed-sized message blocks, each with a constant number of information symbols. To generate a code symbol from the information symbols in a message, the Robust Soliton probability distribution is first applied in order to determine the number of information symbols to be used to compute the code symbol. Then, the specific information symbols are chosen uniformly at random from the message block. Finally, the selected information symbols are XORed to form the code symbol. The Prioritized LT code construction includes an additional restriction that code symbols formed by a relatively small number of XORed information symbols select some of these information symbols from the pool of high-priority data. Once high-priority data are fully covered, encoding continues with the conventional LT approach, where code symbols are generated by selecting information symbols from the entire message block, including all different priorities. Therefore, if code symbols derived from high-priority data experience an unusually high number of erasures, Prioritized LT codes can still reliably recover both high- and low-priority data. This hybrid approach decides not only "how to encode" but also "what to encode" to achieve UEP. Another advantage of the priority encoding process is that the majority of high-priority data can be decoded sooner, since only a small number of code symbols are required to reconstruct high-priority data. This approach increases the likelihood that high-priority data is decoded before low-priority data. The Prioritized LT code scheme achieves an improvement in high-priority data decoding performance as well as overall information recovery without penalizing the decoding of low-priority data, assuming high-priority data is no more than half of a message block. The cost is the additional complexity required in the encoder. If extra computational resources are available at the transmitter, image, voice, and video transmission quality in terrestrial and space communications can benefit from this targeted use of redundancy in protecting data with varying priorities.
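
    A hedged sketch of the encoder modification described above: the Robust Soliton construction is the standard one, but the low-degree threshold and the "high-priority prefix" layout are illustrative assumptions rather than the paper's exact rule.

    ```python
    import math
    import random

    def robust_soliton(k, c=0.1, delta=0.5):
        """Standard Robust Soliton degree distribution over degrees 1..k."""
        s = c * math.log(k / delta) * math.sqrt(k)
        pivot = max(2, min(k, round(k / s)))
        rho = [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
        tau = [s / (k * d) for d in range(1, pivot)] + \
              [s * math.log(s / delta) / k] + [0.0] * (k - pivot)
        z = sum(rho) + sum(tau)
        return [(r + t) / z for r, t in zip(rho, tau)]

    def encode_symbol(message, n_high, dist, low_degree=3):
        """One Prioritized LT code symbol over `message`; the first n_high
        information symbols are the high-priority data."""
        k = len(message)
        degree = random.choices(range(1, k + 1), weights=dist)[0]
        if degree <= low_degree:
            # Restriction: low-degree code symbols include high-priority data.
            first = random.randrange(n_high)
            rest = random.sample([i for i in range(k) if i != first], degree - 1)
            picks = [first] + rest
        else:
            picks = random.sample(range(k), degree)   # conventional LT choice
        code = 0
        for i in picks:
            code ^= message[i]                        # XOR selected symbols
        return code, picks

    msg = [random.randrange(256) for _ in range(32)]  # 32 information symbols
    sym, neighbors = encode_symbol(msg, n_high=8, dist=robust_soliton(32))
    ```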

  15. Effective Practice in the Design of Directed Independent Learning Opportunities

    ERIC Educational Resources Information Center

    Thomas, Liz; Jones, Robert; Ottaway, James

    2015-01-01

    This study, commissioned by the HEA and the QAA focuses on directed independent learning practices in UK higher education. It investigates what stakeholders (including academic staff and students) have found to be the most effective practices in the inception, design, quality assurance and enhancement of directed independent learning and explores…

  16. Error coding simulations

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1993-01-01

    There are various elements, such as radio frequency interference (RFI), which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and to evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal-to-noise ratio needed for a given desired bit error rate. The use of concatenated coding, e.g., an inner convolutional code and an outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.
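
    For the error-detection piece, a minimal sketch of a bitwise 16-bit CRC is shown below. The CCITT polynomial 0x1021 with an all-ones initial register is assumed, since the abstract does not give the exact CCSDS parameters:

    ```python
    def crc16_ccitt(data: bytes, poly=0x1021, crc=0xFFFF):
        """Bitwise 16-bit CRC (CCITT polynomial assumed; init 0xFFFF,
        no reflection, no final XOR)."""
        for byte in data:
            crc ^= byte << 8
            for _ in range(8):
                crc = ((crc << 1) ^ poly) if crc & 0x8000 else (crc << 1)
                crc &= 0xFFFF
        return crc

    print(hex(crc16_ccitt(b"123456789")))  # 0x29b1, the CRC-16/CCITT-FALSE check value
    ```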

  17. Coded source neutron imaging

    SciTech Connect

    Bingham, Philip R; Santos-Villalobos, Hector J

    2011-01-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors, in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small-spot-size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits, and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system, with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects, followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.
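
    The MTF calculation mentioned at the end is standard enough to sketch: differentiate the edge response to get the line spread function, then take the magnitude of its Fourier transform. The Gaussian-like edge below is illustrative data, not the paper's simulation output:

    ```python
    import numpy as np

    def mtf_from_lsf(lsf, dx):
        """Spatial frequencies and normalized |FFT| of a line spread function."""
        lsf = lsf / lsf.sum()                       # unit area
        mtf = np.abs(np.fft.rfft(lsf))
        return np.fft.rfftfreq(len(lsf), d=dx), mtf / mtf[0]

    x = np.arange(-100, 101)                        # position, um
    esf = 0.5 * (1 + np.tanh(x / 14.0))             # simulated tilted-edge response
    lsf = np.gradient(esf)                          # LSF = derivative of ESF
    freqs, mtf = mtf_from_lsf(lsf, dx=1.0)          # frequencies in cycles per um
    ```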

  18. Codes with multi-level error-correcting capabilities

    NASA Technical Reports Server (NTRS)

    Lin, Mao-Chao; Lin, Shu

    1990-01-01

    In conventional channel coding, all the information symbols of a message are regarded as equally significant, and hence codes are devised to provide equal protection for each information symbol against channel errors. However, in some circumstances some information symbols in a message are more significant than others. As a result, it is desirable to devise codes with multilevel error-correcting capabilities. In this paper, block codes with multilevel error-correcting capabilities, which are also known as unequal error protection (UEP) codes, are investigated. Several classes of UEP codes are constructed. One class of codes satisfies the Hamming bound on the number of parity-check symbols for systematic linear UEP codes and hence is optimal.

  19. High-Speed Soft-Decision Decoding of Two Reed-Muller Codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Uehara, Gregory T.

    1996-01-01

    In this research, we have proposed the (64, 40, 8) subcode of the third-order Reed-Muller (RM) code to NASA for high-speed satellite communications. This RM subcode can be used either alone or as an inner code of a concatenated coding system with the NASA standard (255, 223, 33) Reed-Solomon (RS) code as the outer code to achieve high performance (or low bit-error rate) with reduced decoding complexity. It can also be used as a component code in a multilevel bandwidth-efficient coded modulation system to achieve reliable bandwidth-efficient data transmission. This report will summarize the key progress we have made toward achieving our eventual goal of implementing a decoder system based upon this code. In the first phase of study, we investigated the complexities of various sectionalized trellis diagrams for the proposed (64, 40, 8) RM subcode. We found a specific 8-section trellis diagram for this code which requires the least decoding complexity with a high possibility of achieving a decoding speed of 600 Mbits per second (Mbps). The combination of a large number of states and a high data rate will be made possible due to the utilization of a high degree of parallelism throughout the architecture. This trellis diagram will be presented and briefly described. In the second phase of study, which was carried out through the past year, we investigated circuit architectures to determine the feasibility of VLSI implementation of a high-speed Viterbi decoder based on this 8-section trellis diagram. We began to examine specific design and implementation approaches to implement a fully custom integrated circuit (IC) which will be a key building block for a decoder system implementation. The key results will be presented in this report. This report will be divided into three primary sections. First, we will briefly describe the system block diagram in which the proposed decoder is assumed to be operating, and present some of the key architectural approaches being used to implement the system at high speed. Second, we will describe details of the 8-section trellis diagram we found to best meet the trade-offs between chip and overall system complexity. The chosen approach implements the trellis for the (64, 40, 8) RM subcode with 32 independent sub-trellises. And third, we will describe results of our feasibility study on the implementation of such an IC chip in CMOS technology to implement one of these sub-trellises.

  20. High-Speed Soft-Decision Decoding of Two Reed-Muller Codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Uehara, Gregory T.

    1996-01-01

    In this research, we have proposed the (64, 40, 8) subcode of the third-order Reed-Muller (RM) code to NASA for high-speed satellite communications. This RM subcode can be used either alone or as an inner code of a concatenated coding system with the NASA standard (255, 223, 33) Reed-Solomon (RS) code as the outer code to achieve high performance (or low bit-error rate) with reduced decoding complexity. It can also be used as a component code in a multilevel bandwidth-efficient coded modulation system to achieve reliable bandwidth-efficient data transmission. This report will summarize the key progress we have made toward achieving our eventual goal of implementing a decoder system based upon this code. In the first phase of study, we investigated the complexities of various sectionalized trellis diagrams for the proposed (64, 40, 8) RM subcode. We found a specific 8-section trellis diagram for this code which requires the least decoding complexity with a high possibility of achieving a decoding speed of 600 Mbits per second (Mbps). The combination of a large number of states and a high data rate will be made possible due to the utilization of a high degree of parallelism throughout the architecture. This trellis diagram will be presented and briefly described. In the second phase of study, which was carried out through the past year, we investigated circuit architectures to determine the feasibility of VLSI implementation of a high-speed Viterbi decoder based on this 8-section trellis diagram. We began to examine specific design and implementation approaches to implement a fully custom integrated circuit (IC) which will be a key building block for a decoder system implementation. The key results will be presented in this report. This report will be divided into three primary sections. First, we will briefly describe the system block diagram in which the proposed decoder is assumed to be operating, and present some of the key architectural approaches being used to implement the system at high speed. Second, we will describe details of the 8-section trellis diagram we found to best meet the trade-offs between chip and overall system complexity. The chosen approach implements the trellis for the (64, 40, 8) RM subcode with 32 independent sub-trellises. And third, we will describe results of our feasibility study on the implementation of such an IC chip in CMOS technology to implement one of these sub-trellises.

  1. Photonic security system using spatial codes and remote coded coherent optical communications

    NASA Astrophysics Data System (ADS)

    Riza, Nabeel A.; Howlader, Mohammad M.; Madamopoulos, Nicholas

    1996-09-01

    A novel photonic security system is described using 2D spatial codes based on both optical phase and amplitude information. This security system consists of an optical interferometric encoding subsystem that rapidly reads and encodes the 2D complex-valued spatial code, forming a wideband frequency-modulated optical beam and a collinear optical reference beam. After appropriate coherence coding of this beam pair, the light is launched into a low-probability-of-intercept communication channel such as an optical fiber or a narrow-beamwidth free-space optical link. At the remote code receiving and data processing site, the received light beam pair is first coherently decoded. Then, a high-speed photodetector, via optical heterodyne detection, generates an encoded wideband radio frequency signal that contains the original 2D code. Decoding is implemented in parallel via two independent systems. One decoder uses a Fourier-transforming lens to reconstruct an electronic image interferogram of the complex-valued user code. This image interferogram is sent to a high-speed electronic image processor for verification purposes. The other decoder is a high-speed coherent acousto-optic time-integrating correlator that optically determines match or mismatch between the received encoded signal and the code signal generated from the electronic database. Improved security is added to the overall communication network by using various keycodes, such as a time-varying keycode that determines the exact spatial beam scanning sequence required for both proper encoding and decoding of the 2D code information. This paper describes preliminary experiments using a simple 1D amplitude-modulated spatial code.

  2. Identifying personal microbiomes using metagenomic codes

    PubMed Central

    Franzosa, Eric A.; Huang, Katherine; Meadow, James F.; Gevers, Dirk; Lemon, Katherine P.; Bohannan, Brendan J. M.; Huttenhower, Curtis

    2015-01-01

    Community composition within the human microbiome varies across individuals, but it remains unknown if this variation is sufficient to uniquely identify individuals within large populations or stable enough to identify them over time. We investigated this by developing a hitting set-based coding algorithm and applying it to the Human Microbiome Project population. Our approach defined body site-specific metagenomic codes: sets of microbial taxa or genes prioritized to uniquely and stably identify individuals. Codes capturing strain variation in clade-specific marker genes were able to distinguish among 100s of individuals at an initial sampling time point. In comparisons with follow-up samples collected 30–300 d later, ∼30% of individuals could still be uniquely pinpointed using metagenomic codes from a typical body site; coincidental (false positive) matches were rare. Codes based on the gut microbiome were exceptionally stable and pinpointed >80% of individuals. The failure of a code to match its owner at a later time point was largely explained by the loss of specific microbial strains (at current limits of detection) and was only weakly associated with the length of the sampling interval. In addition to highlighting patterns of temporal variation in the ecology of the human microbiome, this work demonstrates the feasibility of microbiome-based identifiability—a result with important ethical implications for microbiome study design. The datasets and code used in this work are available for download from huttenhower.sph.harvard.edu/idability. PMID:25964341
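
    The hitting-set idea at the core of the method can be sketched in a few lines: greedily pick features of the target subject that exclude as many other subjects as possible until no one else carries the whole code. This is a hedged simplification of the authors' algorithm (their released implementation is at huttenhower.sph.harvard.edu/idability), with illustrative names throughout:

    ```python
    def greedy_code(target_features, others, max_size=10):
        """Greedy hitting-set sketch: choose features of the target that each
        exclude as many remaining confounders as possible, until no other
        subject carries the whole code. `others` is a list of feature sets."""
        remaining = list(others)
        code, candidates = set(), set(target_features)
        while remaining and candidates and len(code) < max_size:
            # feature absent from the most remaining confounders
            best = max(candidates, key=lambda f: sum(f not in o for o in remaining))
            code.add(best)
            candidates.discard(best)
            remaining = [o for o in remaining if best in o]
        return code

    subjects = {
        "A": {"taxon1", "taxon2", "taxon5"},
        "B": {"taxon1", "taxon3"},
        "C": {"taxon2", "taxon3", "taxon5"},
    }
    others = [s for k, s in subjects.items() if k != "A"]
    print(greedy_code(subjects["A"], others))   # e.g. {'taxon1', 'taxon5'}
    ```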

  3. Identifying personal microbiomes using metagenomic codes.

    PubMed

    Franzosa, Eric A; Huang, Katherine; Meadow, James F; Gevers, Dirk; Lemon, Katherine P; Bohannan, Brendan J M; Huttenhower, Curtis

    2015-06-01

    Community composition within the human microbiome varies across individuals, but it remains unknown if this variation is sufficient to uniquely identify individuals within large populations or stable enough to identify them over time. We investigated this by developing a hitting set-based coding algorithm and applying it to the Human Microbiome Project population. Our approach defined body site-specific metagenomic codes: sets of microbial taxa or genes prioritized to uniquely and stably identify individuals. Codes capturing strain variation in clade-specific marker genes were able to distinguish among 100s of individuals at an initial sampling time point. In comparisons with follow-up samples collected 30-300 d later, ∼30% of individuals could still be uniquely pinpointed using metagenomic codes from a typical body site; coincidental (false positive) matches were rare. Codes based on the gut microbiome were exceptionally stable and pinpointed >80% of individuals. The failure of a code to match its owner at a later time point was largely explained by the loss of specific microbial strains (at current limits of detection) and was only weakly associated with the length of the sampling interval. In addition to highlighting patterns of temporal variation in the ecology of the human microbiome, this work demonstrates the feasibility of microbiome-based identifiability-a result with important ethical implications for microbiome study design. The datasets and code used in this work are available for download from huttenhower.sph.harvard.edu/idability. PMID:25964341

  4. Phase-coded pulse aperiodic transmitter coding

    NASA Astrophysics Data System (ADS)

    Virtanen, I. I.; Vierinen, J.; Lehtinen, M. S.

    2009-07-01

    Both ionospheric and weather radar communities have already adopted the method of transmitting radar pulses in an aperiodic manner when measuring moderately overspread targets. Among the users of the ionospheric radars, this method is called Aperiodic Transmitter Coding (ATC), whereas the weather radar users have adopted the term Simultaneous Multiple Pulse-Repetition Frequency (SMPRF). When probing the ionosphere at the carrier frequencies of the EISCAT Incoherent Scatter Radar facilities, the range extent of the detectable target is typically of the order of one thousand kilometers - about seven milliseconds - whereas the characteristic correlation time of the scattered signal varies from a few milliseconds in the D-region to only tens of microseconds in the F-region. If one is interested in estimating the scattering autocorrelation function (ACF) at time lags shorter than the F-region correlation time, the D-region must be considered as a moderately overspread target, whereas the F-region is a severely overspread one. Given the technical restrictions of the radar hardware, a combination of ATC and phase-coded long pulses is advantageous for this kind of target. We evaluate such an experiment under infinitely low signal-to-noise ratio (SNR) conditions using lag profile inversion. In addition, a qualitative evaluation under high-SNR conditions is performed by analysing simulated data. The results show that an acceptable estimation accuracy and a very good lag resolution in the D-region can be achieved with a pulse length long enough for simultaneous E- and F-region measurements with a reasonable lag extent. The new experiment design is tested with the EISCAT Tromsø VHF (224 MHz) radar. An example of a full D/E/F-region ACF from the test run is shown at the end of the paper.

  5. Structure and Operation of the ITS Code System

    NASA Astrophysics Data System (ADS)

    Halbleib, J.

    The TIGER series of time-independent coupled electron-photon Monte Carlo transport codes is a group of multimaterial and multidimensional codes designed to provide a state-of-the-art description of the production and transport of the electron-photon cascade by combining microscopic photon transport with a macroscopic random walk [1] for electron transport. Major contributors to its evolution are listed in Table 10.1.

  6. FAA Smoke Transport Code

    Energy Science and Technology Software Center (ESTSC)

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, an analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  7. Adaptation and visual coding

    PubMed Central

    Webster, Michael A.

    2011-01-01

    Visual coding is a highly dynamic process, continuously adapting to the current viewing context. The perceptual changes that result from adaptation to recently viewed stimuli remain a powerful and popular tool for analyzing sensory mechanisms and plasticity. Over the last decade, the footprints of this adaptation have been tracked to both higher and lower levels of the visual pathway and over a wider range of timescales, revealing that visual processing is much more adaptable than previously thought. This work has also revealed that the pattern of aftereffects is similar across many stimulus dimensions, pointing to common coding principles in which adaptation plays a central role. However, why visual coding adapts has yet to be fully answered. PMID:21602298

  8. Code inspection instructional validation

    NASA Technical Reports Server (NTRS)

    Orr, Kay; Stancil, Shirley

    1992-01-01

    The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine if this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive) for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. Following are the conclusions of this study. (1) The course is instructionally effective. (2) The simulation has a positive effect on students' confidence in their ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option which involves creation of new source code and data files, but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes this option will produce the most effective results.

  9. Highly overcomplete sparse coding

    NASA Astrophysics Data System (ADS)

    Olshausen, Bruno A.

    2013-03-01

    This paper explores sparse coding of natural images in the highly overcomplete regime. We show that as the overcompleteness ratio approaches 10x, new types of dictionary elements emerge beyond the classical Gabor function shape obtained from complete or only modestly overcomplete sparse coding. These more diverse dictionaries allow images to be approximated with lower L1 norm (for a fixed SNR), and the coefficients exhibit steeper decay. We also evaluate the learned dictionaries in a denoising task, showing that higher degrees of overcompleteness yield modest gains in performance.
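
    Inference in this setting is typically an L1-penalized reconstruction; the sketch below uses plain ISTA with a random 10x-overcomplete dictionary as a stand-in for the learned one (the paper's own learning rule is not reproduced here):

    ```python
    import numpy as np

    def ista(D, x, lam=0.1, n_iter=200):
        """min_a 0.5*||x - D a||^2 + lam*||a||_1 via iterative soft-thresholding."""
        L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
        a = np.zeros(D.shape[1])
        for _ in range(n_iter):
            grad = D.T @ (D @ a - x)
            a = a - grad / L
            a = np.sign(a) * np.maximum(np.abs(a) - lam / L, 0.0)  # soft threshold
        return a

    rng = np.random.default_rng(1)
    D = rng.normal(size=(64, 640))             # 10x overcomplete dictionary
    D /= np.linalg.norm(D, axis=0)             # unit-norm dictionary elements
    x = rng.normal(size=64)                    # stand-in for an image patch
    a = ista(D, x)
    print(np.count_nonzero(a), "active coefficients")
    ```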

  10. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semi-empirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.

  11. Securing mobile code.

    SciTech Connect

    Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas; Campbell, Philip LaRoche; Beaver, Cheryl Lynn; Pierson, Lyndon George; Anderson, William Erik

    2004-10-01

    If software is designed so that it can issue functions that move it from one computing platform to another, then the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinions regarding how to secure mobile code. There are those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates by decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes that neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation to render an entire program or a data segment on which a program depends incomprehensible. The hope is to prevent or at least slow down reverse engineering efforts and to prevent goal-oriented attacks on the software and execution. The field of obfuscation is still in a state of development, with the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in-depth analysis of a technique called 'white-boxing'. We put forth some new attacks and improvements on this method, as well as demonstrating its implementation for various algorithms. We also examine cryptographic techniques to achieve obfuscation, including encrypted functions, and offer a new application to digital signature algorithms. To better understand the lack of security proofs for obfuscation techniques, we examine in detail general theoretical models of obfuscation. We explain the need for formal models in order to obtain provable security and the progress made in this direction thus far. Finally, we tackle the problem of verifying remote execution. We introduce some methods of verifying remote exponentiation computations and some insight into generic computation checking.

  12. Extended quantum color coding

    NASA Astrophysics Data System (ADS)

    Hayashi, A.; Hashimoto, T.; Horibe, M.

    2005-01-01

    The quantum color coding scheme proposed by Korff and Kempe [e-print quant-ph/0405086] is easily extended so that the color coding quantum system is allowed to be entangled with an extra auxiliary quantum system. It is shown that in the extended scheme we need only ~2N quantum colors to order N objects in the large-N limit, whereas ~N/e quantum colors are required in the original nonextended version. The maximum success probability has asymptotics expressed by the Tracy-Widom distribution of the largest eigenvalue of a random Gaussian unitary ensemble (GUE) matrix.

  13. Numerical MHD codes for modeling astrophysical flows

    NASA Astrophysics Data System (ADS)

    Koldoba, A. V.; Ustyugova, G. V.; Lii, P. S.; Comins, M. L.; Dyda, S.; Romanova, M. M.; Lovelace, R. V. E.

    2016-05-01

    We describe a Godunov-type magnetohydrodynamic (MHD) code based on the Miyoshi and Kusano (2005) solver which can be used to solve various astrophysical hydrodynamic and MHD problems. The energy equation is in the form of entropy conservation. The code has been implemented on several different coordinate systems: 2.5D axisymmetric cylindrical coordinates, 2D Cartesian coordinates, 2D plane polar coordinates, and fully 3D cylindrical coordinates. Viscosity and diffusivity are implemented in the code to control the accretion rate in the disk and the rate of penetration of the disk matter through the magnetic field lines. The code has been utilized for the numerical investigations of a number of different astrophysical problems, several examples of which are shown.

  14. Perceptually-Based Adaptive JPEG Coding

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.; Rosenholtz, Ruth; Null, Cynthia H. (Technical Monitor)

    1996-01-01

    An extension to the JPEG standard (ISO/IEC DIS 10918-3) allows spatial adaptive coding of still images. As with baseline JPEG coding, one quantization matrix applies to an entire image channel, but in addition the user may specify a multiplier for each 8 x 8 block, which multiplies the quantization matrix, yielding the new matrix for the block. MPEG 1 and 2 use much the same scheme, except there the multiplier changes only on macroblock boundaries. We propose a method for perceptual optimization of the set of multipliers. We compute the perceptual error for each block based upon DCT quantization error adjusted according to contrast sensitivity, light adaptation, and contrast masking, and pick the set of multipliers which yield maximally flat perceptual error over the blocks of the image. We investigate the bitrate savings due to this adaptive coding scheme and the relative importance of the different sorts of masking on adaptive coding.
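
    A toy version of the multiplier search makes the idea concrete. The DCT, the error pooling, and the update rule below are simplified stand-ins for the paper's contrast-sensitivity, light-adaptation, and masking adjustments, with illustrative parameter values throughout:

    ```python
    import numpy as np

    def dct_matrix(n=8):
        T = np.zeros((n, n))
        for k in range(n):
            for i in range(n):
                a = np.sqrt(1.0 / n) if k == 0 else np.sqrt(2.0 / n)
                T[k, i] = a * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
        return T

    T = dct_matrix()

    def perceptual_error(block, Q, m, csf):
        """Pooled, visually weighted quantization error for one 8x8 block."""
        coef = T @ block @ T.T
        err = coef - np.round(coef / (m * Q)) * (m * Q)
        return np.linalg.norm(err * csf)

    def flatten_error(blocks, Q, csf, target, n_iter=20):
        """Drive per-block multipliers toward uniform perceptual error."""
        ms = np.ones(len(blocks))
        for _ in range(n_iter):
            for b, block in enumerate(blocks):
                e = perceptual_error(block, Q, ms[b], csf)
                ms[b] *= (target / max(e, 1e-9)) ** 0.5   # damped update
        return ms

    rng = np.random.default_rng(0)
    blocks = [rng.normal(scale=40, size=(8, 8)) for _ in range(4)]
    u, v = np.meshgrid(np.arange(8), np.arange(8))
    csf = 1.0 / (1.0 + 0.3 * (u + v))          # toy low-pass sensitivity weights
    ms = flatten_error(blocks, np.full((8, 8), 16.0), csf, target=8.0)
    ```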

  15. Coding capacity of complementary DNA strands.

    PubMed Central

    Casino, A; Cipollaro, M; Guerrini, A M; Mastrocinque, G; Spena, A; Scarlato, V

    1981-01-01

    A Fortran computer algorithm has been used to analyze the nucleotide sequence of several structural genes. The analysis performed on both coding and complementary DNA strands shows that whereas open reading frames shorter than 100 codons are randomly distributed on both DNA strands, open reading frames longer than 100 codons ("virtual genes") are significantly more frequent on the complementary DNA strand than on the coding one. These "virtual genes" were further investigated by looking at intron sequences, splicing points, signal sequences and by analyzing gene mutations. On the basis of this analysis coding and complementary DNA strands of several eukaryotic structural genes cannot be distinguished. In particular we suggest that the complementary DNA strand of the human epsilon-globin gene might indeed code for a protein. PMID:7015290
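
    The strand-by-strand scan behind such an analysis is easy to reproduce. Here is a hedged Python sketch (standing in for the original Fortran) that collects open reading frames of at least 100 codons on the coding strand and its reverse complement:

    ```python
    STOPS = {"TAA", "TAG", "TGA"}
    COMP = str.maketrans("ACGT", "TGCA")

    def open_reading_frames(seq, min_codons=100):
        """Yield ATG..stop frames of at least min_codons on one strand."""
        for frame in range(3):
            start = None
            for i in range(frame, len(seq) - 2, 3):
                codon = seq[i:i + 3]
                if codon == "ATG" and start is None:
                    start = i
                elif codon in STOPS and start is not None:
                    if (i - start) // 3 >= min_codons:
                        yield seq[start:i + 3]
                    start = None

    def virtual_genes(seq, min_codons=100):
        """ORFs on the coding strand vs. the complementary strand."""
        rc = seq.translate(COMP)[::-1]           # reverse complement
        return (list(open_reading_frames(seq, min_codons)),
                list(open_reading_frames(rc, min_codons)))
    ```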

  16. Modular optimization code package: MOZAIK

    NASA Astrophysics Data System (ADS)

    Bekar, Kursat B.

    This dissertation addresses the development of a modular optimization code package, MOZAIK, for geometric shape optimization problems in nuclear engineering applications. MOZAIK's first mission, determining the optimal shape of the D2O moderator tank for the current and new beam tube configurations for the Penn State Breazeale Reactor's (PSBR) beam port facility, is used to demonstrate its capabilities and test its performance. MOZAIK was designed as a modular optimization sequence including three primary independent modules: the initializer, the physics module, and the optimizer, each having a specific task. By using fixed interface blocks among the modules, the code attains its two most important characteristics: generic form and modularity. The benefit of this modular structure is that the contents of the modules can be switched depending on the requirements of accuracy, computational efficiency, or compatibility with the other modules. Oak Ridge National Laboratory's discrete ordinates transport code TORT was selected as the transport solver in the physics module of MOZAIK, and two different optimizers, Min-max and Genetic Algorithms (GA), were implemented in the optimizer module of the code package. A distributed-memory parallelism was also applied to MOZAIK via MPI (Message Passing Interface) to execute the physics module concurrently on a number of processors for various states in the same search. Moreover, dynamic scheduling was enabled to enhance load balance among the processors while running MOZAIK's physics module, thus improving the parallel speedup and efficiency. In this way, the total computation time consumed by the physics module is reduced by a factor close to M, where M is the number of processors. This capability also encourages the use of MOZAIK for shape optimization problems in nuclear applications, because many traditional codes related to radiation transport do not have parallel execution capability. A set of computational models based on the existing beam port configuration of the Penn State Breazeale Reactor (PSBR) was designed to test and validate the code package in its entirety, as well as its modules separately. The selected physics code, TORT, and the requisite data such as source distribution, cross-sections, and angular quadratures were comprehensively tested with these computational models. The modular feature and the parallel performance of the code package were also examined using these computational models. Another outcome of these computational models is to provide the necessary background information for determining the optimal shape of the D2O moderator tank for the new beam tube configurations for the PSBR's beam port facility. The first mission of the code package was completed successfully by determining the optimal tank shape sought for the current beam tube configuration and two new beam tube configurations for the PSBR's beam port facility. The performance of the new beam tube configurations and the current beam tube configuration was evaluated with the new optimal tank shapes determined by MOZAIK. Furthermore, the performance of the code package with the two different optimization strategies was analyzed, showing that while GA is capable of achieving higher thermal beam intensity for a given beam tube setup, Min-max produces an optimal shape that is more amenable to machining and manufacturing.
The optimal D2O moderator tank shape determined by MOZAIK with the current beam port configuration improves the thermal neutron beam intensity at the beam port exit end by 9.5%. Similarly, the new tangential beam port configuration (beam port near the core interface) with the optimal moderator tank shape determined by MOZAIK improves the thermal neutron beam intensity by a factor of 1.4 compared to the existing beam port configuration (with the existing D2O moderator tank). Another new beam port configuration, radial beam tube configuration, with the optimal moderator tank shape increases the thermal neutron beam intensity at the beam tube exit by a factor of 1.8. All these results indicate that MOZAIK is viable and effective and is ready for deployment to address shape optimization problems involving radiation transport in nuclear engineering applications.

  17. Refractoriness Enhances Temporal Coding by Auditory Nerve Fibers

    PubMed Central

    Avissar, Michael; Wittig, John H.; Saunders, James C.

    2013-01-01

    A universal property of spiking neurons is refractoriness, a transient decrease in discharge probability immediately following an action potential (spike). The refractory period lasts only one to a few milliseconds, but has the potential to affect temporal coding of acoustic stimuli by auditory neurons, which are capable of submillisecond spike-time precision. Here this possibility was investigated systematically by recording spike times from chicken auditory nerve fibers in vivo while stimulating with repeated pure tones at characteristic frequency. Refractory periods were tightly distributed, with a mean of 1.58 ms. A statistical model was developed to recapitulate each fiber's responses and then used to predict the effect of removing the refractory period on a cell-by-cell basis for two largely independent facets of temporal coding: faithful entrainment of interspike intervals to the stimulus frequency and precise synchronization of spike times to the stimulus phase. The ratio of the refractory period to the stimulus period predicted the impact of refractoriness on entrainment and synchronization. For ratios less than ∼0.9, refractoriness enhanced entrainment and this enhancement was often accompanied by an increase in spike-time precision. At higher ratios, little or no change in entrainment or synchronization was observed. Given the tight distribution of refractory periods, the ability of refractoriness to improve temporal coding is restricted to neurons responding to low-frequency stimuli. Enhanced encoding of low frequencies likely affects sound localization and pitch perception in the auditory system, as well as perception in nonauditory sensory modalities, because all spiking neurons exhibit refractoriness. PMID:23637161
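
    The two facets of temporal coding analyzed above can be quantified with short routines. Vector strength is the standard synchronization index; the entrainment measure below (fraction of interspike intervals near one stimulus period) is an assumed operationalization, not necessarily the authors' exact definition:

    ```python
    import numpy as np

    def vector_strength(spike_times, freq):
        """Synchronization of spike times to stimulus phase (0..1)."""
        phases = 2 * np.pi * freq * np.asarray(spike_times)
        return np.abs(np.mean(np.exp(1j * phases)))

    def entrainment(spike_times, freq, tol=0.25):
        """Fraction of interspike intervals within tol periods of 1/freq
        (an illustrative proxy for faithful entrainment)."""
        isis = np.diff(np.sort(spike_times))
        period = 1.0 / freq
        return np.mean(np.abs(isis - period) < tol * period)

    rng = np.random.default_rng(0)
    f = 500.0                                           # 500 Hz tone
    t = np.arange(200) / f + rng.normal(0, 2e-4, 200)   # jittered phase-locked spikes
    print(vector_strength(t, f), entrainment(t, f))
    ```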

  18. Benchmark study between FIDAP and a cellular automata code

    NASA Astrophysics Data System (ADS)

    Akau, R. L.; Stockman, H. W.

    A fluid flow benchmark exercise was conducted to compare results between a cellular automata code and FIDAP. Cellular automata codes are free from gridding constraints, and are generally used to model slow (Reynolds number approximately 1) flows around complex solid obstacles. However, the accuracy of cellular automata codes at higher Reynolds numbers, where inertial terms are significant, is not well-documented. In order to validate the cellular automata code, two fluids problems were investigated. For both problems, flow was assumed to be laminar, two-dimensional, isothermal, incompressible and periodic. Results showed that the cellular automata code simulated the overall behavior of the flow field.

  19. 28 CFR 601.1 - Jurisdiction of the Independent Counsel: Iran/Contra.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...: Iran/Contra. 601.1 Section 601.1 Judicial Administration OFFICES OF INDEPENDENT COUNSEL, DEPARTMENT OF JUSTICE JURISDICTION OF THE INDEPENDENT COUNSEL: IRAN/CONTRA § 601.1 Jurisdiction of the Independent Counsel: Iran/Contra. (a) The Independent Counsel. Iran/Contra has jurisdiction to investigate to...

  20. 28 CFR 601.1 - Jurisdiction of the Independent Counsel: Iran/Contra.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...: Iran/Contra. 601.1 Section 601.1 Judicial Administration OFFICES OF INDEPENDENT COUNSEL, DEPARTMENT OF JUSTICE JURISDICTION OF THE INDEPENDENT COUNSEL: IRAN/CONTRA § 601.1 Jurisdiction of the Independent Counsel: Iran/Contra. (a) The Independent Counsel. Iran/Contra has jurisdiction to investigate to...

  1. 28 CFR 601.1 - Jurisdiction of the Independent Counsel: Iran/Contra.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...: Iran/Contra. 601.1 Section 601.1 Judicial Administration OFFICES OF INDEPENDENT COUNSEL, DEPARTMENT OF JUSTICE JURISDICTION OF THE INDEPENDENT COUNSEL: IRAN/CONTRA § 601.1 Jurisdiction of the Independent Counsel: Iran/Contra. (a) The Independent Counsel. Iran/Contra has jurisdiction to investigate to...

  2. 28 CFR 601.1 - Jurisdiction of the Independent Counsel: Iran/Contra.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...: Iran/Contra. 601.1 Section 601.1 Judicial Administration OFFICES OF INDEPENDENT COUNSEL, DEPARTMENT OF JUSTICE JURISDICTION OF THE INDEPENDENT COUNSEL: IRAN/CONTRA § 601.1 Jurisdiction of the Independent Counsel: Iran/Contra. (a) The Independent Counsel. Iran/Contra has jurisdiction to investigate to...

  3. 28 CFR 601.1 - Jurisdiction of the Independent Counsel: Iran/Contra.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...: Iran/Contra. 601.1 Section 601.1 Judicial Administration OFFICES OF INDEPENDENT COUNSEL, DEPARTMENT OF JUSTICE JURISDICTION OF THE INDEPENDENT COUNSEL: IRAN/CONTRA § 601.1 Jurisdiction of the Independent Counsel: Iran/Contra. (a) The Independent Counsel. Iran/Contra has jurisdiction to investigate to...

  4. Grid-free tree-code simulations of the plasma-material interaction region

    NASA Astrophysics Data System (ADS)

    Salmagne, C.; Reiter, D.; Gibbon, P.

    2014-11-01

    A fully kinetic, grid-free model based on a Barnes-Hut tree code is used to self-consistently simulate a collisionless plasma bounded by two floating walls. The workhorse for simulating such plasma-wall transition layers is currently the PIC method. However, the present grid-free formulation provides a powerful independent tool to test it [1] and to possibly extend particle simulations towards collisional regimes in a more internally consistent way. Here, we use the grid-free, massively parallel Barnes-Hut tree code PEPC, a well-established tool for simulations of laser plasmas and astrophysical applications, to develop a 3D ab initio plasma-target interaction model. With our approach, an electrostatic sheath naturally builds up within the first couple of Debye lengths close to the wall rather than being imposed as a prescribed boundary condition. We verified the code using analytic results [2] as well as 1D PIC simulations [3]. The model was then used to investigate the influence of inclined magnetic fields on the plasma-material interface. We used the code to study the correlation between the magnetic field angle and the angular distribution of incident particles.

  5. Enforcing the International Code of Marketing of Breast-milk Substitutes for Better Promotion of Exclusive Breastfeeding: Can Lessons Be Learned?

    PubMed

    Barennes, Hubert; Slesak, Guenther; Goyet, Sophie; Aaron, Percy; Srour, Leila M

    2016-02-01

    Exclusive breastfeeding, one of the best natural resources, needs protection and promotion. The International Code of Marketing of Breast-milk Substitutes (the Code), which aims to prevent the undermining of breastfeeding by formula advertising, faces implementation challenges. We reviewed frequently overlooked challenges and obstacles that the Code is facing worldwide, but particularly in Southeast Asia. Drawing lessons from various countries where we work, and following the example of successful public health interventions, we discussed legislation, enforcement, and experiences that are needed to successfully implement the Code. Successful holistic approaches that have strengthened the Code need to be scaled up. Community-based actions and peer-to-peer promotions have proved successful. Legislation without stringent enforcement and sufficient penalties is ineffective. The public needs education about the benefits and ways and means to support breastfeeding. It is crucial to combine strong political commitment and leadership with strict national regulations, definitions, and enforcement. National breastfeeding committees, with the authority to improve regulations, investigate violations, and enforce the laws, must be established. Systematic monitoring and reporting are needed to identify companies, individuals, intermediaries, and practices that infringe on the Code. Penalizing violators is crucial. Managers of multinational companies must be held accountable for international violations, and international legislative enforcement needs to be established. Further measures should include improved regulations to protect the breastfeeding mother: large-scale education campaigns; strong penalties for Code violators; exclusion of the formula industry from nutrition, education, and policy roles; supportive legal networks; and independent research of interventions supporting breastfeeding. PMID:26416439

  6. Channel-independent and sensor-independent stimulus representations

    NASA Astrophysics Data System (ADS)

    Levin, David N.

    2005-11-01

    This paper shows how a machine, which observes stimuli through an uncharacterized, uncalibrated channel and sensor, can glean machine-independent information (i.e., channel- and sensor-independent information) about the stimuli. This is possible if the following two conditions are satisfied by the observed stimulus and by the observing device, respectively: (1) the stimulus' trajectory in the space of all possible configurations has a well-defined local velocity covariance matrix; (2) the observing device's sensor state is invertibly related to the stimulus state. The first condition guarantees that the statistical properties of the stimulus time series endow the stimulus configuration space with a differential geometric structure (a metric and parallel transfer procedure), which can then be used to represent relative stimulus configurations in a coordinate-system-independent manner. This requirement is satisfied by a large variety of physical systems, and, in general, it is expected to be satisfied by stimulus trajectories that densely cover stimulus state space and that have velocity distributions varying smoothly across that space. The second condition implies that the machine defines a specific coordinate system on the stimulus state space, with the nature of that coordinate system depending on the machine's channels and detectors. Thus, machines with different channels and sensors "see" the same stimulus trajectory through state space, but in different machine-specific coordinate systems. It is shown that this requirement is almost certainly satisfied by any device that measures more than 2d independent properties of the stimulus, where d is the number of stimulus degrees of freedom. Taken together, the two conditions guarantee that the observing device can record the stimulus time series in its machine-specific coordinate system and then derive coordinate-system-independent (and, therefore, machine-independent) representations of relative stimulus configurations. The resulting description is an "inner" property of the stimulus time series in the sense that it does not depend on extrinsic factors such as the observer's choice of a coordinate system in which the stimulus is viewed (i.e., the observer's choice of channels and sensors). In other words, the resulting description is an intrinsic property of the evolution of the "real" stimulus that is "out there" broadcasting energy to the observer. This methodology is illustrated with analytic examples and with a numerically simulated experiment. In an intelligent sensory device, this kind of representation "engine" could function as a "front end" that passes channel- and sensor-independent stimulus representations to a pattern recognition module. After a pattern recognizer has been trained in one of these devices, it could be used without a change in other devices having different channels and sensors.
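
    As a concrete illustration of condition (1), here is a hedged sketch of estimating the local velocity covariance along a trajectory and using its inverse as a metric; the neighborhood selection and the Mahalanobis-style line element are illustrative choices, not the paper's construction:

    ```python
    import numpy as np

    def local_metric(traj, dt, center, radius=0.5):
        """Inverse velocity covariance near `center`: a local metric g(x)."""
        v = np.diff(traj, axis=0) / dt               # velocities along trajectory
        x = traj[:-1]
        near = np.linalg.norm(x - center, axis=1) < radius
        return np.linalg.inv(np.cov(v[near].T))

    def line_element(g, dx):
        """Squared length dl^2 = dx^T g dx of a small displacement."""
        return float(dx @ g @ dx)

    rng = np.random.default_rng(0)
    traj = np.cumsum(rng.normal(size=(5000, 2)), axis=0) * 0.01  # toy stimulus path
    g = local_metric(traj, dt=0.01, center=traj.mean(axis=0))
    print(line_element(g, np.array([0.01, 0.0])))
    ```

    Under an invertible, smooth change of coordinates, velocities pick up a Jacobian factor and the inverse covariance transforms oppositely, so the line element above is unchanged; this is the sense in which two machines with different sensors derive the same description.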

  7. Accumulate Repeat Accumulate Coded Modulation

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative coded modulation scheme called 'Accumulate Repeat Accumulate Coded Modulation' (ARA coded modulation). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, combined with high-level modulation. Thus, at the decoder, belief propagation can be used for iterative decoding of ARA coded modulation on a graph, provided a demapper transforms the received in-phase and quadrature samples into bit reliabilities.

  8. Video Coding for ESL.

    ERIC Educational Resources Information Center

    King, Kevin

    1992-01-01

    Coding tasks, a valuable technique for teaching English as a Second Language, are presented that enable students to look at patterns and structures of marital communication as well as objectively evaluate the degree of happiness or distress in the marriage. (seven references) (JL)

  9. The Redox Code

    PubMed Central

    Jones, Dean P.

    2015-01-01

    Abstract Significance: The redox code is a set of principles that defines the positioning of the nicotinamide adenine dinucleotide (NAD, NADP) and thiol/disulfide and other redox systems as well as the thiol redox proteome in space and time in biological systems. The code is richly elaborated in an oxygen-dependent life, where activation/deactivation cycles involving O2 and H2O2 contribute to spatiotemporal organization for differentiation, development, and adaptation to the environment. Disruption of this organizational structure during oxidative stress represents a fundamental mechanism in system failure and disease. Recent Advances: Methodology in assessing components of the redox code under physiological conditions has progressed, permitting insight into spatiotemporal organization and allowing for identification of redox partners in redox proteomics and redox metabolomics. Critical Issues: Complexity of redox networks and redox regulation is being revealed step by step, yet much still needs to be learned. Future Directions: Detailed knowledge of the molecular patterns generated from the principles of the redox code under defined physiological or pathological conditions in cells and organs will contribute to understanding the redox component in health and disease. Ultimately, there will be a scientific basis to a modern redox medicine. Antioxid. Redox Signal. 23, 734–746. PMID:25891126

  10. Electrical Circuit Simulation Code

    Energy Science and Technology Software Center (ESTSC)

    2001-08-09

    Massively-Parallel Electrical Circuit Simulation Code. CHILESPICE is a massively-parallel, distributed-memory electrical circuit simulation tool that contains many enhanced radiation, time-based, and thermal features and models. Keywords: large-scale electronic circuit simulation; shared memory; parallel processing; enhanced convergence; Sandia-specific device models.

  11. Odor Coding Sensor

    NASA Astrophysics Data System (ADS)

    Hayashi, Kenshi

    Odor is one of the important sensing parameters for human life. However, odor has not been quantified by measuring instruments because of its vagueness. In this paper, the measurement of odor using odor codes, vector quantities assembled from information about multiple odor molecules, is described, together with its applications.

  12. Sharing the Code.

    ERIC Educational Resources Information Center

    Olsen, Florence

    2003-01-01

    Colleges and universities are beginning to consider collaborating on open-source-code projects as a way to meet critical software and computing needs. Points out the attractive features of noncommercial open-source software and describes some examples in use now, especially for the creation of Web infrastructure. (SLD)

  13. Environmental Fluid Dynamics Code

    EPA Science Inventory

    The Environmental Fluid Dynamics Code (EFDC)is a state-of-the-art hydrodynamic model that can be used to simulate aquatic systems in one, two, and three dimensions. It has evolved over the past two decades to become one of the most widely used and technically defensible hydrodyn...

  14. Student Dress Codes.

    ERIC Educational Resources Information Center

    Uerling, Donald F.

    School officials see a need for regulations that prohibit disruptive and inappropriate forms of expression and attire; students see these regulations as unwanted restrictions on their freedom. This paper reviews court litigation involving constitutional limitations on school authority, dress and hair codes, state law constraints, and school…

  15. Multiple trellis coded modulation

    NASA Technical Reports Server (NTRS)

    Simon, Marvin K. (Inventor); Divsalar, Dariush (Inventor)

    1990-01-01

    A technique for designing trellis codes to minimize bit error performance for a fading channel. The invention provides a criterion which may be used in the design of such codes that is significantly different from that used for additive white Gaussian noise channels. The method of multiple trellis coded modulation of the present invention comprises the steps of: (a) coding b bits of input data into s intermediate outputs; (b) grouping said s intermediate outputs into k groups of s_i intermediate outputs each, where the sum of all the s_i is equal to s and k is equal to at least 2; (c) mapping each of said k groups of intermediate outputs into one of a plurality of symbols in accordance with a plurality of modulation schemes, one for each group, such that the first group is mapped in accordance with a first modulation scheme and the second group is mapped in accordance with a second modulation scheme; and (d) outputting each of said symbols to provide k output symbols for each b bits of input data.
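
    A toy rendering of steps (b) through (d), with k = 2 groups mapped through two different PSK constellations; the group sizes and constellations are arbitrary illustrations, and the step (a) encoder is omitted:

    ```python
    import numpy as np

    def psk(points):
        """Unit-energy PSK constellation with the given number of points."""
        return np.exp(2j * np.pi * np.arange(points) / points)

    QPSK, PSK8 = psk(4), psk(8)

    def map_outputs(bits):                 # s = 5 intermediate outputs
        g1, g2 = bits[:2], bits[2:5]       # s_1 = 2, s_2 = 3, so k = 2
        sym1 = QPSK[int("".join(map(str, g1)), 2)]   # first modulation scheme
        sym2 = PSK8[int("".join(map(str, g2)), 2)]   # second modulation scheme
        return sym1, sym2                  # k output symbols per input block

    print(map_outputs([1, 0, 1, 1, 0]))
    ```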

  16. The revised genetic code

    NASA Astrophysics Data System (ADS)

    Ninio, Jacques

    1990-03-01

    Recent findings on the genetic code are reviewed, including selenocysteine usage, deviations in the assignments of sense and nonsense codons, RNA editing, natural ribosomal frameshifts and non-orthodox codon-anticodon pairings. A multi-stage codon reading process is presented.

  17. Code of Ethics.

    ERIC Educational Resources Information Center

    Association of College Unions-International, Bloomington, IN.

    The code of ethics for the college union and student activities professional is presented by the Association of College Unions-International. The preamble identifies the objectives of the college union as providing campus community centers and social programs that enhance the quality of life for members of the academic community. Ethics for…

  18. Dress Codes and Uniforms.

    ERIC Educational Resources Information Center

    Lumsden, Linda; Miller, Gabriel

    2002-01-01

    Students do not always make choices that adults agree with in their choice of school dress. Dress-code issues are explored in this Research Roundup, and guidance is offered to principals seeking to maintain a positive school climate. In "Do School Uniforms Fit?" Kerry White discusses arguments for and against school uniforms and summarizes the…

  19. Splicing code modeling.

    PubMed

    Barash, Yoseph; Vaquero-Garcia, Jorge

    2014-01-01

    How do cis and trans elements involved in pre-mRNA splicing come together to form a splicing "code"? This question has been a driver of much of the research involving RNA biogenesis. The variability of splicing outcome across developmental stages and between tissues coupled with association of splicing defects with numerous diseases highlights the importance of such a code. However, the sheer number of elements involved in splicing regulation and the context-specific manner of their operation have made the derivation of such a code challenging. Recently, machine learning-based methods have been developed to infer computational models for a splicing code. These methods use high-throughput experiments measuring mRNA expression at exonic resolution and binding locations of RNA-binding proteins (RBPs) to infer what the regulatory elements that control the inclusion of a given pre-mRNA segment are. The inferred regulatory models can then be applied to genomic sequences or experimental conditions that have not been measured to predict splicing outcome. Moreover, the models themselves can be interrogated to identify new regulatory mechanisms, which can be subsequently tested experimentally. In this chapter, we survey the current state of this technology, and illustrate how it can be applied by non-computational or RNA splicing experts to study regulation of specific exons by using the AVISPA web tool. PMID:25201114

  20. Code Optimization Techniques

    SciTech Connect

    Magee, Glen I.

    2000-08-03

    Computers transfer data in a number of different ways. Whether through a serial port, a parallel port, over a modem, over an ethernet cable, or internally from a hard disk to memory, some data may be lost or corrupted. To compensate for that loss, numerous error detection and correction algorithms have been developed. One of the most common error correction codes is the Reed-Solomon code, which is a special subset of BCH (Bose-Chaudhuri-Hocquenghem) linear cyclic block codes. In the AURA project, an unmanned aircraft sends the data it collects back to Earth so it can be analyzed during flight and possible flight modifications made. To counter possible data corruption during transmission, the data is encoded using a multi-block Reed-Solomon implementation with a possibly shortened final block. In order to maximize the amount of data transmitted, it was necessary to reduce the computation time of a Reed-Solomon encoding to three percent of the processor's time. To achieve such a reduction, many code optimization techniques were employed. This paper outlines the steps taken to reduce the processing time of a Reed-Solomon encoding and the insight into modern optimization techniques gained from the experience.
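
    The "multi-block implementation with a possibly shortened final block" can be sketched as follows. This is not the AURA flight code; it is a minimal illustration that assumes the third-party reedsolo package and an RS(255, 223) configuration, i.e., 32 parity bytes per block.

      from reedsolo import RSCodec   # assumed third-party package

      NSYM = 32            # parity bytes per block -> RS(255, 223)
      K = 255 - NSYM       # data bytes per full block
      rsc = RSCodec(NSYM)

      def encode_multiblock(data: bytes) -> list:
          # Encode `data` as a sequence of RS blocks; the final block is
          # shortened (fewer than K data bytes) when len(data) % K != 0.
          return [bytes(rsc.encode(data[i:i + K])) for i in range(0, len(data), K)]

      blocks = encode_multiblock(bytes(500))  # 500 = 223 + 223 + 54 data bytes
      print([len(b) for b in blocks])         # [255, 255, 86]: data + 32 parity each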

  1. The Sign Rule and Beyond: Boundary Effects, Flexibility, and Noise Correlations in Neural Population Codes

    PubMed Central

    Hu, Yu; Zylberberg, Joel; Shea-Brown, Eric

    2014-01-01

    Over repeat presentations of the same stimulus, sensory neurons show variable responses. This “noise” is typically correlated between pairs of cells, and a question with rich history in neuroscience is how these noise correlations impact the population's ability to encode the stimulus. Here, we consider a very general setting for population coding, investigating how information varies as a function of noise correlations, with all other aspects of the problem – neural tuning curves, etc. – held fixed. This work yields unifying insights into the role of noise correlations. These are summarized in the form of theorems, and illustrated with numerical examples involving neurons with diverse tuning curves. Our main contributions are as follows. (1) We generalize previous results to prove a sign rule (SR) — if noise correlations between pairs of neurons have opposite signs vs. their signal correlations, then coding performance will improve compared to the independent case. This holds for three different metrics of coding performance, and for arbitrary tuning curves and levels of heterogeneity. This generality is true for our other results as well. (2) As also pointed out in the literature, the SR does not provide a necessary condition for good coding. We show that a diverse set of correlation structures can improve coding. Many of these violate the SR, as do experimentally observed correlations. There is structure to this diversity: we prove that the optimal correlation structures must lie on boundaries of the possible set of noise correlations. (3) We provide a novel set of necessary and sufficient conditions, under which the coding performance (in the presence of noise) will be as good as it would be if there were no noise present at all. PMID:24586128
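
    A two-neuron toy calculation (a sketch, not the paper's code) illustrates the sign rule via the linear Fisher information I = f'^T Sigma^{-1} f': with tuning slopes of opposite sign (negative signal correlation), positive noise correlation increases the information relative to the independent case.

      import numpy as np

      def linear_fisher_info(df, cov):
          # Linear Fisher information I = f'^T Sigma^{-1} f'.
          return df @ np.linalg.solve(cov, df)

      df = np.array([1.0, -1.0])   # opposite tuning slopes -> negative signal correlation
      for rho in (0.0, 0.5):       # independent vs. positively correlated noise
          cov = np.array([[1.0, rho], [rho, 1.0]])
          print(f"rho = {rho:+.1f}: I = {linear_fisher_info(df, cov):.2f}")
      # rho = +0.0 gives I = 2.00; rho = +0.5 gives I = 4.00. Noise correlations
      # whose sign opposes the signal correlation improve coding, as the SR predicts.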

  2. The Independent Technical Analysis Process

    SciTech Connect

    Duberstein, Corey A.; Ham, Kenneth D.; Dauble, Dennis D.; Johnson, Gary E.

    2007-04-13

    The Bonneville Power Administration (BPA) contracted with the Pacific Northwest National Laboratory (PNNL) to provide technical analytical support for system-wide fish passage information (BPA Project No. 2006-010-00). The goal of this project was to produce rigorous technical analysis products using independent analysts and anonymous peer reviewers. In the past, regional parties have interacted with a single entity, the Fish Passage Center, to access the data, analyses, and coordination related to fish passage. This project provided an independent technical source for non-routine fish passage analyses while allowing routine support functions to be performed by other well-qualified entities.

  3. Dimension independence in exterior algebra.

    PubMed Central

    Hawrylycz, M

    1995-01-01

    The identities between homogeneous expressions in rank 1 vectors and rank n - 1 covectors in a Grassmann-Cayley algebra of rank n, in which one set occurs multilinearly, are shown to represent a set of dimension-independent identities. The theorem yields an infinite set of nontrivial geometric identities from a given identity. PMID:11607520

  4. Instructional Materials in Independent Living.

    ERIC Educational Resources Information Center

    Smith, Bradley C.; Fry, Ronald R.

    This annotated list of 103 instructional materials for use in an independent living program focused on personal, social, and community adjustment of those with special needs is cross referenced using a subject index that lists skill areas within a fourteen-category system. Document descriptions are arranged alphabetically by author and include…

  5. Haptic Tracking Permits Bimanual Independence

    ERIC Educational Resources Information Center

    Rosenbaum, David A.; Dawson, Amanda A.; Challis, John H.

    2006-01-01

    This study shows that in a novel task--bimanual haptic tracking--neurologically normal human adults can move their 2 hands independently for extended periods of time with little or no training. Participants lightly touched buttons whose positions were moved either quasi-randomly in the horizontal plane by 1 or 2 human drivers (Experiment 1), in…

  6. 10 Questions about Independent Reading

    ERIC Educational Resources Information Center

    Truby, Dana

    2012-01-01

    Teachers know that establishing a robust independent reading program takes more than giving kids a little quiet time after lunch. But how do they set up a program that will maximize their students' gains? Teachers have to know their students' reading levels inside and out, help them find just-right books, and continue to guide them during…

  7. Boston: Cradle of American Independence

    ERIC Educational Resources Information Center

    Community College Journal, 2004

    2004-01-01

    The 2005 American Association of Community Colleges Annual Convention will be held April 6-9 in Boston. While thoroughly modern, the iconic city's identity is firmly rooted in the past. As the cradle of American independence, Boston's long history is an integral part of the American fabric. Adams, Revere, Hancock are more than historical figures;…

  8. Strategic Planning for Independent Schools.

    ERIC Educational Resources Information Center

    Stone, Susan C.

    This manual is intended to serve independent schools beginning strategic planning methods. Chapter 1, "The Case for Strategic Planning," suggests replacing the term "long range planning" with the term "strategic planning," which emphasizes change. The strategic planning and policy development process begins with careful organization to ensure…

  9. Field Independence: Reviewing the Evidence

    ERIC Educational Resources Information Center

    Evans, Carol; Richardson, John T. E.; Waring, Michael

    2013-01-01

    Background: The construct of field independence (FI) remains one of the most widely cited notions in research on cognitive style and on learning and instruction more generally. However, a great deal of confusion continues to exist around the definition of FI, its measurement, and the interpretation of research results, all of which have served to…

  10. High-Fidelity Coding with Correlated Neurons

    PubMed Central

    da Silveira, Rava Azeredo; Berry, Michael J.

    2014-01-01

    Positive correlations in the activity of neurons are widely observed in the brain. Previous studies have shown these correlations to be detrimental to the fidelity of population codes, or at best marginally favorable compared to independent codes. Here, we show that positive correlations can enhance coding performance by astronomical factors. Specifically, the probability of discrimination error can be suppressed by many orders of magnitude. Likewise, the number of stimuli encoded—the capacity—can be enhanced more than tenfold. These effects do not necessitate unrealistic correlation values, and can occur for populations with a few tens of neurons. We further show that both effects benefit from heterogeneity commonly seen in population activity. Error suppression and capacity enhancement rest upon a pattern of correlation. Tuning of one or several effective parameters can yield a limit of perfect coding: the corresponding pattern of positive correlation leads to a ‘lock-in’ of response probabilities that eliminates variability in the subspace relevant for stimulus discrimination. We discuss the nature of this pattern and we suggest experimental tests to identify it. PMID:25412463

  11. Preliminary Results for Coded Aperture Plasma Diagnostic

    NASA Astrophysics Data System (ADS)

    Haw, Magnus; Bellan, Paul

    2014-10-01

    A 1D coded aperture camera has been developed as a prototype for a high speed, wavelength-independent, plasma imaging diagnostic. Images are obtained via a coded or masked aperture that modulates incoming light to produce an invertible linear transform of the image on a detector. The system requires no lenses or mirrors and can be thought of as a multiplexed pinhole camera (with comparable resolution and greater signal than a single pinhole). The inexpensive custom-built system has a 13.1 cm field of view, a vertical spatial resolution of 2 mm, and a temporal resolution of 1 µs. Visible light images of the Caltech MHD-driven jet experiment agree with simultaneous images obtained with a conventional camera. For the simple jet geometry, the system can also extract depth information from single images. Further work will revolve around improving shielding and acquiring X-ray and EUV scintillators for imaging in those wavelengths. Supported by DOE, NSF.
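
    The invertible linear transform at the heart of such a system fits in a few lines (a toy model, not the Caltech instrument): a binary mask makes each detector element see a cyclically shifted mask shadow of the 1D scene, and the scene is recovered by solving the resulting linear system. A quadratic-residue mask of prime length is used here because the associated circulant matrix is provably invertible.

      import numpy as np

      p = 31                                    # prime mask length, p = 3 (mod 4)
      qr = {(k * k) % p for k in range(1, p)}   # quadratic residues mod p
      mask = np.array([1.0 if i in qr else 0.0 for i in range(p)])

      # Each detector element sums the scene through a cyclic shift of the mask,
      # so the measurement is a circulant linear transform of the scene.
      A = np.array([np.roll(mask, k) for k in range(p)])

      scene = np.zeros(p)
      scene[5], scene[20] = 1.0, 0.4            # two point sources
      detector = A @ scene                      # multiplexed "shadowgram"
      recovered = np.linalg.solve(A, detector)  # decode: invert the known transform
      print(np.allclose(recovered, scene))      # True in this noise-free toy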

  12. An experimental investigation of clocking effects on turbine aerodynamics using a modern 3-D one and one-half stage high pressure turbine for code verification and flow model development

    NASA Astrophysics Data System (ADS)

    Haldeman, Charles Waldo, IV

    2003-10-01

    This research uses a modern 1 and 1/2 stage high-pressure (HP) turbine operating at the proper design corrected speed, pressure ratio, and gas-to-metal temperature ratio to generate a detailed data set containing aerodynamic, heat-transfer, and aero-performance information. The data were generated using the Ohio State University Gas Turbine Laboratory Turbine Test Facility (TTF), which is a short-duration shock tunnel facility. The research program utilizes an uncooled turbine stage for which all three airfoils are heavily instrumented at multiple spans and on the HPV and LPV endwalls and HPB platform and tips. Heat-flux and pressure data are obtained using the traditional shock-tube and blowdown facility operational modes. Detailed examination shows that the aerodynamic (pressure) data obtained in the blowdown mode are the same as those obtained in the shock-tube mode when the corrected conditions are matched. Various experimental conditions and configurations were investigated, including LPV clocking positions, off-design corrected speed conditions, pressure ratio changes, and Reynolds number changes. The main research for this dissertation is concentrated on the LPV clocking experiments, in which the LPV was clocked relative to the HPV at several different passage locations and at different Reynolds numbers. Various methods were used to evaluate the effect of clocking on both the aero-performance (efficiency) and aerodynamics (pressure loading) of the LPV, including time-resolved measurements, time-averaged measurements, and stage performance measurements. A general improvement in overall efficiency of approximately 2% is demonstrated and could be observed using a variety of independent methods. Maximum efficiency is obtained when the time-averaged pressures are highest on the LPV and the time-resolved data, both in the time domain and the frequency domain, show the least amount of variation. The gain in aero-performance is obtained by integrating over the entire airfoil, as the three-dimensional effects on the LPV surface are significant.

  13. An Eye-Tracking Study of How Color Coding Affects Multimedia Learning

    ERIC Educational Resources Information Center

    Ozcelik, Erol; Karakus, Turkan; Kursun, Engin; Cagiltay, Kursat

    2009-01-01

    Color coding has been proposed to promote more effective learning. However, insufficient evidence currently exists to show how color coding leads to better learning. The goal of this study was to investigate the underlying cause of the color coding effect by utilizing eye movement data. Fifty-two participants studied either a color-coded or…

  15. Allocentric coding: spatial range and combination rules.

    PubMed

    Camors, D; Jouffrais, C; Cottereau, B R; Durand, J B

    2015-04-01

    When a visual target is presented with neighboring landmarks, its location can be determined both relative to the self (egocentric coding) and relative to these landmarks (allocentric coding). In the present study, we investigated (1) how allocentric coding depends on the distance between the targets and their surrounding landmarks (i.e. the spatial range) and (2) how allocentric and egocentric coding interact with each other across target-landmark distances (i.e. the combination rules). Subjects performed a memory-based pointing task toward previously gazed targets briefly superimposed (200 ms) on background images of cluttered city landscapes. A variable portion of the images was occluded in order to control the distance between the targets and the closest potential landmarks within those images. The pointing responses were performed after large saccades and the reappearance of the images at their initial location. However, in some trials, the images' elements were slightly shifted (±3°) in order to introduce a subliminal conflict between the allocentric and egocentric reference frames. The influence of allocentric coding on the pointing responses was found to decrease with increasing target-landmark distance, although it remained significant even at the largest distances (⩾10°). Interestingly, both the decreasing influence of allocentric coding and the concomitant increase in pointing response variability were well captured by a Bayesian model in which the weighted combination of allocentric and egocentric cues is governed by a coupling prior. PMID:25749676
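
    A reliability-weighted simplification of such a Bayesian combination can be sketched as follows (this is not the paper's coupling-prior model; the variances and the assumed growth of allocentric noise with target-landmark distance are made-up illustrative numbers):

      import numpy as np

      def combine(ego_est, allo_est, sigma_ego, sigma_allo):
          # Reliability-weighted cue combination: weights ~ inverse variance.
          w_ego, w_allo = 1 / sigma_ego**2, 1 / sigma_allo**2
          est = (w_ego * ego_est + w_allo * allo_est) / (w_ego + w_allo)
          var = 1 / (w_ego + w_allo)
          return est, var

      # Hypothetical assumption: allocentric noise grows with target-landmark distance.
      for dist in (2.0, 5.0, 10.0):             # degrees
          sigma_allo = 1.0 + 0.4 * dist
          est, var = combine(ego_est=0.0, allo_est=3.0,  # 3 deg cue conflict
                             sigma_ego=2.0, sigma_allo=sigma_allo)
          print(f"distance {dist:4.1f} deg: shift {est:.2f} deg, variance {var:.2f}")

    As in the study, the allocentric cue pulls the response less, and response variability grows, as the target-landmark distance increases.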

  16. Multiphysics Code Demonstrated for Propulsion Applications

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Melis, Matthew E.

    1998-01-01

    The utility of multidisciplinary analysis tools for aeropropulsion applications is being investigated at the NASA Lewis Research Center. The goal of this project is to apply Spectrum, a multiphysics code developed by Centric Engineering Systems, Inc., to simulate multidisciplinary effects in turbomachinery components. Many engineering problems today involve detailed computer analyses to predict the thermal, aerodynamic, and structural response of a mechanical system as it undergoes service loading. Analysis of aerospace structures generally requires attention in all three disciplinary areas to adequately predict component service behavior, and in many cases, the results from one discipline substantially affect the outcome of the other two. There are numerous computer codes currently available in the engineering community to perform such analyses in each of these disciplines. Many of these codes are developed and used in-house by a given organization, and many are commercially available. However, few, if any, of these codes are designed specifically for multidisciplinary analyses. The Spectrum code has been developed for performing fully coupled fluid, thermal, and structural analyses on a mechanical system with a single simulation that accounts for all simultaneous interactions, thus eliminating the requirement for running a large number of sequential, separate, disciplinary analyses. The Spectrum code has a true multiphysics analysis capability, which improves analysis efficiency as well as accuracy. Centric Engineering, Inc., working with a team of Lewis and AlliedSignal Engines engineers, has been evaluating Spectrum for a variety of propulsion applications including disk quenching, drum cavity flow, aeromechanical simulations, and a centrifugal compressor flow simulation.

  17. Wire Transport Code

    SciTech Connect

    Caporaso, G.J.; Cole, A.G.

    1983-03-01

    The Wire Transport Code was developed to study the dynamics of relativistic-electron-beam propagation in the transport tube in which a wire-conditioning zone is present. In order for the beam to propagate successfully in the transport section it must be matched onto the wire by focusing elements. The beam must then be controlled by strong lenses as it exits the wire zone. The wire transport code was developed to model this process in substantial detail. It is able to treat axially symmetric problems as well as those in which the beam is transversely displaced from the axis of the transport tube. The focusing effects of foils and various beamline lenses are included in the calculations.

  18. Confocal coded aperture imaging

    DOEpatents

    Tobin, Jr., Kenneth William; Thomas, Jr., Clarence E.

    2001-01-01

    A method for imaging a target volume comprises the steps of: radiating a small bandwidth of energy toward the target volume; focusing the small bandwidth of energy into a beam; moving the target volume through a plurality of positions within the focused beam; collecting a beam of energy scattered from the target volume with a non-diffractive confocal coded aperture; generating a shadow image of said aperture from every point source of radiation in the target volume; and, reconstructing the shadow image into a 3-dimensional image of the every point source by mathematically correlating the shadow image with a digital or analog version of the coded aperture. The method can comprise the step of collecting the beam of energy scattered from the target volume with a Fresnel zone plate.

  19. Reading a neural code.

    PubMed

    Bialek, W; Rieke, F; de Ruyter van Steveninck, R R; Warland, D

    1991-06-28

    Traditional approaches to neural coding characterize the encoding of known stimuli in average neural responses. Organisms face nearly the opposite task--extracting information about an unknown time-dependent stimulus from short segments of a spike train. Here the neural code was characterized from the point of view of the organism, culminating in algorithms for real-time stimulus estimation based on a single example of the spike train. These methods were applied to an identified movement-sensitive neuron in the fly visual system. Such decoding experiments determined the effective noise level and fault tolerance of neural computation, and the structure of the decoding algorithms suggested a simple model for real-time analog signal processing with spiking neurons. PMID:2063199
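
    The decoding idea, estimating the stimulus as a sum of kernels placed at spike times, can be sketched as below. This is schematic only: the encoder is a toy Poisson model and the kernel is an arbitrary exponential, whereas in such experiments the kernel is fit to minimize reconstruction error.

      import numpy as np

      dt = 0.001                                # 1 ms bins
      t = np.arange(0.0, 1.0, dt)
      stimulus = np.sin(2 * np.pi * 3 * t)      # toy time-dependent stimulus

      # Toy encoder: inhomogeneous Poisson spiking driven by the stimulus.
      rng = np.random.default_rng(1)
      rate = 40.0 * (1 + stimulus)              # spikes/s, non-negative
      spikes = rng.random(t.size) < rate * dt   # 0/1 spike train

      # Linear decoding: s_est(t) = sum over spike times t_i of K(t - t_i),
      # i.e., the spike train convolved with a kernel K.
      kernel = np.exp(-np.arange(0.0, 0.3, dt) / 0.05)
      estimate = np.convolve(spikes.astype(float), kernel)[: t.size]
      estimate = (estimate - estimate.mean()) / estimate.std()
      print(np.corrcoef(estimate, stimulus)[0, 1])   # crude reconstruction fidelity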

  20. Coding isotropic images

    NASA Technical Reports Server (NTRS)

    Oneal, J. B., Jr.; Natarajan, T. R.

    1976-01-01

    Rate distortion functions for two-dimensional homogeneous isotropic images are compared with the performance of five source encoders designed for such images. Both unweighted and frequency-weighted mean square error distortion measures are considered. The coders considered are: differential PCM (DPCM) using six previous samples in the prediction, herein called 6 pel (picture element) DPCM; simple DPCM using single-sample prediction; 6 pel DPCM followed by entropy coding; an 8 x 8 discrete cosine transform coder; and a 4 x 4 Hadamard transform coder. Other transform coders were studied and found to have about the same performance as the two transform coders above. With the mean square error distortion measure, DPCM with entropy coding performed best. The relative performance of the coders changes slightly when the distortion measure is frequency-weighted mean square error. The performance of all the coders was separated by only about 4 dB.
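
    For reference, the "simple DPCM using single-sample prediction" scheme compared above can be sketched in a few lines (illustrative quantizer step and scan line, not the paper's coder):

      import numpy as np

      def dpcm(signal, step=4):
          # Predict each sample by the previous reconstruction, quantize the
          # prediction error, and track the decoder-side reconstruction.
          codes = np.zeros(len(signal), dtype=int)
          recon = np.zeros(len(signal))
          prev = 0.0
          for i, x in enumerate(signal):
              codes[i] = int(round((x - prev) / step))  # code to transmit
              prev += codes[i] * step                   # decoder reconstruction
              recon[i] = prev
          return codes, recon

      line = np.array([100, 102, 105, 110, 118, 120, 119, 90], dtype=float)
      codes, recon = dpcm(line)
      print(codes)   # small integers clustered near 0, cheap to entropy-code
      print(recon)

    The 6 pel variant replaces the single-sample predictor with a linear combination of six previously reconstructed samples.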

  1. Adaptive compression coding

    NASA Astrophysics Data System (ADS)

    Nasiopoulos, Panos; Ward, Rabab K.; Morse, Daryl J.

    1991-08-01

    A compression technique which preserves edges in compressed pictures is developed. The proposed compression algorithm adapts itself to the local nature of the image. Smooth regions are represented by their averages, and edges are preserved using quadtrees. Textured regions are encoded using BTC (block truncation coding) and a modification of BTC using look-up tables. The threshold is applied to the range, i.e., the difference between the maximum and minimum grey levels in a 4 x 4 pixel quadrant. At the recommended value of the threshold (equal to 18), the quality of the compressed texture regions is very high, the same as that of AMBTC (absolute moment block truncation coding), but the edge preservation quality is far superior to that of AMBTC. Compression levels below 0.5-0.8 b/pixel may be achieved.
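
    The AMBTC building block used for textured regions reduces each 4 x 4 block to a bit plane plus two grey levels. A minimal sketch follows, using the standard moment-preserving rule (means of the pixels below and at-or-above the block mean); the example block is illustrative:

      import numpy as np

      def ambtc_block(block):
          # Keep a bit plane plus the means of the low and high pixel groups.
          mean = block.mean()
          plane = block >= mean
          hi = block[plane].mean()
          lo = block[~plane].mean() if (~plane).any() else hi
          return plane, lo, hi

      def ambtc_decode(plane, lo, hi):
          return np.where(plane, hi, lo)

      block = np.array([[12, 14, 200, 202],
                        [13, 15, 201, 199],
                        [12, 16, 198, 203],
                        [14, 13, 202, 200]], dtype=float)  # block containing an edge
      plane, lo, hi = ambtc_block(block)
      print(ambtc_decode(plane, lo, hi))   # edge preserved with two grey levels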

  2. Finite Element Analysis Code

    Energy Science and Technology Software Center (ESTSC)

    2006-03-08

    MAPVAR-KD is designed to transfer solution results from one finite element mesh to another. MAPVAR-KD draws heavily from the structure and coding of MERLIN II, but it employs a new finite element data base, EXODUS II, and offers enhanced speed and new capabilities not available in MERLIN II. In keeping with the MERLIN II documentation, the computational algorithms used in MAPVAR-KD are described. User instructions are presented. Example problems are included to demonstrate the operation of the code and the effects of various input options. MAPVAR-KD is a modification of MAPVAR in which the search algorithm was replaced by a kd-tree-based search for better performance on large problems.

  3. On quantum network coding

    NASA Astrophysics Data System (ADS)

    Jain, Avinash; Franceschetti, Massimo; Meyer, David A.

    2011-03-01

    We study the problem of error-free multiple unicast over directed acyclic networks in a quantum setting. We provide a new information-theoretic proof of the known result that network coding does not achieve a larger quantum information flow than what can be achieved by routing for two-pair communication on the butterfly network. We then consider a k-pair multiple unicast problem and for all k ⩾ 2 we show that there exists a family of networks where quantum network coding achieves k-times larger quantum information flow than what can be achieved by routing. Finally, we specify a graph-theoretic sufficient condition for the quantum information flow of any multiple unicast problem to be bounded by the capacity of any sparsest multicut of the network.
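
    For intuition, the classical butterfly-network result that the quantum setting is benchmarked against fits in a few lines: the single bottleneck edge carries the XOR of the two source bits, and each sink combines it with the bit it hears directly. (This is the classical sketch only; the paper shows that for two pairs the quantum analogue gains nothing over routing.)

      def butterfly(b1: int, b2: int):
          # Classical network coding on the butterfly network: sources send
          # b1 and b2; the shared bottleneck edge carries b1 XOR b2.
          bottleneck = b1 ^ b2
          sink1 = bottleneck ^ b2   # sink 1 also receives b2 directly -> b1
          sink2 = bottleneck ^ b1   # sink 2 also receives b1 directly -> b2
          return sink1, sink2

      for b1 in (0, 1):
          for b2 in (0, 1):
              assert butterfly(b1, b2) == (b1, b2)   # both unicasts succeed
      print("two simultaneous unicasts through one shared edge")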

  4. Quantum error-correcting codes over mixed alphabets

    NASA Astrophysics Data System (ADS)

    Wang, Zhuo; Yu, Sixia; Fan, Heng; Oh, C. H.

    2013-08-01

    We study quantum error-correcting codes over mixed alphabets to deal with a more complicated and practical situation in which the physical systems used for encoding may have different numbers of energy levels. In particular, we investigate their constructions and develop the theory of the quantum Singleton bound. Two kinds of code constructions are presented: a projection-based construction for the general case, and a graphical construction, based on a graph-theoretical object called a composite coding clique, for the case of reducible alphabets. We find some optimal one-error-correcting or -detecting codes over two alphabets. Our method of composite coding cliques also sheds light on the construction of standard quantum error-correcting codes, and other families of optimal codes are found.

  5. The TESS (Tandem Experiment Simulation Studies) computer code user's manual

    SciTech Connect

    Procassini, R.J.; Cohen, B.I.

    1990-06-01

    TESS (Tandem Experiment Simulation Studies) is a one-dimensional, bounded particle-in-cell (PIC) simulation code designed to investigate the confinement and transport of plasma in a magnetic mirror device, including tandem mirror configurations. Mirror plasmas may be modeled in a system which includes an applied magnetic field and/or a self-consistent or applied electrostatic potential. The PIC code TESS is similar to the PIC code DIPSI (Direct Implicit Plasma Surface Interactions) which is designed to study plasma transport to and interaction with a solid surface. The codes TESS and DIPSI are direct descendants of the PIC code ES1 that was created by A. B. Langdon. This document provides the user with a brief description of the methods used in the code and a tutorial on the use of the code. 10 refs., 2 tabs.
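
    The deposit/solve/push loop at the core of ES1-descended PIC codes looks schematically like the toy below. This is emphatically not TESS: it uses normalized units, periodic (not bounded) boundaries, nearest-grid-point weighting, and a simple explicit push.

      import numpy as np

      rng = np.random.default_rng(2)
      ng, npart, L, dt = 64, 10000, 2 * np.pi, 0.1
      dx = L / ng
      x = rng.uniform(0, L, npart)      # electron positions
      v = rng.normal(0, 1, npart)       # electron velocities

      for _ in range(100):
          # 1) deposit charge (NGP weighting; uniform neutralizing background)
          cells = (x / dx).astype(int) % ng
          rho = np.bincount(cells, minlength=ng) * (L / npart) / dx - 1.0
          # 2) solve Gauss's law dE/dx = rho on the periodic grid via FFT
          k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
          rho_k = np.fft.fft(rho)
          E_k = np.zeros_like(rho_k)
          E_k[1:] = rho_k[1:] / (1j * k[1:])
          E = np.real(np.fft.ifft(E_k))
          # 3) gather the field and push the particles (charge -1, mass 1)
          v -= E[cells] * dt
          x = (x + v * dt) % L

      print("field energy:", 0.5 * np.sum(E**2) * dx)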

  6. VAC: Versatile Advection Code

    NASA Astrophysics Data System (ADS)

    Tóth, Gábor; Keppens, Rony

    2012-07-01

    The Versatile Advection Code (VAC) is a freely available general hydrodynamic and magnetohydrodynamic simulation software that works in 1, 2 or 3 dimensions on Cartesian and logically Cartesian grids. VAC runs on any Unix/Linux system with a Fortran 90 (or 77) compiler and Perl interpreter. VAC can run on parallel machines using either the Message Passing Interface (MPI) library or a High Performance Fortran (HPF) compiler.

  7. Reeds computer code

    NASA Technical Reports Server (NTRS)

    Bjork, C.

    1981-01-01

    The REEDS (rocket exhaust effluent diffusion single layer) computer code is used for the estimation of certain rocket exhaust effluent concentrations and dosages and their distributions near the Earth's surface following a rocket launch event. Output from REEDS is used in producing near real time air quality and environmental assessments of the effects of certain potentially harmful effluents, namely HCl, Al2O3, CO, and NO.

  8. Status of MARS Code

    SciTech Connect

    N.V. Mokhov

    2003-04-09

    Status and recent developments of the MARS 14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator and detector components in the energy range from a fraction of an electronvolt up to 100 TeV are described. These include physics models in both the strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, histogramming, the MAD-MARS Beam Line Builder, and a graphical user interface.

  9. Bar coded retroreflective target

    DOEpatents

    Vann, Charles S.

    2000-01-01

    This small, inexpensive, non-contact laser sensor can detect the location of a retroreflective target in a relatively large volume and up to six degrees of position. The tracker's laser beam is formed into a plane of light which is swept across the space of interest. When the beam illuminates the retroreflector, some of the light returns to the tracker. The intensity, angle, and time of the return beam is measured to calculate the three dimensional location of the target. With three retroreflectors on the target, the locations of three points on the target are measured, enabling the calculation of all six degrees of target position. Until now, devices for three-dimensional tracking of objects in a large volume have been heavy, large, and very expensive. Because of the simplicity and unique characteristics of this tracker, it is capable of three-dimensional tracking of one to several objects in a large volume, yet it is compact, light-weight, and relatively inexpensive. Alternatively, a tracker produces a diverging laser beam which is directed towards a fixed position, and senses when a retroreflective target enters the fixed field of view. An optically bar coded target can be read by the tracker to provide information about the target. The target can be formed of a ball lens with a bar code on one end. As the target moves through the field, the ball lens causes the laser beam to scan across the bar code.

  10. Moment code BEDLAM

    SciTech Connect

    Channell, P.J.; Healy, L.M.; Lysenko, W.P.

    1985-01-01

    BEDLAM is a fourth-order moment simulation code. The beam at the input to a linear accelerator is specified as a collection of moments of the phase-space distribution. Then the moment equations, which describe the time evolution of the moments, are numerically integrated. No particles are traced in this approach. The accuracy of the computed distribution, the external forces, and the space-charge forces are computed consistently to a given order. Although BEDLAM includes moments to fourth order only, it could be systematically extended to any order. Another feature of this method is that physically interesting and intuitive quantities, such as beam sizes and rms emittances, are computed directly. This paper describes the status of BEDLAM and presents the results of some tests. We simulated a section of radio-frequency quadrupole (RFQ) linac, neglecting space charge, to test the new code. Agreement with a Particle-In-Cell (PIC) simulation was excellent. We also verified that the fourth-order solution is more accurate than the second-order solution, which indicates the convergence of the method. We believe these results justify the continued development of moment simulation codes.
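
    The idea of integrating moment equations instead of tracing particles can be shown at second order for a linear focusing channel x'' = -k x, where the moment hierarchy closes (a sketch only; BEDLAM carries moments to fourth order and includes space charge). For linear forces the rms emittance computed from the moments is conserved.

      import numpy as np

      k = 2.0                        # focusing strength in x'' = -k x
      def deriv(m):
          xx, xxp, xpxp = m          # <x^2>, <x x'>, <x'^2>
          return np.array([2 * xxp, xpxp - k * xx, -2 * k * xxp])

      m = np.array([1.0, 0.0, 0.25]) # initial second moments
      ds = 1e-3
      for _ in range(5000):          # integrate with classical Runge-Kutta
          k1 = deriv(m); k2 = deriv(m + 0.5 * ds * k1)
          k3 = deriv(m + 0.5 * ds * k2); k4 = deriv(m + ds * k3)
          m += (ds / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

      emit = np.sqrt(m[0] * m[2] - m[1] ** 2)   # rms emittance
      print(m, emit)                            # emit stays at its initial 0.5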

  11. Orthopedics coding and funding.

    PubMed

    Baron, S; Duclos, C; Thoreux, P

    2014-02-01

    The French tarification à l'activité (T2A) prospective payment system is a financial system in which a health-care institution's resources are based on performed activity. Activity is described via the PMSI medical information system (programme de médicalisation du système d'information). The PMSI classifies hospital cases by clinical and economic categories known as diagnosis-related groups (DRG), each with an associated price tag. Coding a hospital case involves giving as realistic a description as possible so as to categorize it in the right DRG and thus ensure appropriate payment. For this, it is essential to understand what determines the pricing of inpatient stay: namely, the code for the surgical procedure, the patient's principal diagnosis (reason for admission), codes for comorbidities (everything that adds to management burden), and the management of the length of inpatient stay. The PMSI is used to analyze the institution's activity and dynamism: change on previous year, relation to target, and comparison with competing institutions based on indicators such as the mean length of stay performance indicator (MLS PI). The T2A system improves overall care efficiency. Quality of care, however, is not presently taken account of in the payment made to the institution, as there are no indicators for this; work needs to be done on this topic. PMID:24461230

  12. MELCOR computer code manuals

    SciTech Connect

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  13. Progress in cultivation-independent phyllosphere microbiology.

    PubMed

    Müller, Thomas; Ruppel, Silke

    2014-01-01

    Most microorganisms of the phyllosphere are nonculturable in commonly used media and culture conditions, as are those in other natural environments. This review queries the reasons for their 'noncultivability' and assesses developments in phyllosphere microbiology that have been achieved cultivation-independently over the last 4 years. Analyses of total microbial communities have revealed a comprehensive microbial diversity. 16S rRNA gene amplicon sequencing and metagenomic sequencing were applied to investigate plant species, location and season as variables affecting the composition of these communities. In continuation of culture-based enzymatic and metabolic studies with individual isolates, metaproteogenomic approaches reveal a great potential to study the physiology of microbial communities in situ. Culture-independent microbiological technologies, as well as advances in plant genetics and biochemistry, provide methodological preconditions for exploring the interactions between plants and their microbiome in the phyllosphere. Improving and combining cultivation and culture-independent techniques can contribute to a better understanding of phyllosphere ecology. This is essential, for example, to avoid human-pathogenic bacteria in plant food. PMID:24003903

  15. Experimental measurement-device-independent entanglement detection.

    PubMed

    Nawareg, Mohamed; Muhammad, Sadiq; Amselem, Elias; Bourennane, Mohamed

    2015-01-01

    Entanglement is one of the most puzzling features of quantum theory and of great importance for the new field of quantum information. Determining whether a given state is entangled or not is one of the most challenging open problems of the field. Here we report on the experimental demonstration of measurement-device-independent (MDI) entanglement detection using the witness method for general two-qubit photon polarization systems. In the MDI setting, there is no requirement to assume perfect implementations, nor to trust the measurement devices. This experimental demonstration can be generalized for the investigation of properties of quantum systems and for the realization of cryptography and communication protocols. PMID:25649664

  18. COLD-SAT Dynamic Model Computer Code

    NASA Technical Reports Server (NTRS)

    Bollenbacher, G.; Adams, N. S.

    1995-01-01

    COLD-SAT Dynamic Model (CSDM) computer code implements six-degree-of-freedom, rigid-body mathematical model for simulation of spacecraft in orbit around Earth. Investigates flow dynamics and thermodynamics of subcritical cryogenic fluids in microgravity. Consists of three parts: translation model, rotation model, and slosh model. Written in FORTRAN 77.

  19. Preliminary Assessment of Turbomachinery Codes

    NASA Technical Reports Server (NTRS)

    Mazumder, Quamrul H.

    2007-01-01

    This report assesses different CFD codes developed and currently being used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior. This report will consider the following codes: APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code will be described separately in the following section with its current modeling capabilities, level of validation, pre/post processing, and future development and validation requirements. This report addresses only previously published validations of the codes; however, the codes have since been further developed to extend their capabilities.

  20. Polynomial weights and code constructions.

    NASA Technical Reports Server (NTRS)

    Massey, J. L.; Costello, D. J., Jr.; Justesen, J.

    1973-01-01

    Study of certain polynomials with the 'weight-retaining' property that any linear combination of these polynomials with coefficients in a general finite field has Hamming weight at least as great as that of the minimum-degree polynomial included. This fundamental property is used in applications to Reed-Muller codes, a new class of 'repeated-root' binary cyclic codes, two new classes of binary convolutional codes derived from binary cyclic codes, and two new classes of binary convolutional codes derived from Reed-Solomon codes.
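
    The weight-retaining property is easy to check numerically over GF(2), where a linear combination is just an XOR of coefficient masks (a brute-force sketch on an illustrative family, (1 + x)^(2^j), not the paper's general finite-field setting):

      from itertools import combinations

      # Polynomials over GF(2) as bit masks: bit i is the coefficient of x^i.
      # polys is sorted by degree: (1+x), (1+x)^2 = 1+x^2, (1+x)^4 = 1+x^4.
      polys = [0b11, 0b101, 0b10001]

      def weight(p):
          # Hamming weight = number of nonzero coefficients.
          return bin(p).count("1")

      ok = True
      for r in range(1, len(polys) + 1):
          for combo in combinations(range(len(polys)), r):
              s = 0
              for i in combo:
                  s ^= polys[i]      # GF(2) linear combination
              # min(combo) indexes the minimum-degree polynomial included.
              ok &= weight(s) >= weight(polys[min(combo)])
      print("weight-retaining on this family:", ok)   # True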

  1. Anatomy Of A Code Block

    NASA Astrophysics Data System (ADS)

    Cortez, Edward

    1981-12-01

    The purpose of this paper is to present a short definition of a MIL-STD-782 code block, to introduce the advantages of using a code block, to discuss the generation of a code block, and to identify two problems that have limited the widespread use of code block data annotation: the limited throughput of information retrieval and the error rate associated with this retrieval. Finally, an automatic code block reader, useful in alleviating these two problems, is identified. With the introduction of the automatic reader, the use of the code block, long held in its infancy, can grow to full maturity.

  2. Object features fail independently in visual working memory: evidence for a probabilistic feature-store model.

    PubMed

    Fougnie, Daryl; Alvarez, George A

    2011-01-01

    The world is composed of features and objects and this structure may influence what is stored in working memory. It is widely believed that the content of memory is object-based: Memory stores integrated objects, not independent features. We asked participants to report the color and orientation of an object and found that memory errors were largely independent: Even when one of the object's features was entirely forgotten, the other feature was often reported. This finding contradicts object-based models and challenges fundamental assumptions about the organization of information in working memory. We propose an alternative framework involving independent self-sustaining representations that may fail probabilistically and independently for each feature. This account predicts that the degree of independence in feature storage is determined by the degree of overlap in neural coding during perception. Consistent with this prediction, we found that errors for jointly encoded dimensions were less independent than errors for independently encoded dimensions. PMID:21980189

  3. Development of the Code RITRACKS

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Cucinotta, Francis A.

    2013-01-01

    A document discusses the code RITRACKS (Relativistic Ion Tracks), which was developed to simulate heavy ion track structure at the microscopic and nanoscopic scales. It is a Monte-Carlo code that simulates the production of radiolytic species in water, event-by-event, and which may be used to simulate tracks and also to calculate dose in targets and voxels of different sizes. The dose deposited by the radiation can be calculated in nanovolumes (voxels). RITRACKS allows simulation of radiation tracks without the need of extensive knowledge of computer programming or Monte-Carlo simulations. It is installed as a regular application on Windows systems. The main input parameters entered by the user are the type and energy of the ion, the length and size of the irradiated volume, the number of ions impacting the volume, and the number of histories. The simulation can be started after the input parameters are entered in the GUI. The number of each kind of interactions for each track is shown in the result details window. The tracks can be visualized in 3D after the simulation is complete. It is also possible to see the time evolution of the tracks and zoom on specific parts of the tracks. The software RITRACKS can be very useful for radiation scientists to investigate various problems in the fields of radiation physics, radiation chemistry, and radiation biology. For example, it can be used to simulate electron ejection experiments (radiation physics).

  4. Neural Coding for Effective Rehabilitation

    PubMed Central

    2014-01-01

    Successful neurological rehabilitation depends on accurate diagnosis, effective treatment, and quantitative evaluation. Neural coding, a technology for interpretation of functional and structural information of the nervous system, has contributed to the advancements in neuroimaging, brain-machine interface (BMI), and design of training devices for rehabilitation purposes. In this review, we summarized the latest breakthroughs in neuroimaging from microscale to macroscale levels with potential diagnostic applications for rehabilitation. We also reviewed the achievements in electrocorticography (ECoG) coding with both animal models and human beings for BMI design, electromyography (EMG) interpretation for interaction with external robotic systems, and robot-assisted quantitative evaluation on the progress of rehabilitation programs. Future rehabilitation would be more home-based, automatic, and self-served by patients. Further investigations and breakthroughs are mainly needed in aspects of improving the computational efficiency in neuroimaging and multichannel ECoG by selection of localized neuroinformatics, validation of the effectiveness in BMI guided rehabilitation programs, and simplification of the system operation in training devices. PMID:25258708

  5. The APS SASE FEL: modeling and code comparison.

    SciTech Connect

    Biedron, S. G.

    1999-04-20

    A self-amplified spontaneous emission (SASE) free-electron laser (FEL) is under construction at the Advanced Photon Source (APS). Five FEL simulation codes were used in the design phase: GENESIS, GINGER, MEDUSA, RON, and TDA3D. Initial comparisons between each of these independent formulations show good agreement for the parameters of the APS SASE FEL.

  6. Performance improvement of spectral amplitude coding-optical code division multiple access systems using NAND detection with enhanced double weight code

    NASA Astrophysics Data System (ADS)

    Ahmed, Nasim; Aljunid, Syed Alwee; Ahmad, R. Badlishah; Fadhil, Hilal A.; Rashid, Mohd Abdur

    2012-01-01

    The bit-error rate (BER) performance of the spectral amplitude coding optical code division multiple access (SAC-OCDMA) system has been investigated using the NAND subtraction detection technique with the enhanced double weight (EDW) code. The EDW code is the enhanced version of the double weight (DW) code family, in which the code weight is any odd number greater than one, with ideal cross-correlation. In order to evaluate the performance of the system, we used extensive mathematical analysis along with simulation experiments. The evaluation results obtained using the NAND subtraction detection technique were compared with those obtained using the complementary detection technique for the same number of active users. The comparison revealed that the BER performance of the system using the NAND subtraction detection technique is greatly improved compared to the complementary technique.

  7. Independent clinical trials: a commentary.

    PubMed

    Cocconi, Giorgio

    2002-01-01

    The so-called norms of good clinical practice have been incorporated into the Italian regulatory legislation governing clinical trials sponsored by pharmaceutical companies, but there are no legislative provisions governing independent clinical trials, i.e., those not sponsored by industry. The pharmaceutical industry has recently increased considerably its commitment to sponsored trials by establishing a series of economic relationships with individual researchers and hospital or university institutions. It has also set up and strengthened a series of bodies and service companies with the aim of making the clinical trials "machine" more efficient. Such developments have aroused alarm in the medical literature because of the risk that they may have negative effects on the freedom of research and research results. At the same time, there is also the risk that independent clinical trials will be greatly penalized by having to compete with sponsored trials in terms of patient enrollment, and because they currently face a series of difficulties connected with the lack or scarcity of economic resources provided by the State or non-profit organizations, with problems relating to patient insurance, and with the availability of the necessary drugs. However, the objective of independent trials is to improve the medical art by answering specific diagnostic and therapeutic questions, whereas that of industry-sponsored trials is to generate money, directly or indirectly, by means of the registration of new drugs. It is therefore now necessary to ensure better surveillance of the influence of pharmaceutical companies over the trials they sponsor (as a minimum, by ensuring the transparency of a series of potential conflicts of interest between them and clinical researchers) and, simultaneously, to protect independent trials from coming to an inglorious end by means of specific support initiatives such as those proposed in this article. PMID:12088263

  8. Steps to Independent Living Series.

    ERIC Educational Resources Information Center

    Lobb, Nancy

    This set of six activity books and a teacher's guide is designed to help students from eighth grade to adulthood with special needs to learn independent living skills. The activity books have a reading level of 2.5 and address: (1) "How to Get Well When You're Sick or Hurt," including how to take a temperature, see a doctor, and use medicines…

  9. Independent evolution of four heme peroxidase superfamilies

    PubMed Central

    Zámocký, Marcel; Hofbauer, Stefan; Schaffner, Irene; Gasselhuber, Bernhard; Nicolussi, Andrea; Soudi, Monika; Pirker, Katharina F.; Furtmüller, Paul G.; Obinger, Christian

    2015-01-01

    Four heme peroxidase superfamilies (the peroxidase–catalase, peroxidase–cyclooxygenase, peroxidase–chlorite dismutase and peroxidase–peroxygenase superfamilies) arose independently during evolution; they differ in overall fold, active site architecture and enzymatic activities. The redox cofactor is heme b or posttranslationally modified heme that is ligated by either histidine or cysteine. Heme peroxidases are found in all kingdoms of life and typically catalyze the one- and two-electron oxidation of a myriad of organic and inorganic substrates. In addition to this peroxidatic activity, distinct (sub)families show pronounced catalase, cyclooxygenase, chlorite dismutase or peroxygenase activities. Here we describe the phylogeny of these four superfamilies and present the most important sequence signatures and active site architectures. The classification of families is described, as well as important turning points in evolution. We show that at least three heme peroxidase superfamilies have ancient prokaryotic roots with several alternative ways of divergent evolution. In later evolutionary steps, they almost always produced highly evolved and specialized clades of peroxidases in eukaryotic kingdoms, with a significant portion of such genes coding for various fusion proteins with novel physiological functions. PMID:25575902

  10. Face Recognition by Independent Component Analysis

    PubMed Central

    Bartlett, Marian Stewart; Movellan, Javier R.; Sejnowski, Terrence J.

    2010-01-01

    A number of current face recognition algorithms use face representations found by unsupervised statistical methods. Typically these methods find a set of basis images and represent faces as a linear combination of those images. Principal component analysis (PCA) is a popular example of such methods. The basis images found by PCA depend only on pairwise relationships between pixels in the image database. In a task such as face recognition, in which important information may be contained in the high-order relationships among pixels, it seems reasonable to expect that better basis images may be found by methods sensitive to these high-order statistics. Independent component analysis (ICA), a generalization of PCA, is one such method. We used a version of ICA derived from the principle of optimal information transfer through sigmoidal neurons. ICA was performed on face images in the FERET database under two different architectures, one which treated the images as random variables and the pixels as outcomes, and a second which treated the pixels as random variables and the images as outcomes. The first architecture found spatially local basis images for the faces. The second architecture produced a factorial face code. Both ICA representations were superior to representations based on PCA for recognizing faces across days and changes in expression. A classifier that combined the two ICA representations gave the best performance. PMID:18244540
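
    The two architectures differ only in which axis of the image matrix plays the role of the random variable, which in code is a transpose. A minimal sketch with scikit-learn's FastICA (an assumed dependency) and random data standing in for the FERET faces:

      import numpy as np
      from sklearn.decomposition import FastICA   # assumed third-party package

      rng = np.random.default_rng(3)
      X = rng.laplace(size=(200, 1024))   # stand-in for 200 images of 32x32 pixels

      ica = FastICA(n_components=20, random_state=0, max_iter=1000)

      # Architecture I: images are random variables, pixels are outcomes;
      # the recovered sources are spatially local basis images.
      basis_images = ica.fit_transform(X.T).T     # shape (20, 1024)

      # Architecture II: pixels are random variables, images are outcomes;
      # each image gets a vector of independent coefficients (a factorial code).
      factorial_code = ica.fit_transform(X)       # shape (200, 20)

      print(basis_images.shape, factorial_code.shape)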

  11. On decoding of multi-level MPSK modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Gupta, Alok Kumar

    1990-01-01

    The decoding problem of multi-level block modulation codes is investigated. The hardware design of a soft-decision Viterbi decoder for some short-length 8-PSK block modulation codes is presented. An effective way to reduce the hardware complexity of the decoder, by reducing the branch metric and path metric using a non-uniform floating-point to integer mapping scheme, is proposed and discussed. The simulation results of the design are presented. The multi-stage decoding (MSD) of multi-level modulation codes is also investigated. The cases of soft-decision and hard-decision MSD are considered and their performance is evaluated for several codes of different lengths and different minimum squared Euclidean distances. It is shown that soft-decision MSD reduces the decoding complexity drastically and is suboptimal. Hard-decision MSD further simplifies the decoding while still maintaining a reasonable coding gain over the uncoded system, if the component codes are chosen properly. Finally, some basic 3-level 8-PSK modulation codes using BCH codes as component codes are constructed and their coding gains are found for hard-decision multistage decoding.

  12. Associations between children's independent mobility and physical activity

    PubMed Central

    2014-01-01

    Background Independent mobility describes the freedom of children to travel and play in public spaces without adult supervision. The potential benefits for children are significant, such as social interactions with peers, spatial and traffic-safety skills and increased physical activity. Yet the health benefits of independent mobility, particularly for physical activity accumulation, are largely unexplored. This study aimed to investigate associations of children's independent mobility with light, moderate-to-vigorous, and total physical activity accumulation. Methods In 2011-2012, 375 Australian children aged 8-13 years (62% girls) were recruited into a cross-sectional study. Children's independent mobility (i.e. independent travel to school and non-school destinations, independent outdoor play) and socio-demographics were assessed through child and parent surveys. Physical activity intensity was measured objectively through an Actiheart monitor worn on four consecutive days. Associations between independent mobility and physical activity variables were analysed using generalized linear models, accounting for clustered sampling, Actiheart wear time and socio-demographics, and assessing interactions by sex. Results Independent travel (walking, cycling, public transport) to school and non-school destinations was not associated with light, moderate-to-vigorous or total physical activity. However, sub-analyses revealed a positive association between independent walking and cycling (excluding public transport) to school and total physical activity, but only in boys (b = 36.03). Independent outdoor play (three or more days per week) was positively associated with light and total physical activity (b = 29.76). No significant association was found between independent outdoor play and moderate-to-vigorous physical activity. When assessing differences by sex, the observed significant associations of independent outdoor play with light and total physical activity remained in girls but not in boys. All other associations showed no significant differences by sex. Conclusions Independent outdoor play may boost children's daily physical activity levels, predominantly at light intensity. Hence, facilitating independent outdoor play could be a viable intervention strategy to enhance physical activity in children, particularly in girls. Associations between independent travel and physical activity are inconsistent overall and require further investigation. PMID:24476363
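
    A generalized linear model that accounts for clustered sampling can be sketched with statsmodels' GEE interface; all column names, the grouping variable and the file path below are hypothetical stand-ins for the study's variables:

        import statsmodels.api as sm
        import statsmodels.formula.api as smf
        import pandas as pd

        df = pd.read_csv("mobility.csv")   # child-level records (hypothetical)

        # GEE with an exchangeable correlation structure approximates a GLM
        # that respects clustering of children within recruitment sites.
        model = smf.gee(
            "total_pa ~ independent_play + sex + age + wear_time",
            groups="site", data=df,
            family=sm.families.Gaussian(),
            cov_struct=sm.cov_struct.Exchangeable(),
        )
        print(model.fit().summary())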

  13. Qudit color codes and gauge color codes in all spatial dimensions

    NASA Astrophysics Data System (ADS)

    Watson, Fern H. E.; Campbell, Earl T.; Anwar, Hussain; Browne, Dan E.

    2015-08-01

    Two-level quantum systems, qubits, are not the only basis for quantum computation. Advantages exist in using qudits, d -level quantum systems, as the basic carrier of quantum information. We show that color codes, a class of topological quantum codes with remarkable transversality properties, can be generalized to the qudit paradigm. In recent developments it was found that in three spatial dimensions a qubit color code can support a transversal non-Clifford gate and that in higher spatial dimensions additional non-Clifford gates can be found, saturating Bravyi and König's bound [S. Bravyi and R. König, Phys. Rev. Lett. 111, 170502 (2013), 10.1103/PhysRevLett.111.170502]. Furthermore, by using gauge fixing techniques, an effective set of Clifford gates can be achieved, removing the need for state distillation. We show that the qudit color code can support the qudit analogs of these gates and also show that in higher spatial dimensions a color code can support a phase gate from higher levels of the Clifford hierarchy that can be proven to saturate Bravyi and König's bound in all but a finite number of special cases. The methodology used is a generalization of Bravyi and Haah's method of triorthogonal matrices [S. Bravyi and J. Haah, Phys. Rev. A 86, 052329 (2012), 10.1103/PhysRevA.86.052329], which may be of independent interest. For completeness, we show explicitly that the qudit color codes generalize to gauge color codes and share many of the favorable properties of their qubit counterparts.
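
    For background, qudit stabilizer constructions are built on the d-level generalization of the Pauli operators, and the Clifford hierarchy mentioned above is defined recursively; these are standard definitions rather than results of the paper:

        X\,|j\rangle = |j+1 \bmod d\rangle, \qquad
        Z\,|j\rangle = \omega^{j}\,|j\rangle, \qquad
        \omega = e^{2\pi i/d}, \qquad Z X = \omega\, X Z,

        \mathcal{C}_{k} = \{\, U : U \mathcal{P} U^{\dagger} \subseteq \mathcal{C}_{k-1} \,\},
        \qquad \mathcal{C}_{1} = \mathcal{P} \ \text{(the Pauli group)}.

    Transversal gates from level k of this hierarchy are exactly what the Bravyi-König bound constrains in a given spatial dimension.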

  14. ENSDF ANALYSIS AND UTILITY CODES.

    SciTech Connect

    BURROWS, T.

    2005-04-04

    The ENSDF analysis and checking codes are briefly described, along with their uses with various types of ENSDF datasets. For more information on the programs see the "Read Me" entries and other documentation associated with each code.

  15. Open code for open science?

    NASA Astrophysics Data System (ADS)

    Easterbrook, Steve M.

    2014-11-01

    Open source software is often seen as a path to reproducibility in computational science. In practice there are many obstacles, even when the code is freely available, but open source policies should at least lead to better quality code.

  16. Noiseless coding for the magnetometer

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Lee, Jun-Ji

    1987-01-01

    Future unmanned space missions will continue to seek a full understanding of magnetic fields throughout the solar system. Severely constrained data rates during certain portions of these missions could limit the possible science return. This publication investigates the application of universal noiseless coding techniques to more efficiently represent magnetometer data without any loss in data integrity. Performance results indicated that compression factors of 2:1 to 6:1 can be expected. Feasibility for general deep space application was demonstrated by implementing a microprocessor breadboard coder/decoder using the Intel 8086 processor. The Comet Rendezvous Asteroid Flyby mission will incorporate these techniques in a buffer feedback, rate-controlled configuration. The characteristics of this system are discussed.
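
    Rice's universal noiseless coders are built around Golomb-Rice codewords for prediction residuals. A minimal sketch of that core step (the adaptive parameter selection and bit packing of the flight coder are omitted; the zigzag map and k value are illustrative):

        def zigzag(x):
            """Interleave signs: 0, -1, 1, -2, 2, ... -> 0, 1, 2, 3, 4, ..."""
            return 2 * x if x >= 0 else -2 * x - 1

        def rice_encode(value, k):
            """Golomb-Rice codeword: unary quotient + k-bit binary remainder.

            Near-optimal when values are geometrically distributed, as
            magnetometer prediction residuals tend to be.
            """
            q, r = value >> k, value & ((1 << k) - 1)
            return "1" * q + "0" + (format(r, f"0{k}b") if k else "")

        residuals = [0, -1, 3, 2, -2, 0, 1]   # toy prediction residuals
        k = 1                                 # typically adapted per block
        bitstream = "".join(rice_encode(zigzag(x), k) for x in residuals)
        print(bitstream)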

  17. Method of optical image coding by time integration

    NASA Astrophysics Data System (ADS)

    Evtikhiev, Nikolay N.; Starikov, Sergey N.; Cheryomkhin, Pavel A.; Krasnov, Vitaly V.; Rodin, Vladislav G.

    2012-06-01

    A method of optical image coding by time integration is proposed. In the proposed method, coding is accomplished by shifting the object image over the photosensor area of a digital camera during registration, which optically computes the convolution of the original image with the shift trajectory. As opposed to optical coding methods based on diffractive optical elements, the described method is feasible in totally incoherent light. The method was first tested by using an LC monitor for image display and shifting: the object image is shifted by displaying a video consisting of frames with the image to be encoded at different locations on the monitor screen while the camera registers it. Optical encoding and numerical decoding of test images were performed successfully. A more practical experimental implementation using an LCOS SLM (Holoeye PLUTO VIS) was also realized. Here the object images to be encoded were formed in monochromatic spatially incoherent light, and the image was shifted over the camera photosensor by displaying a video of frames with blazed gratings on the LCOS SLM, each grating deflecting the light reflected from the SLM at a different angle. Results of optical image coding and of numerical restoration of the encoded images are presented and compared with numerical modeling. Optical image coding with time integration could be used for accessible quality estimation of optical image coding with diffractive optical elements, or as an independent optical coding method that can be implemented in incoherent light.
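
    Numerically, the encoding amounts to a 2-D convolution of the image with the shift-trajectory kernel, and decoding to inverse filtering. A minimal sketch under that model, using a generic Wiener-style regularized inverse (the paper's exact restoration step may differ):

        import numpy as np
        from numpy.fft import fft2, ifft2

        def encode(image, trajectory):
            """Simulate time-integration coding: circular convolution of the
            image with the shift-trajectory kernel (1s where the image dwelt)."""
            return np.real(ifft2(fft2(image) * fft2(trajectory, image.shape)))

        def decode(coded, trajectory, eps=1e-3):
            """Wiener-style regularized inverse filter (a generic choice)."""
            H = fft2(trajectory, coded.shape)
            return np.real(ifft2(fft2(coded) * np.conj(H)
                                 / (np.abs(H) ** 2 + eps)))

        img = np.zeros((64, 64)); img[20:40, 25:35] = 1.0  # toy object
        traj = np.zeros((64, 64)); traj[0, :5] = 1.0       # 5-pixel shift path
        restored = decode(encode(img, traj), traj)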

  18. The impact of time step definition on code convergence and robustness

    NASA Technical Reports Server (NTRS)

    Venkateswaran, S.; Weiss, J. M.; Merkle, C. L.

    1992-01-01

    We have implemented preconditioning for multi-species reacting flows in two independent codes, an implicit (ADI) code developed in-house and the RPLUS code (developed at LeRC). The RPLUS code was modified to work on a four-stage Runge-Kutta scheme. The performance of both the codes was tested, and it was shown that preconditioning can improve convergence by a factor of two to a hundred depending on the problem. Our efforts are currently focused on evaluating the effect of chemical sources and on assessing how preconditioning may be applied to improve convergence and robustness in the calculation of reacting flows.
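
    For context, Merkle-type preconditioning replaces the physical time derivative with a pseudo-time term scaled by a preconditioning matrix; in generic 2-D form (a sketch of the standard formulation, not the exact system used in either code):

        \Gamma\,\frac{\partial Q}{\partial \tau}
          + \frac{\partial E}{\partial x}
          + \frac{\partial F}{\partial y} = S(Q).

    The matrix \Gamma rescales the acoustic eigenvalues so that convective, acoustic and, for reacting flows, chemical time scales become comparable, which is what produces the factor-of-two-to-a-hundred convergence gains quoted above.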

  19. Simulations with the COREDIV code of DEMO discharges

    NASA Astrophysics Data System (ADS)

    Zagórski, R.; Ivanova-Stanik, R. I.; Stankiewicz, R.

    2013-07-01

    The reduction in divertor target power load due to radiation of sputtered and externally seeded impurities in tokamak fusion reactors is investigated. The approach is based on integrated numerical modelling of DEMO discharges using the COREDIV code, which self-consistently solves 1D radial transport equations of plasma and impurities in the core region and 2D multifluid transport in the SOL. Calculations are performed for inductive DEMO scenarios and for DEMO steady-state configurations with tungsten walls and Ar or Ne seeding. Significant fusion power can be achieved for all considered DEMO scenarios. An increase in the seeded impurity influx leads to a reduction in fusion power and in the Q-factor (defined as the ratio of fusion power to auxiliary heating power) due to plasma dilution. Total radiation appears to be almost independent of the puffing level and is dominated by core radiation (>90%). The radiation due to the seeding impurity is small, and the type of seeded impurity weakly affects the results. For pulsed DEMO concepts, the accessible seeding level is limited: there is no steady-state solution for stronger puffing, since the solution terminates due to helium accumulation. If confirmed by more detailed investigations, this might strongly affect DEMO design.
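
    With the Q-factor defined as in the abstract, a worked toy example (numbers illustrative, not from the paper) reads:

        Q = \frac{P_{\mathrm{fus}}}{P_{\mathrm{aux}}},
        \qquad \text{e.g.}\quad P_{\mathrm{fus}} = 2\,\mathrm{GW},\;
        P_{\mathrm{aux}} = 50\,\mathrm{MW} \;\Rightarrow\; Q = 40.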

  20. Dual Coding, Reasoning and Fallacies.

    ERIC Educational Resources Information Center

    Hample, Dale

    1982-01-01

    Develops the theory that a fallacy is not a comparison of a rhetorical text to a set of definitions but a comparison of one person's cognition with another's. Reviews Paivio's dual coding theory, relates nonverbal coding to reasoning processes, and generates a limited fallacy theory based on dual coding theory. (PD)

  1. Authorship Attribution of Source Code

    ERIC Educational Resources Information Center

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…
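
    A common baseline in this literature pairs character n-gram features, which capture layout and naming habits, with a standard classifier. A minimal hypothetical sketch (toy corpus and author labels invented; not the specific method reviewed or proposed in the work above):

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        programs = [
            "for(int i=0;i<n;i++) sum+=a[i];",
            "while (i < n) { sum += a[i]; ++i; }",
            "total = sum(values)",
            "result = [x*x for x in values]",
        ]
        authors = ["alice", "bob", "alice", "bob"]   # hypothetical labels

        # Character n-grams survive superficial edits better than word tokens.
        clf = make_pipeline(
            TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),
            LogisticRegression(max_iter=1000),
        )
        clf.fit(programs, authors)
        print(clf.predict(["for(int j=0;j<m;j++) s+=b[j];"]))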

  2. Coding Issues in Grounded Theory

    ERIC Educational Resources Information Center

    Moghaddam, Alireza

    2006-01-01

    This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory generates from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…

  3. Energy Codes and Standards: Facilities

    SciTech Connect

    Bartlett, Rosemarie; Halverson, Mark A.; Shankle, Diana L.

    2007-01-01

    Energy codes and standards play a vital role in the marketplace by setting minimum requirements for energy-efficient design and construction. They outline uniform requirements for new buildings as well as additions and renovations. This article covers basic knowledge of codes and standards; development processes of each; adoption, implementation, and enforcement of energy codes and standards; and voluntary energy efficiency programs.

  4. Validation of the BEPLATE code

    SciTech Connect

    Giles, G.E.; Bullock, J.S.

    1997-11-01

    The electroforming simulation code BEPLATE (Boundary Element-PLATE) has been developed and validated for specific applications at Oak Ridge. New areas of application are opening up and more validations are being performed. This paper reports the validation experience of the BEPLATE code on two types of electroforms and describes some recent applications of the code.

  5. SAR image coding

    NASA Astrophysics Data System (ADS)

    Tourtier, P.

    1989-10-01

    Synthetic Aperture Radar imagery produces very high data rates, to the point that the raw data flow must be recorded rather than transmitted directly. Image coding reduces the data rate while preserving the original quality, which lowers the required transmission channel capacity. A technique for compressing the data flow, based on the cosine transform, is presented and described in detail. Results obtained by Thomson-CSF show that a compression ratio on the order of 4 or 5 is achievable without visible image degradation.
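
    The transform-coding idea can be sketched as a block DCT followed by discarding small coefficients; keeping roughly 20% of them corresponds to the 4-5x range quoted. A rough Python illustration (real SAR coders add quantization tables and entropy coding):

        import numpy as np
        from scipy.fft import dctn, idctn

        def compress_block(block, keep=0.2):
            """Keep the largest `keep` fraction of DCT coefficients by
            magnitude; zeroing the rest gives roughly 1/keep : 1 reduction
            before entropy coding. Illustrative only."""
            c = dctn(block, norm="ortho")
            thresh = np.quantile(np.abs(c), 1 - keep)
            c[np.abs(c) < thresh] = 0.0
            return idctn(c, norm="ortho")

        img = np.random.rand(8, 8)           # stand-in for an 8x8 image block
        out = compress_block(img, keep=0.2)  # ~5:1 coefficient reduction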

  6. Finite Element Analysis Code

    Energy Science and Technology Software Center (ESTSC)

    2005-05-07

    CONEX is a code for joining sequentially in time multiple exodusII database files which all represent the same base mesh topology and geometry. It is used to create a single results or restart file from multiple results or restart files which typically arise as the result of multiple restarted analyses. CONEX is used to postprocess the results from a series of finite element analyses. It can join sequentially the data from multiple results databases into a single database, which makes it easier to postprocess the results data.

  7. Finite Element Analysis Code

    Energy Science and Technology Software Center (ESTSC)

    2005-06-26

    Exotxt is an analysis code that reads finite element results data stored in an exodusII file and generates a file in a structured text format. The text file can be edited or modified via a number of text formatting tools. Exotxt is used by analysts to translate data from the binary exodusII format into a structured text format which can then be edited or modified and then either translated back to exodusII format or to another format.
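
    Since exodusII files are layered on netCDF, an Exotxt-like text dump can be sketched with the generic netCDF4 package; the file path, output layout and variable handling below are hypothetical, and the real tool writes a specific structured format:

        from netCDF4 import Dataset

        # exodusII is netCDF underneath, so generic netCDF tools can read it.
        with Dataset("results.e", "r") as ds:
            with open("results.txt", "w") as out:
                for name, var in ds.variables.items():
                    out.write(f"# {name} {var.shape}\n")
                    out.write(" ".join(map(str, var[:].flatten())) + "\n")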

  8. Structured error recovery for code-word-stabilized quantum codes

    SciTech Connect

    Li Yunfan; Dumer, Ilya; Grassl, Markus; Pryadko, Leonid P.

    2010-05-15

    Code-word-stabilized (CWS) codes are, in general, nonadditive quantum codes that can correct errors by an exhaustive search of different error patterns, similar to the way that we decode classical nonlinear codes. For an n-qubit quantum code correcting errors on up to t qubits, this brute-force approach consecutively tests different errors of weight t or less and employs a separate n-qubit measurement in each test. In this article, we suggest an error grouping technique that allows one to simultaneously test large groups of errors in a single measurement. This structured error recovery technique exponentially reduces the number of measurements by about 3^t times. While it still leaves exponentially many measurements for a generic CWS code, the technique is equivalent to syndrome-based recovery for the special case of additive CWS codes.
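
    The quoted factor of about 3^t follows from a counting argument: brute-force recovery tests every Pauli error of weight up to t individually, while the grouping technique tests one class per qubit support,

        N_{\text{single}} = \sum_{j=0}^{t} \binom{n}{j}\, 3^{j}
        \quad\longrightarrow\quad
        N_{\text{grouped}} = \sum_{j=0}^{t} \binom{n}{j},

    since the three nontrivial Pauli operators on each of the j affected qubits collapse into a single group, saving roughly 3^t measurements for weight-t errors.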

  9. Low Density Parity Check Codes: Bandwidth Efficient Channel Coding

    NASA Technical Reports Server (NTRS)

    Fong, Wai; Lin, Shu; Maki, Gary; Yeh, Pen-Shu

    2003-01-01

    Low Density Parity Check (LDPC) codes provide near-Shannon-capacity performance for NASA missions. These codes have high coding rates, R = 0.82 and 0.875, with moderate code lengths, n = 4096 and 8176. Their decoders have inherently parallel structures, which allows for high-speed implementation. Two codes based on Euclidean Geometry (EG) were selected for flight ASIC implementation. These codes are cyclic and quasi-cyclic in nature and therefore have a simple encoder structure, which yields power and size benefits. They also have a large minimum distance, as much as d_min = 65, giving them powerful error-correcting capabilities and very low error floors. This paper presents the development of the LDPC flight encoder and decoder, its applications, and its status.
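
    The relation underlying both encoder and decoder is the parity check H c = 0 (mod 2): a received word is a codeword iff its syndrome vanishes. A toy numpy sketch with an invented parity-check matrix (not the EG-based flight codes):

        import numpy as np

        # Toy parity-check matrix (illustrative, not the flight code).
        H = np.array([[1, 1, 0, 1, 0, 0],
                      [0, 1, 1, 0, 1, 0],
                      [1, 0, 1, 0, 0, 1]])

        def syndrome(word):
            """A word is a codeword iff H w = 0 (mod 2)."""
            return H.dot(word) % 2

        c = np.array([1, 0, 1, 1, 1, 0])
        print(syndrome(c))              # all zeros: c is a codeword
        r = c.copy(); r[2] ^= 1         # flip one bit in transit
        print(syndrome(r))              # nonzero syndrome flags the error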

  10. Design of circular coded target and its application to optical 3D-measurement

    NASA Astrophysics Data System (ADS)

    Han, Jiandong; Lu, Naiguang; Dong, Mingli

    2008-12-01

    The coded target is constructed geometrically from circular elements only: a circular marked point, a circular reference point and circular coding points. The marked point is surrounded by the reference point and by coding points whose bit positions lie at equally spaced angular intervals. The radius of the reference point is larger than that of the coding points. The marked point represents the point location itself, the reference point provides a start bit for the coding points, and the coding points provide robust anticlockwise identification of the target, starting from the reference point. This design method provides a sufficient number of identification points by introducing a reference point. Finally, the application of the proposed coded targets to 3D data registration is described. Experimental results show that the developed coded targets are invariant to location, rotation and change of scale, and that the marked points are detected easily and accurately.
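
    Rotation independence of such a ring code can be illustrated by canonicalizing over cyclic rotations once the bits are sampled anticlockwise. A sketch (the bit layout is hypothetical; the actual design instead starts reading at the reference point's start bit):

        def canonical_id(bits):
            """Rotation-invariant ID of a circular code: take the cyclic
            rotation with the smallest value, so any start angle yields
            the same ID."""
            n = len(bits)
            rotations = (bits[i:] + bits[:i] for i in range(n))
            return min(int("".join(map(str, rot)), 2) for rot in rotations)

        # The same physical target read from two camera orientations:
        print(canonical_id([1, 0, 1, 1, 0, 0, 0, 1]))
        print(canonical_id([0, 0, 1, 1, 0, 1, 1, 0]))  # rotated, same ID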

  11. Independent Component Analysis of Textures

    NASA Technical Reports Server (NTRS)

    Manduchi, Roberto; Portilla, Javier

    2000-01-01

    A common method for texture representation is to use the marginal probability densities over the outputs of a set of multi-orientation, multi-scale filters as a description of the texture. We propose a technique, based on Independent Components Analysis, for choosing the set of filters that yield the most informative marginals, meaning that the product over the marginals most closely approximates the joint probability density function of the filter outputs. The algorithm is implemented using a steerable filter space. Experiments involving both texture classification and synthesis show that compared to Principal Components Analysis, ICA provides superior performance for modeling of natural and synthetic textures.
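
    The selection criterion can be written as minimizing the mutual information among the K filter outputs, which is precisely the divergence between their joint density and the product of marginals:

        I(y_1,\dots,y_K)
        = \mathrm{KL}\!\left( p(\mathbf{y}) \,\middle\|\, \prod_{i=1}^{K} p(y_i) \right)
        = \sum_{i=1}^{K} H(y_i) - H(\mathbf{y}),

    so the filter set whose outputs are most nearly independent gives the best factorial (product-of-marginals) texture model.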

  12. Code-Mixing as a Bilingual Instructional Strategy

    ERIC Educational Resources Information Center

    Jiang, Yih-Lin Belinda; García, Georgia Earnest; Willis, Arlette Ingram

    2014-01-01

    This study investigated code-mixing practices, specifically the use of L2 (English) in an L1 (Chinese) class in a U.S. bilingual program. Our findings indicate that the code-mixing practices made and prompted by the teacher served five pedagogical functions: (a) to enhance students' bilingualism and bilingual learning, (b) to review and…

  13. IGB grid: User's manual (A turbomachinery grid generation code)

    NASA Technical Reports Server (NTRS)

    Beach, T. A.; Hoffman, G.

    1992-01-01

    A grid generation code called IGB is presented for use in computational investigations of turbomachinery flowfields. It contains a combination of algebraic and elliptic techniques coded for use on an interactive graphics workstation. The instructions for use and a test case are included.
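
    The two technique families the code combines can be sketched generically: an algebraic step (transfinite interpolation from the boundary curves) followed by an elliptic step (Laplace smoothing of the interior points). A minimal illustration under those standard formulations, not IGB's actual algorithm:

        import numpy as np

        def tfi(bottom, top, left, right):
            """Algebraic step: transfinite interpolation from four boundary
            curves, each an (n,2)/(m,2) array of x,y points with shared
            corners."""
            m, n = left.shape[0], bottom.shape[0]
            u = np.linspace(0, 1, n)[None, :, None]
            v = np.linspace(0, 1, m)[:, None, None]
            return ((1-v)*bottom[None, :, :] + v*top[None, :, :]
                    + (1-u)*left[:, None, :] + u*right[:, None, :]
                    - ((1-u)*(1-v)*bottom[0] + u*(1-v)*bottom[-1]
                       + (1-u)*v*top[0] + u*v*top[-1]))   # shape (m, n, 2)

        def laplace_smooth(grid, iters=200):
            """Elliptic step: Jacobi iterations of the Laplace equations on
            interior points to even out spacing and reduce skew."""
            g = grid.copy()
            for _ in range(iters):
                g[1:-1, 1:-1] = 0.25*(g[:-2, 1:-1] + g[2:, 1:-1]
                                      + g[1:-1, :-2] + g[1:-1, 2:])
            return g

        n = 11
        x = np.linspace(0, 1, n)
        bottom = np.stack([x, 0.1*np.sin(np.pi*x)], axis=1)  # curved wall
        top    = np.stack([x, np.ones(n)], axis=1)
        left   = np.stack([np.zeros(n), np.linspace(0, 1, n)], axis=1)
        right  = np.stack([np.ones(n),  np.linspace(0, 1, n)], axis=1)
        grid = laplace_smooth(tfi(bottom, top, left, right))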

  14. 32 CFR 634.25 - Installation traffic codes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 4 2011-07-01 2011-07-01 false Installation traffic codes. 634.25 Section 634.25 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY (CONTINUED) LAW ENFORCEMENT AND CRIMINAL INVESTIGATIONS MOTOR VEHICLE TRAFFIC SUPERVISION Traffic Supervision § 634.25 Installation traffic codes. (a) Installation...

  15. Performance Analysis of Optical Code Division Multiplex System

    NASA Astrophysics Data System (ADS)

    Kaur, Sandeep; Bhatia, Kamaljit Singh

    2013-12-01

    This paper presents a pseudo-orthogonal code generator for Optical Code Division Multiple Access (OCDMA) systems, which helps reduce the need for bandwidth expansion and improves spectral efficiency. We investigate the performance of a multi-user OCDMA system achieving data rates of more than 1 Tbit/s.
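
    The property a pseudo-orthogonal (optical orthogonal) code must satisfy is low cyclic cross-correlation between any two users' unipolar chip sequences. A numpy check on toy codewords (the codewords are hypothetical, not the paper's generator):

        import numpy as np

        def max_cross_corr(a, b):
            """Largest cyclic cross-correlation between two 0/1 chip
            sequences; code design keeps this small for every user pair
            and every relative shift."""
            return max(int(np.dot(a, np.roll(b, s))) for s in range(len(a)))

        # Toy unipolar codewords of weight 3 and length 13.
        u1 = np.zeros(13, int); u1[[0, 1, 4]] = 1
        u2 = np.zeros(13, int); u2[[0, 2, 7]] = 1
        print(max_cross_corr(u1, u2))   # 1, the ideal for an optical
                                        # orthogonal code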

  16. The Role of Code-Switching in Bilingual Creativity

    ERIC Educational Resources Information Center

    Kharkhurin, Anatoliy V.; Wei, Li

    2015-01-01

    This study further explores the theme of bilingual creativity with the present focus on code-switching. Specifically, it investigates whether code-switching practice has an impact on creativity. In line with the previous research, selective attention was proposed as a potential cognitive mechanism, which on the one hand would benefit from…

  17. Automatic Coding of Dialogue Acts in Collaboration Protocols

    ERIC Educational Resources Information Center

    Erkens, Gijsbert; Janssen, Jeroen

    2008-01-01

    Although protocol analysis can be an important tool for researchers to investigate the process of collaboration and communication, the use of this method of analysis can be time consuming. Hence, an automatic coding procedure for coding dialogue acts was developed. This procedure helps to determine the communicative function of messages in online…
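
    Automatic coding procedures of this kind typically match each message against an ordered set of surface patterns. A toy sketch with invented labels and rules (not the actual coding scheme of the study):

        import re

        # Ordered (pattern, dialogue act) rules -- hypothetical labels.
        RULES = [
            (re.compile(r"\?$"), "question"),
            (re.compile(r"^(yes|no|ok|okay)\b", re.I), "confirmation"),
            (re.compile(r"^(i think|maybe|perhaps)\b", re.I), "proposal"),
        ]

        def code_message(msg):
            """Return the first matching dialogue act, else a default."""
            for pattern, act in RULES:
                if pattern.search(msg.strip()):
                    return act
            return "statement"

        for m in ["Shall we start with task 2?", "Ok, sounds good.",
                  "I think we should split the work."]:
            print(code_message(m), "|", m)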

  18. Characterizing Mathematics Classroom Practice: Impact of Observation and Coding Choices

    ERIC Educational Resources Information Center

    Ing, Marsha; Webb, Noreen M.

    2012-01-01

    Large-scale observational measures of classroom practice increasingly focus on opportunities for student participation as an indicator of instructional quality. Each observational measure necessitates making design and coding choices on how to best measure student participation. This study investigated variations of coding approaches that may be…

  20. 32 CFR 636.11 - Installation traffic codes.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 4 2014-07-01 2013-07-01 true Installation traffic codes. 636.11 Section 636.11... CRIMINAL INVESTIGATIONS MOTOR VEHICLE TRAFFIC SUPERVISION (SPECIFIC INSTALLATIONS) Fort Stewart, Georgia § 636.11 Installation traffic codes. In addition to the requirements in § 634.25(d) of this...