These are representative sample records from related to your search topic.
For comprehensive and current results, perform a real-time search at

Independent peer review of nuclear safety computer codes  

Microsoft Academic Search

A structured, independent computer code peer-review process has been developed to assist the US Nuclear Regulatory Commission (NRC) and the US Department of Energy in their nuclear safety missions. This paper describes a structured process of independent code peer review, benefits associated with a code-independent peer review, as well as the authors' recent peer-review experience. The NRC adheres to the

B. E. Boyack; R. P. Jenks



Independent peer review of nuclear safety computer codes  

SciTech Connect

A structured process of independent computer code peer review has been developed to assist the US Nuclear Regulatory Commission (NRC) and the US Department of Energy in their nuclear safety missions. This paper focuses on the process that evolved during recent reviews of NRC codes.

Boyack, B.E.; Jenks, R.P.



Independent peer review of nuclear safety computer codes  

SciTech Connect

A structured process of independent computer code peer review has been developed to assist the US Nuclear Regulatory Commission (NRC) and the US Department of Energy in their nuclear safety missions. This paper focuses on the process that evolved during recent reviews of NRC codes.

Boyack, B.E.; Jenks, R.P.



Getting Students to be Successful, Independent Investigators  

NSDL National Science Digital Library

Middle school students often struggle when writing testable problems, planning valid and reliable procedures, and drawing meaningful evidence-based conclusions. To address this issue, the author created a student-centered lab handout to facilitate the inquiry process for students. This handout has reduced studentsâ frustration and helped them become more independent and successful investigators.

Thomas, Jeffery D.



Independent Coding of Wind Direction in Cockroach Giant Interneurons  

E-print Network

Independent Coding of Wind Direction in Cockroach Giant Interneurons ADI MIZRAHI AND FREDERIC are located in the most posterior ganglion of the nerve direction in cockroach giant interneurons. J­3 ) control the stimulus by the cockroach cercal system. Such sensory processing initiation of a highly

Libersat, Frederic


Benchmark testing and independent verification of the VS2DT computer code  

SciTech Connect

The finite difference flow and transport simulator VS2DT was benchmark tested against several other codes which solve the same equations (Richards equation for flow and the Advection-Dispersion equation for transport). The benchmark problems investigated transient two-dimensional flow in a heterogeneous soil profile with a localized water source at the ground surface. The VS2DT code performed as well as or better than all other codes when considering mass balance characteristics and computational speed. It was also rated highly relative to the other codes with regard to ease-of-use. Following the benchmark study, the code was verified against two analytical solutions, one for two-dimensional flow and one for two-dimensional transport. These independent verifications show reasonable agreement with the analytical solutions, and complement the one-dimensional verification problems published in the code`s original documentation.

McCord, J.T. [Sandia National Labs., Albuquerque, NM (United States). Environmental Risk Assessment and Risk Management Dept.; Goodrich, M.T. [IT Corp., Albuquerque, NM (United States)



Independent Population Coding of Speech with Sub-Millisecond Precision  

PubMed Central

To understand the strategies used by the brain to analyze complex environments, we must first characterize how the features of sensory stimuli are encoded in the spiking of neuronal populations. Characterizing a population code requires identifying the temporal precision of spiking and the extent to which spiking is correlated, both between cells and over time. In this study, we characterize the population code for speech in the gerbil inferior colliculus (IC), the hub of the auditory system where inputs from parallel brainstem pathways are integrated for transmission to the cortex. We find that IC spike trains can carry information about speech with sub-millisecond precision, and, consequently, that the temporal correlations imposed by refractoriness can play a significant role in shaping spike patterns. We also find that, in contrast to most other brain areas, the noise correlations between IC cells are extremely weak, indicating that spiking in the population is conditionally independent. These results demonstrate that the problem of understanding the population coding of speech can be reduced to the problem of understanding the stimulus-driven spiking of individual cells, suggesting that a comprehensive model of the subcortical processing of speech may be attainable in the near future. PMID:24305831

Garcia-Lazaro, Jose A.; Belliveau, Lucile A. C.



Error minimization and coding triplet/binding site associations are independent features of the canonical genetic code.  


The canonical genetic code has been reported both to be error minimizing and to show stereochemical associations between coding triplets and binding sites. In order to test whether these two properties are unexpectedly overlapping, we generated 200,000 randomized genetic codes using each of five randomization schemes, with and without randomization of stop codons. Comparison of the code error (difference in polar requirement for single-nucleotide codon interchanges) with the coding triplet concentrations in RNA binding sites for eight amino acids shows that these properties are independent and uncorrelated. Thus, one is not the result of the other, and error minimization and triplet associations probably arose independently during the history of the genetic code. We explicitly show that prior fixation of a stereochemical core is consistent with an effective later minimization of error. PMID:16211428

Caporaso, J Gregory; Yarus, Michael; Knight, Rob



Pcigale: Porting Code Investigating Galaxy Emission to Python  

NASA Astrophysics Data System (ADS)

We present pcigale, the port to Python of CIGALE (Code Investigating Galaxy Emission) a Fortran spectral energy distribution (SED) fitting code developed at the Laboratoire d'Astrophysique de Marseille. After recalling the specifics of the SED fitting method, we show the gains in modularity and versatility offered by Python, as well as the drawbacks compared to the compiled code.

Roehlly, Y.; Burgarella, D.; Buat, V.; Boquien, M.; Ciesla, L.; Heinis, S.



Two independent transcription initiation codes overlap on vertebrate core promoters  

NASA Astrophysics Data System (ADS)

A core promoter is a stretch of DNA surrounding the transcription start site (TSS) that integrates regulatory inputs and recruits general transcription factors to initiate transcription. The nature and causative relationship of the DNA sequence and chromatin signals that govern the selection of most TSSs by RNA polymerase II remain unresolved. Maternal to zygotic transition represents the most marked change of the transcriptome repertoire in the vertebrate life cycle. Early embryonic development in zebrafish is characterized by a series of transcriptionally silent cell cycles regulated by inherited maternal gene products: zygotic genome activation commences at the tenth cell cycle, marking the mid-blastula transition. This transition provides a unique opportunity to study the rules of TSS selection and the hierarchy of events linking transcription initiation with key chromatin modifications. We analysed TSS usage during zebrafish early embryonic development at high resolution using cap analysis of gene expression, and determined the positions of H3K4me3-marked promoter-associated nucleosomes. Here we show that the transition from the maternal to zygotic transcriptome is characterized by a switch between two fundamentally different modes of defining transcription initiation, which drive the dynamic change of TSS usage and promoter shape. A maternal-specific TSS selection, which requires an A/T-rich (W-box) motif, is replaced with a zygotic TSS selection grammar characterized by broader patterns of dinucleotide enrichments, precisely aligned with the first downstream (+1) nucleosome. The developmental dynamics of the H3K4me3-marked nucleosomes reveal their DNA-sequence-associated positioning at promoters before zygotic transcription and subsequent transcription-independent adjustment to the final position downstream of the zygotic TSS. The two TSS-defining grammars coexist, often physically overlapping, in core promoters of constitutively expressed genes to enable their expression in the two regulatory environments. The dissection of overlapping core promoter determinants represents a framework for future studies of promoter structure and function across different regulatory contexts.

Haberle, Vanja; Li, Nan; Hadzhiev, Yavor; Plessy, Charles; Previti, Christopher; Nepal, Chirag; Gehrig, Jochen; Dong, Xianjun; Akalin, Altuna; Suzuki, Ana Maria; van Ijcken, Wilfred F. J.; Armant, Olivier; Ferg, Marco; Strähle, Uwe; Carninci, Piero; Müller, Ferenc; Lenhard, Boris



Independent assessment of the steady state fuel rod analysis code FRAPCON-1  

SciTech Connect

The predictive capabilities of the steady state fuel rod behavior program, FRAPCON-1, have been independently assessed. FRAPCON-1 code predictions of fuel behavior are compared with experimental data for test rods and with predictions from the FRAP-S3 code for commercial design rods. The code-to-data comparisons are used to assess the accuracy of fuel rod thermal, pressure, deformation, and corrosion models under steady state operating conditions. The code-to-code comparisons are used to identify the effects of model differences between FRAPCON-1 and the previously assessed fuel behavior code, FRAP-S3. On the basis of results of these studies, conclusions are given regarding present model capabilities and future development needs.

Laats, E.T.; Peeler, G.B.; Scofield, N.R.



Independent assessment of the steady state fuel rod analysis code FRAPCON-1  

SciTech Connect

The predictive capabilities of the steady state fuel rod behavior program, FRAPCON-1, have been independently assessed. FRAPCON-1 code predictions of fuel behavior are compared with experimental data for test rods and with predictions from the FRAP-S3 code for commercial design rods. The code-to-data comparisons are used to assess the accuracy of fuel rod thermal, pressure, deformation, and corrosion models under steady state operating conditions. The code-to-code comparisons are used to identify the effects of model differences between FRAPCON-1 and the previously assessed fuel behavior code, FRAP-S3. On the basis of results of these studies, conclusions are given regarding present model capabilities and future development needs.

Laats, E.T.; Peeler, G.B.; Scofield, N.R.



RBMK coupled neutronics/thermal-hydraulics analyses by two independent code systems  

SciTech Connect

This paper presents the coupled neutronics/thermal-hydraulics activities carried out in the framework of the part B of the TACIS project R2.03/97, 'Software development for accident analysis of RBMK reactors in Russia'. Two independent code systems were assembled, one from the Russian side and the other from the Western side, for studying RBMK core transients. The Russian code system relies on the use of code UNK for neutron data libraries generation and the three-dimensional neutron kinetics thermal-hydraulics coupled codes BARS-KORSAR for plant transient analyses. The Western code system is instead based on the lattice physics code HELIOS and on the RELAP5-3D C code. Several activities were performed for testing code system's capabilities: the neutron data libraries were calculated and verified by precise Monte Carlo calculations, the coupled codes' steady state results were compared with plant detectors' data, and calculations of several transients were compared. Finally, both code systems proved to have all the capabilities for addressing reliable safety analyses of RBMK reactors. (authors)

Parisi, C.; D'Auria, F. [Univ. of Pisa, Dept. of Mechanical, Nuclear and Production Engineering, via Diotisalvi, 2, 56100 Pisa (Italy); Malofeev, V. [Kurchatov Inst., Kurchatov Square 1, Moscow 123182 (Russian Federation); Ivanov, B.; Ivanov, K. [Pennsylvania State Univ., RDFMG, 230 Reber Building, Univ. Park, PA 16802 (United States)



An investigation of coded aperture imaging for small animal SPECT  

Microsoft Academic Search

Coded apertures provide a substantial gain in detection efficiency compared with conventional collimation and are well suited to imaging small volumes. Here, the authors investigated several aspects of coded aperture design for a small animal SPECT system, including aperture\\/detector configuration, sampling requirements, and susceptibility to scatter. They simulated various source distributions and detection systems which included one, two, and four

Steven R. Meikle; Roger R. Fulton; Stefan Eberl; Magnus Dahlbom; Koon-Pong Wong; Michael J. Fulham



PWS: an efficient code system for solving space-independent nuclear reactor dynamics  

Microsoft Academic Search

The reactor kinetics equations are reduced to a differential equation in matrix form convenient for explicit power series solution involving no approximations beyond the usual space-independent assumption. The coefficients of the series have been obtained from a straightforward recurrence relation. Numerical evaluation is performed by PWS (power series solution) code, written in Visual FORTRAN for a personal computer. The results

A. E Aboanber; Y. M Hamada



Independent verification and validation testing of the FLASH computer code, Versiion 3.0  

SciTech Connect

Independent testing of the FLASH computer code, Version 3.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification tests, and validation tests, were used to determine the operational status of the FLASH computer code. These tests were specifically designed to test: correctness of the FORTRAN coding, computational accuracy, and suitability to simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: blind testing, independent applications, and graduated difficulty of test cases. Both quantitative and qualitative testing was performed through evaluating relative root mean square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification test were used to check the computational accuracy and correctness of the FORTRAN coding, and three validation tests were used to check the suitability to simulating actual conditions. These tests cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions. The validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. In conclusion, all aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies.

Martian, P.; Chung, J.N. [Washington State Univ., Pullman, WA (United States). Dept. of Mechanical and Materials Engineering



Independent verification and validation testing of the FLASH computer code, Versiion 3. 0  

SciTech Connect

Independent testing of the FLASH computer code, Version 3.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification tests, and validation tests, were used to determine the operational status of the FLASH computer code. These tests were specifically designed to test: correctness of the FORTRAN coding, computational accuracy, and suitability to simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: blind testing, independent applications, and graduated difficulty of test cases. Both quantitative and qualitative testing was performed through evaluating relative root mean square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification test were used to check the computational accuracy and correctness of the FORTRAN coding, and three validation tests were used to check the suitability to simulating actual conditions. These tests cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions. The validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. In conclusion, all aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies.

Martian, P.; Chung, J.N. (Washington State Univ., Pullman, WA (United States). Dept. of Mechanical and Materials Engineering)



Board Governance of Independent Schools: A Framework for Investigation  

ERIC Educational Resources Information Center

Purpose: This paper develops a theoretical framework to guide future inquiry into board governance of independent schools. Design/methodology/approach: The authors' approach is to integrate literatures related to corporate and educational boards, motivation, leadership and group processes that are appropriate for conceptualizing independent school…

McCormick, John; Barnett, Kerry; Alavi, Seyyed Babak; Newcombe, Geoffrey



Hundreds of conserved non-coding genomic regions are independently lost in mammals  

PubMed Central

Conserved non-protein-coding DNA elements (CNEs) often encode cis-regulatory elements and are rarely lost during evolution. However, CNE losses that do occur can be associated with phenotypic changes, exemplified by pelvic spine loss in sticklebacks. Using a computational strategy to detect complete loss of CNEs in mammalian genomes while strictly controlling for artifacts, we find >600 CNEs that are independently lost in at least two mammalian lineages, including a spinal cord enhancer near GDF11. We observed several genomic regions where multiple independent CNE loss events happened; the most extreme is the DIAPH2 locus. We show that CNE losses often involve deletions and that CNE loss frequencies are non-uniform. Similar to less pleiotropic enhancers, we find that independently lost CNEs are shorter, slightly less constrained and evolutionarily younger than CNEs without detected losses. This suggests that independently lost CNEs are less pleiotropic and that pleiotropic constraints contribute to non-uniform CNE loss frequencies. We also detected 35 CNEs that are independently lost in the human lineage and in other mammals. Our study uncovers an interesting aspect of the evolution of functional DNA in mammalian genomes. Experiments are necessary to test if these independently lost CNEs are associated with parallel phenotype changes in mammals. PMID:23042682

Hiller, Michael; Schaar, Bruce T.; Bejerano, Gill



Hundreds of conserved non-coding genomic regions are independently lost in mammals.  


Conserved non-protein-coding DNA elements (CNEs) often encode cis-regulatory elements and are rarely lost during evolution. However, CNE losses that do occur can be associated with phenotypic changes, exemplified by pelvic spine loss in sticklebacks. Using a computational strategy to detect complete loss of CNEs in mammalian genomes while strictly controlling for artifacts, we find >600 CNEs that are independently lost in at least two mammalian lineages, including a spinal cord enhancer near GDF11. We observed several genomic regions where multiple independent CNE loss events happened; the most extreme is the DIAPH2 locus. We show that CNE losses often involve deletions and that CNE loss frequencies are non-uniform. Similar to less pleiotropic enhancers, we find that independently lost CNEs are shorter, slightly less constrained and evolutionarily younger than CNEs without detected losses. This suggests that independently lost CNEs are less pleiotropic and that pleiotropic constraints contribute to non-uniform CNE loss frequencies. We also detected 35 CNEs that are independently lost in the human lineage and in other mammals. Our study uncovers an interesting aspect of the evolution of functional DNA in mammalian genomes. Experiments are necessary to test if these independently lost CNEs are associated with parallel phenotype changes in mammals. PMID:23042682

Hiller, Michael; Schaar, Bruce T; Bejerano, Gill



Proof-of-principle experiment of reference-frame-independent quantum key distribution with phase coding  

E-print Network

We have demonstrated a proof-of-principle experiment of reference-frame-independent phase coding quantum key distribution (RFI-QKD) over an 80-km optical fiber. After considering the finite-key bound, we still achieve a distance of 50 km. In this scenario, the phases of the basis states are related by a slowly time-varying transformation. Furthermore, we developed and realized a new decoy state method for RFI-QKD systems with weak coherent sources to counteract the photon-number-splitting attack. With the help of a reference-frame-independent protocol and a Michelson interferometer with Faraday rotator mirrors, our system is rendered immune to the slow phase changes of the interferometer and the polarization disturbances of the channel, making the procedure very robust.

Wen-Ye Liang; Shuang Wang; Hong-Wei Li; Zhen-Qiang Yin; Wei Chen; Yao Yao; Jing-Zheng Huang; Guang-Can Guo; Zheng-Fu Han



Proof-of-principle experiment of reference-frame-independent quantum key distribution with phase coding  

PubMed Central

We have demonstrated a proof-of-principle experiment of reference-frame-independent phase coding quantum key distribution (RFI-QKD) over an 80-km optical fiber. After considering the finite-key bound, we still achieve a distance of 50?km. In this scenario, the phases of the basis states are related by a slowly time-varying transformation. Furthermore, we developed and realized a new decoy state method for RFI-QKD systems with weak coherent sources to counteract the photon-number-splitting attack. With the help of a reference-frame-independent protocol and a Michelson interferometer with Faraday rotator mirrors, our system is rendered immune to the slow phase changes of the interferometer and the polarization disturbances of the channel, making the procedure very robust. PMID:24402550

Liang, Wen-Ye; Wang, Shuang; Li, Hong-Wei; Yin, Zhen-Qiang; Chen, Wei; Yao, Yao; Huang, Jing-Zheng; Guo, Guang-Can; Han, Zheng-Fu




ERIC Educational Resources Information Center

Discusses the four planes of development and the periods of creation and crystallization within each plane. Identifies the type of independence that should be achieved by the end of the first two planes of development. Maintains that it is through individual work on the environment that one achieves independence. (KB)

Stephenson, Margaret E.



The investigation of bandwidth efficient coding and modulation techniques  

NASA Technical Reports Server (NTRS)

The New Mexico State University Center for Space Telemetering and Telecommunications systems has been, and is currently, engaged in the investigation of trellis-coded modulation (TCM) communication systems. In particular, TCM utilizing M-ary phase shift keying is being studied. The study of carrier synchronization in a TCM environment, or in MPSK systems in general, has been one of the two main thrusts of this grant. This study has involved both theoretical modelling and software simulation of the carrier synchronization problem.



Field Dependence/Independence Cognitive Style and Problem Posing: An Investigation with Sixth Grade Students  

ERIC Educational Resources Information Center

Field dependence/independence cognitive style was found to relate to general academic achievement and specific areas of mathematics; in the majority of studies, field-independent students were found to be superior to field-dependent students. The present study investigated the relationship between field dependence/independence cognitive style and…

Nicolaou, Aristoklis Andreas; Xistouri, Xenia



Investigating Lossy Image Coding Using the PLHaar Transform  

SciTech Connect

We developed the Piecewise-Linear Haar (PLHaar) transform, an integer wavelet-like transform. PLHaar does not have dynamic range expansion, i.e. it is an n-bit to n-bit transform. To our knowledge PLHaar is the only reversible n-bit to n-bit transform that is suitable for lossy and lossless coding. We are investigating PLHaar's use in lossy image coding. Preliminary results from thresholding transform coefficients show that PLHaar does not produce objectionable artifacts like prior n-bit to n-bit transforms, such as the transform of Chao et al. (CFH). Also, at lower bitrates PLHaar images have increased contrast. For a given set of CFH and PLHaar coefficients with equal entropy, the PLHaar reconstruction is more appealing, although the PSNR may be lower.

Senecal, J G; Lindstrom, P; Duchaineau, M A; Joy, K I



Retrovirus vector silencing is de novo methylase independent and marked by a repressive histone code  

PubMed Central

Retrovirus vectors are de novo methylated and transcriptionally silent in mammalian stem cells. Here, we identify epigenetic modifications that mark retrovirus-silenced transgenes. We show that murine stem cell virus (MSCV) and human immunodeficiency virus type 1 (HIV-1) vectors dominantly silence a linked locus control region (LCR) ?-globin reporter gene in transgenic mice. MSCV silencing blocks LCR hypersensitive site formation, and silent transgene chromatin is marked differentially by a histone code composed of abundant linker histone H1, deacetylated H3 and acetylated H4. Retrovirus-transduced embryonic stem (ES) cells are silenced predominantly 3 days post-infection, with a small subset expressing enhanced green fluorescent protein to low levels, and silencing is not relieved in de novo methylase-null [dnmt3a–/–;dnmt3b–/–] ES cells. MSCV and HIV-1 sequences also repress reporter transgene expression in Drosophila, demonstrating establishment of silencing in the absence of de novo and maintenance methylases. These findings provide mechanistic insight into a conserved gene silencing mechanism that is de novo methylase independent and that epigenetically marks retrovirus chromatin with a repressive histone code. PMID:11060039

Pannell, Dylan; Osborne, Cameron S.; Yao, Shuyuan; Sukonnik, Tanya; Pasceri, Peter; Karaiskakis, Angelo; Okano, Masaki; Li, En; Lipshitz, Howard D.; Ellis, James



RELAP5/MOD3 code manual: Summaries and reviews of independent code assessment reports. Volume 7, Revision 1  

SciTech Connect

Summaries of RELAP5/MOD3 code assessments, a listing of the assessment matrix, and a chronology of the various versions of the code are given. Results from these code assessments have been used to formulate a compilation of some of the strengths and weaknesses of the code. These results are documented in the report. Volume 7 was designed to be updated periodically and to include the results of the latest code assessments as they become available. Consequently, users of Volume 7 should ensure that they have the latest revision available.

Moore, R.L.; Sloan, S.M.; Schultz, R.R.; Wilson, G.E. [Lockheed Idaho Technologies Co., Idaho Falls, ID (United States)



FRAP T-6: an independent code assessment based on LOCA Simulation Test MT-1 in the NRU reactor. [PWR  

SciTech Connect

The Fuel Rod Analysis Program - Transient FRAP-T6 was independently assessed through comparisons with experimental data obtained from a Loss-of-Coolant Accident (LOCA) simulation test performed in the National Research Universal (NRU) reactor. A concise computer code description is given and computer code calculations are compared with experimental data from materials deformation test MT-1. Results of these comparison are discussed for different boundary conditions and different mathematical models which describe the physical processes in the fuel rod.

VanderKaa, T.



An investigation of error characteristics and coding performance  

NASA Technical Reports Server (NTRS)

The performance of forward error correcting coding schemes on errors anticipated for the Earth Observation System (EOS) Ku-band downlink are studied. The EOS transmits picture frame data to the ground via the Telemetry Data Relay Satellite System (TDRSS) to a ground-based receiver at White Sands. Due to unintentional RF interference from other systems operating in the Ku band, the noise at the receiver is non-Gaussian which may result in non-random errors output by the demodulator. That is, the downlink channel cannot be modeled by a simple memoryless Gaussian-noise channel. From previous experience, it is believed that those errors are bursty. The research proceeded by developing a computer based simulation, called Communication Link Error ANalysis (CLEAN), to model the downlink errors, forward error correcting schemes, and interleavers used with TDRSS. To date, the bulk of CLEAN was written, documented, debugged, and verified. The procedures for utilizing CLEAN to investigate code performance were established and are discussed.

Ebel, William J.; Ingels, Frank M.



Investigation of Navier-Stokes Code Verification and Design Optimization  

NASA Technical Reports Server (NTRS)

With rapid progress made in employing computational techniques for various complex Navier-Stokes fluid flow problems, design optimization problems traditionally based on empirical formulations and experiments are now being addressed with the aid of computational fluid dynamics (CFD). To be able to carry out an effective CFD-based optimization study, it is essential that the uncertainty and appropriate confidence limits of the CFD solutions be quantified over the chosen design space. The present dissertation investigates the issues related to code verification, surrogate model-based optimization and sensitivity evaluation. For Navier-Stokes (NS) CFD code verification a least square extrapolation (LSE) method is assessed. This method projects numerically computed NS solutions from multiple, coarser base grids onto a freer grid and improves solution accuracy by minimizing the residual of the discretized NS equations over the projected grid. In this dissertation, the finite volume (FV) formulation is focused on. The interplay between the xi concepts and the outcome of LSE, and the effects of solution gradients and singularities, nonlinear physics, and coupling of flow variables on the effectiveness of LSE are investigated. A CFD-based design optimization of a single element liquid rocket injector is conducted with surrogate models developed using response surface methodology (RSM) based on CFD solutions. The computational model consists of the NS equations, finite rate chemistry, and the k-6 turbulence closure. With the aid of these surrogate models, sensitivity and trade-off analyses are carried out for the injector design whose geometry (hydrogen flow angle, hydrogen and oxygen flow areas and oxygen post tip thickness) is optimized to attain desirable goals in performance (combustion length) and life/survivability (the maximum temperatures on the oxidizer post tip and injector face and a combustion chamber wall temperature). A preliminary multi-objective optimization study is carried out using a geometric mean approach. Following this, sensitivity analyses with the aid of variance-based non-parametric approach and partial correlation coefficients are conducted using data available from surrogate models of the objectives and the multi-objective optima to identify the contribution of the design variables to the objective variability and to analyze the variability of the design variables and the objectives. In summary the present dissertation offers insight into an improved coarse to fine grid extrapolation technique for Navier-Stokes computations and also suggests tools for a designer to conduct design optimization study and related sensitivity analyses for a given design problem.

Vaidyanathan, Rajkumar



Investigation of a panel code for airframe/propeller integration analyses  

NASA Technical Reports Server (NTRS)

The Hess panel code was investigated as a procedure to predict the aerodynamic loading associated with propeller slipstream interference on the airframe. The slipstream was modeled as a variable onset flow to the lifting and nonlifting bodies treated by the code. Four sets of experimental data were used for comparisons with the code. The results indicate that the Hess code, in its present form, will give valid solutions for nonuniform onset flows which vary in direction only. The code presently gives incorrect solutions for flows with variations in velocity. Modifications to the code to correct this are discussed.

Miley, S. J.



Investigation of Error Concealment Using Different Transform Codings and Multiple Description Codings  

NASA Astrophysics Data System (ADS)

There has been increasing usage of Multiple Description Coding (MDC) for error concealment in non-ideal channels. A lot of ideas have been masterminded for MDC method up to now. This paper described the attempts to conceal the error and reconstruct the lost descriptions caused by combining MDC and lapped orthogonal transform (LOT). In this work LOT and other transforms codings (DCT and wavelet) are used to decorrelate the image pixels in the transform domain. LOT has better performance at low bit rates in comparison to DCT and wavelet transform. The results show that MSE for the proposed methods in comparison to DCT and wavelet have decreased significantly. The PSNR values of reconstructed images are high. The subjective evaluation of image is very good and clear. Furthermore, the standard deviations of reconstructed images are very small especially in low capacity channels.

Farzamnia, Ali; Syed-Yusof, Sharifah K.; Fisal, Norsheila; Abu-Bakar, Syed A. R.



Investigation of Bandwidth-Efficient Coding and Modulation Techniques  

NASA Technical Reports Server (NTRS)

The necessary technology was studied to improve the bandwidth efficiency of the space-to-ground communications network using the current capabilities of that network as a baseline. The study was aimed at making space payloads, for example the Hubble Space Telescope, more capable without the need to completely redesign the link. Particular emphasis was placed on the following concepts: (1) what the requirements are which are necessary to convert an existing standard 4-ary phase shift keying communications link to one that can support, as a minimum, 8-ary phase shift keying with error corrections applied; and (2) to determine the feasibility of using the existing equipment configurations with additional signal processing equipment to realize the higher order modulation and coding schemes.

Osborne, William P.



ADF95: Tool for automatic differentiation of a FORTRAN code designed for large numbers of independent variables  

NASA Astrophysics Data System (ADS)

ADF95 is a tool to automatically calculate numerical first derivatives for any mathematical expression as a function of user defined independent variables. Accuracy of derivatives is achieved within machine precision. ADF95 may be applied to any FORTRAN 77/90/95 conforming code and requires minimal changes by the user. It provides a new derived data type that holds the value and derivatives and applies forward differencing by overloading all FORTRAN operators and intrinsic functions. An efficient indexing technique leads to a reduced memory usage and a substantially increased performance gain over other available tools with operator overloading. This gain is especially pronounced for sparse systems with large number of independent variables. A wide class of numerical simulations, e.g., those employing implicit solvers, can profit from ADF95. Program summaryTitle of program:ADF95 Catalogue identifier: ADVI Program summary URL: Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Computer for which the program is designed: all platforms with a FORTRAN 95 compiler Programming language used:FORTRAN 95 No. of lines in distributed program, including test data, etc.: 3103 No. of bytes in distributed program, including test data, etc.: 9862 Distribution format: tar.gz Nature of problem: In many areas in the computational sciences first order partial derivatives for large and complex sets of equations are needed with machine precision accuracy. For example, any implicit or semi-implicit solver requires the computation of the Jacobian matrix, which contains the first derivatives with respect to the independent variables. ADF95 is a software module to facilitate the automatic computation of the first partial derivatives of any arbitrarily complex mathematical FORTRAN expression. The program exploits the sparsity inherited by many set of equations thereby enabling faster computations compared to alternate differentiation tools Solution method: A class is constructed which applies the chain rule of differentiation to any FORTRAN expression, to compute the first derivatives by forward differencing. An efficient indexing technique leads to a reduced memory usage and a substantially increased performance gain when sparsity can be exploited. From a users point of view, only minimal changes to his/her original code are needed in order to compute the first derivatives of any expression in the code Restrictions: Processor and memory hardware may restrict both the possible number of independent variables and the computation time Unusual features:ADF95 can operate on user code that makes use of the array features introduced in FORTRAN 90. A convenient extraction subroutine for the Jacobian matrix is also provided Running time: In many realistic cases, the evaluation of the first order derivatives of a mathematical expression is only six times slower compared to the evaluation of analytically derived and hard-coded expressions. The actual factor depends on the underlying set of equations for which derivatives are to be calculated, the number of independent variables, the sparsity and on the FORTRAN 95 compiler

Straka, Christian W.



An investigation on English/Chinese Code-switching in BBS in Chinese Alumni's Community.  

E-print Network

??The study, based on Myers-Scotton’s revised Markedness Model, investigates English/Chinese code-switching in BBS in Chinese alumni’s community, aiming to prove people are rational calculator when… (more)

Ge, Luqun



Investigation of Large Eddy Simulation Code Scaling Performance and Network Type Influence on a Linux PC Cluster  

Microsoft Academic Search

The parallel performance of a large eddy simulation computational fluid dynamics code was evaluated on a Pentium III PC cluster with 64 processors running under Linux. Scaling capability of the code and comparative code performance under Fast Ethernet, Myrinet, and Gigabit Ethernet were investigated. The code shows very good scalability for a scaled speedup test with wall clock time remaining

Boris M. Chernyavsky; Doyle D. Knight



Investigation of the Use of Erasures in a Concatenated Coding Scheme  

NASA Technical Reports Server (NTRS)

A new method for declaring erasures in a concatenated coding scheme is investigated. This method is used with the rate 1/2 K = 7 convolutional code and the (255, 223) Reed Solomon code. Errors and erasures Reed Solomon decoding is used. The erasure method proposed uses a soft output Viterbi algorithm and information provided by decoded Reed Solomon codewords in a deinterleaving frame. The results show that a gain of 0.3 dB is possible using a minimum amount of decoding trials.

Kwatra, S. C.; Marriott, Philip J.



A model to investigate the mechanisms underlying the emergence and development of independent sitting.  


When infants first begin to sit independently, they are highly unstable and unable to maintain upright sitting posture for more than a few seconds. Over the course of 3 months, the sitting ability of infants drastically improves. To investigate the mechanisms controlling the development of sitting posture, a single-degree-of-freedom inverted pendulum model was developed. Passive muscle properties were modeled with a stiffness and damping term, while active neurological control was modeled with a time-delayed proportional-integral-derivative (PID) controller. The findings of the simulations suggest that infants primarily utilize passive muscle stiffness to remain upright when they first begin to sit. This passive control mechanism allows the infant to remain upright so that active feedback control mechanisms can develop. The emergence of active control mechanisms allows infants to integrate sensory information into their movements so that they can exhibit more adaptive sitting. PMID:25442426

O'Brien, Kathleen M; Zhang, Jing; Walley, Philip R; Rhoads, Jeffrey F; Haddad, Jeffrey M; Claxton, Laura J



ALS beamlines for independent investigators: A summary of the capabilities and characteristics of beamlines at the ALS  

SciTech Connect

There are two mods of conducting research at the ALS: To work as a member of a participating research team (PRT). To work as a member of a participating research team (PRT); to work as an independent investigator; PRTs are responsible for building beamlines, end stations, and, in some cases, insertion devices. Thus, PRT members have privileged access to the ALS. Independent investigators will use beamline facilities made available by PRTs. The purpose of this handbook is to describe these facilities.

Not Available



Flight Investigation of Prescribed Simultaneous Independent Surface Excitations for Real-Time Parameter Identification  

NASA Technical Reports Server (NTRS)

Near real-time stability and control derivative extraction is required to support flight demonstration of Intelligent Flight Control System (IFCS) concepts being developed by NASA, academia, and industry. Traditionally, flight maneuvers would be designed and flown to obtain stability and control derivative estimates using a postflight analysis technique. The goal of the IFCS concept is to be able to modify the control laws in real time for an aircraft that has been damaged in flight. In some IFCS implementations, real-time parameter identification (PID) of the stability and control derivatives of the damaged aircraft is necessary for successfully reconfiguring the control system. This report investigates the usefulness of Prescribed Simultaneous Independent Surface Excitations (PreSISE) to provide data for rapidly obtaining estimates of the stability and control derivatives. Flight test data were analyzed using both equation-error and output-error PID techniques. The equation-error PID technique is known as Fourier Transform Regression (FTR) and is a frequency-domain real-time implementation. Selected results were compared with a time-domain output-error technique. The real-time equation-error technique combined with the PreSISE maneuvers provided excellent derivative estimation in the longitudinal axis. However, the PreSISE maneuvers as presently defined were not adequate for accurate estimation of the lateral-directional derivatives.

Moes, Timothy R.; Smith, Mark S.; Morelli, Eugene A.



User's manual for three dimensional FDTD version A code for scattering from frequency-independent dielectric materials  

NASA Technical Reports Server (NTRS)

The Finite Difference Time Domain Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). This manual provides a description of the code and corresponding results for the default scattering problem. In addition to the description, the operation, resource requirements, version A code capabilities, a description of each subroutine, a brief discussion of the radar cross section computations, and a discussion of the scattering results.

Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.




Microsoft Academic Search

\\u000a Numeric value assigned to textual data; e.g. for diagnoses: SNOMED (of the College of American Pathologists), ICD-9 c., ICD-10 c., read clinical classification for diagnoses, signs, symptoms and history; for adverse events: who-adverse reaction terminology\\/who-adverse reaction dictionary or FDA’s COSTART (Coding System for a Thesaurus of Adverse Reaction Terms); for coding medications or treatments resp.: who-drug Dictionary and drug reference

Gerhard Nahler


Investigation of aerodynamic characteristics of wings having vortex flow using different numerical codes  

NASA Technical Reports Server (NTRS)

The aerodynamic characteristics of highly sweptback wings having separation-induced vortex flow were investigated by employing different numerical codes with a view to determining some of the capabilities and limitations of these codes. Flat wings of various configurations-strake wing models, cropped, diamond, arrow and double delta wings, were studied. Cambered and cranked planforms have also been tested. The theoretical results predicted by the codes were compared with the experimental data, wherever possible, and found to agree favorably for most of the configurations investigated. However, large cambered wings could not be successfully modeled by the codes. It appears that the final solution in the free vortex sheet method is affected by the selection of the initial solution. Accumulated span loadings estimated for delta and diamond wings were found to be unusual in comparison with attached flow results in that the slopes of these load curves near the leading edge do not tend to infinity as they do in the case of attached flow.

Reddy, C. S.; Goglia, G. L.




PubMed Central

The study outlined in this article drew on Elijah Anderson’s (1999) code of the street perspective to examine the impact of neighborhood street culture on violent delinquency. Using data from more than 700 African American adolescents, we examined 1) whether neighborhood street culture predicts adolescent violence above and beyond an adolescent’s own street code values and 2) whether neighborhood street culture moderates individual-level street code values on adolescent violence. Consistent with Anderson’s hypotheses, neighborhood street culture significantly predicts violent delinquency independent of individual-level street code effects. Additionally, neighborhood street culture moderates individual-level street code values on violence in neighborhoods where the street culture is widespread. In particular, the effect of street code values on violence is enhanced in neighborhoods where the street culture is endorsed widely. PMID:21666759

Stewart, Eric A.; Simons, Ronald L.



User's manual for three dimensional FDTD version C code for scattering from frequency-independent dielectric and magnetic materials  

NASA Technical Reports Server (NTRS)

The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three-dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain (FDTD) technique. The supplied version of the code is one version of our current three-dimensional FDTD code set. The manual given here provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing radar cross section computations, a section discussing some scattering results, a new problem checklist, references, and figure titles.

Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.



User's manual for three dimensional FDTD version C code for scattering from frequency-independent dielectric and magnetic materials  

NASA Technical Reports Server (NTRS)

The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.

Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.



User's manual for three dimensional FDTD version A code for scattering from frequency-independent dielectric materials  

NASA Technical Reports Server (NTRS)

The Penn State Finite Difference Time Domain (FDTD) Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain technique. The supplied version of the code is one version of our current three dimensional FDTD code set. The manual provides a description of the code and the corresponding results for the default scattering problem. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version A code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONA.FOR), a section briefly discussing radar cross section (RCS) computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.



After a Long-Term Placement: Investigating Educational Achievement, Behaviour, and Transition to Independent Living  

ERIC Educational Resources Information Center

This study describes the transition towards independent living of 123 former fostered young people reared for long periods in a private French organisation, SOS Children's Villages. Three generations of care leavers were analysed through a postal survey and interviews. Their narratives show typical pathways after leaving care. Two-thirds became…

Dumaret, Annick-Camille; Donati, Pascale; Crost, Monique



Culture-Dependent and Independent Methods To Investigate the Microbial Ecology of Italian Fermented Sausages  

Microsoft Academic Search

In this study, the microbial ecology of three naturally fermented sausages produced in northeast Italy was studied by culture-dependent and -independent methods. By plating analysis, the predominance of lactic acid bacteria populations was pointed out, as well as the importance of coagulase-negative cocci. Also in the case of one fermentation, the fecal enterocci reached significant counts, highlighting their contribution to

Kalliopi Rantsiou; Rosalinda Urso; Lucilla Iacumin; Carlo Cantoni; Patrizia Cattaneo; Giuseppe Comi; Luca Cocolin



Investigation of Cool and Hot Executive Function in ODD/CD Independently of ADHD  

ERIC Educational Resources Information Center

Background: Children with oppositional defiant disorder/conduct disorder (ODD/CD) have shown deficits in "cool" abstract-cognitive, and "hot" reward-related executive function (EF) tasks. However, it is currently unclear to what extent ODD/CD is associated with neuropsychological deficits, independently of attention deficit hyperactivity disorder…

Hobson, Christopher W.; Scott, Stephen; Rubia, Katya



An investigation of design optimization using a 2-D viscous flow code with multigrid  

NASA Technical Reports Server (NTRS)

Computational fluid dynamics (CFD) codes have advanced to the point where they are effective analytical tools for solving flow fields around complex geometries. There is also a need for their use as a design tool to find optimum aerodynamic shapes. In the area of design, however, a difficulty arises due to the large amount of computer resources required by these codes. It is desired to streamline the design process so that a large number of design options and constraints can be investigated without overloading the system. There are several techniques which have been proposed to help streamline the design process. The feasibility of one of these techniques is investigated. The technique under consideration is the interaction of the geometry change with the flow calculation. The problem of finding the value of camber which maximizes the ratio of lift over drag for a particular airfoil is considered. In order to test out this technique, a particular optimization problem was tried. A NACA 0012 airfoil was considered at free stream Mach number of 0.5 with a zero angle of attack. Camber was added to the mean line of the airfoil. The goal was to find the value of camber for which the ratio of lift over drag is a maximum. The flow code used was FLOMGE which is a two dimensional viscous flow solver which uses multigrid to speed up convergence. A hyperbolic grid generation program was used to construct the grid for each value of camber.

Doria, Michael L.



An investigation on the capabilities of the PENELOPE MC code in nanodosimetry  

SciTech Connect

The Monte Carlo (MC) method has been widely implemented in studies of radiation effects on human genetic material. Most of these works have used specific-purpose MC codes to simulate radiation transport in condensed media. PENELOPE is one of the general-purpose MC codes that has been used in many applications related to radiation dosimetry. Based on the fact that PENELOPE can carry out event-by-event coupled electron-photon transport simulations following these particles down to energies of the order of few tens of eV, we have decided to investigate the capacities of this code in the field of nanodosimetry. Single and double strand break probabilities due to the direct impact of {gamma} rays originated from Co{sup 60} and Cs{sup 137} isotopes and characteristic x-rays, from Al and C K-shells, have been determined by use of PENELOPE. Indirect damage has not been accounted for in this study. A human genetic material geometrical model has been developed, taking into account five organizational levels. In an article by Friedland et al. [Radiat. Environ. Biophys. 38, 39-47 (1999)], a specific-purpose MC code and a very sophisticated DNA geometrical model were used. We have chosen that work as a reference to compare our results. Single and double strand-break probabilities obtained here underestimate those reported by Friedland and co-workers by 20%-76% and 50%-60%, respectively. However, we obtain RBE values for Cs{sup 137}, Al{sub K} and C{sub K} radiations in agreement with those reported in previous works [Radiat. Environ. Biophys. 38, 39-47 (1999)] and [Phys. Med. Biol. 53, 233-244 (2008)]. Some enhancements can be incorporated into the PENELOPE code to improve its results in the nanodosimetry field.

Bernal, M. A.; Liendo, J. A. [Departamento de Fisica, Universidad Simon Bolivar, P.O. Box 89000, Caracas (Venezuela, Bolivarian Republic of)



Investigation of in-band transmission of both spectral amplitude coding/optical code division multiple-access and wavelength division multiplexing signals  

NASA Astrophysics Data System (ADS)

The transmission of both optical code division multiple-access (OCDMA) and wavelength division multiplexing (WDM) users on the same band is investigated. Code pulses of spectral amplitude coding (SAC)/optical code division multiple-access (CDMA) are overlaid onto a multichannel WDM system. Notch filters are utilized in order to suppress the WDM interference signals for detection of optical broadband CDMA signals. Modified quadratic congruence (MQC) codes are used as the signature codes for the SAC/OCDMA system. The proposed system is simulated and its performance in terms of both the bit-error rate and Q-factor are determined. In addition, eavesdropper probability of error-free code detection is evaluated. Our results are compared to traditional nonhybrid systems. It is concluded that the proposed hybrid scheme still achieves acceptable performance. In addition, it provides enhanced data confidentiality as compared to the scheme with SAC/OCDMA only. It is also shown that the performance of the proposed system is limited by the interference of the WDM signals. Furthermore, the simulation illustrates the tradeoff between the performance and confidentiality for authorized users.

Ashour, Isaac A. M.; Shaari, Sahbudin; Shalaby, Hossam M. H.; Menon, P. Susthitha



Culture-Dependent and -Independent Investigations of Microbial Diversity on Urinary Catheters  

PubMed Central

Catheter-associated urinary tract infection is caused by bacteria, which ascend the catheter along its external or internal surface to the bladder and subsequently develop into biofilms on the catheter and uroepithelium. Antibiotic-treated bacteria and bacteria residing in biofilm can be difficult to culture. In this study we used culture-based and 16S rRNA gene-based culture-independent methods (fingerprinting, cloning, and pyrosequencing) to determine the microbial diversity of biofilms on 24 urinary catheters. Most of the patients were catheterized for <30 days and had undergone recent antibiotic treatment. In addition, the corresponding urine samples for 16 patients were cultured. We found that gene analyses of the catheters were consistent with cultures of the corresponding urine samples for the presence of bacteria but sometimes discordant for the identity of the species. Cultures of catheter tips detected bacteria more frequently than urine cultures and gene analyses; coagulase-negative staphylococci were, in particular, cultured much more often from catheter tips, indicating potential contamination of the catheter tips during sampling. The external and internal surfaces of 19 catheters were separately analyzed by molecular methods, and discordant results were found in six catheters, suggesting that bacterial colonization intra- and extraluminally may be different. Molecular analyses showed that most of the species identified in this study were known uropathogens, and infected catheters were generally colonized by one to two species, probably due to antibiotic usage and short-term catheterization. In conclusion, our data showed that culture-independent molecular methods did not detect bacteria from urinary catheters more frequently than culture-based methods. PMID:23015674

Xu, Yijuan; Moser, Claus; Al-Soud, Waleed Abu; Sørensen, Søren; Høiby, Niels; Nielsen, Per Halkjær



Culture-Dependent and -Independent Methods To Investigate the Microbial Ecology of Italian Fermented Sausages  

PubMed Central

In this study, the microbial ecology of three naturally fermented sausages produced in northeast Italy was studied by culture-dependent and -independent methods. By plating analysis, the predominance of lactic acid bacteria populations was pointed out, as well as the importance of coagulase-negative cocci. Also in the case of one fermentation, the fecal enterocci reached significant counts, highlighting their contribution to the particular transformation process. Yeast counts were higher than the detection limit (>100 CFU/g) in only one fermented sausage. Analysis of the denaturing gradient gel electrophoresis (DGGE) patterns and sequencing of the bands allowed profiling of the microbial populations present in the sausages during fermentation. The bacterial ecology was mainly characterized by the stable presence of Lactobacillus curvatus and Lactobacillus sakei, but Lactobacillus paracasei was also repeatedly detected. An important piece of evidence was the presence of Lactococcus garvieae, which clearly contributed in two fermentations. Several species of Staphylococcus were also detected. Regarding other bacterial groups, Bacillus sp., Ruminococcus sp., and Macrococcus caseolyticus were also identified at the beginning of the transformations. In addition, yeast species belonging to Debaryomyces hansenii, several Candida species, and Willopsis saturnus were observed in the DGGE gels. Finally, cluster analysis of the bacterial and yeast DGGE profiles highlighted the uniqueness of the fermentation processes studied. PMID:15812029

Rantsiou, Kalliopi; Urso, Rosalinda; Iacumin, Lucilla; Cantoni, Carlo; Cattaneo, Patrizia; Comi, Giuseppe; Cocolin, Luca



Culture-Independent Investigation of the Microbiome Associated with the Nematode Acrobeloides maximus  

PubMed Central

Background Symbioses between metazoans and microbes are widespread and vital to many ecosystems. Recent work with several nematode species has suggested that strong associations with microbial symbionts may also be common among members of this phylu. In this work we explore possible symbiosis between bacteria and the free living soil bacteriovorous nematode Acrobeloides maximus. Methodology We used a soil microcosm approach to expose A. maximus populations grown monoxenically on RFP labeled Escherichia coli in a soil slurry. Worms were recovered by density gradient separation and examined using both culture-independent and isolation methods. A 16S rRNA gene survey of the worm-associated bacteria was compared to the soil and to a similar analysis using Caenorhabditis elegans N2. Recovered A. maximus populations were maintained on cholesterol agar and sampled to examine the population dynamics of the microbiome. Results A consistent core microbiome was extracted from A. maximus that differed from those in the bulk soil or the C. elegans associated set. Three genera, Ochrobactrum, Pedobacter, and Chitinophaga, were identified at high levels only in the A. maximus populations, which were less diverse than the assemblage associated with C. elegans. Putative symbiont populations were maintained for at least 4 months post inoculation, although the levels decreased as the culture aged. Fluorescence in situ hybridization (FISH) using probes specific for Ochrobactrum and Pedobacter stained bacterial cells in formaldehyde fixed nematode guts. Conclusions Three microorganisms were repeatedly observed in association with Acrobeloides maximus when recovered from soil microcosms. We isolated several Ochrobactrum sp. and Pedobacter sp., and demonstrated that they inhabit the nematode gut by FISH. Although their role in A. maximus is not resolved, we propose possible mutualistic roles for these bacteria in protection of the host against pathogens and facilitating enzymatic digestion of other ingested bacteria. PMID:23894287

Baquiran, Jean-Paul; Thater, Brian; Sedky, Sammy; De Ley, Paul; Crowley, David; Orwin, Paul M.



3D PiC code investigations of Auroral Kilometric Radiation mechanisms  

NASA Astrophysics Data System (ADS)

Efficient (~1%) electron cyclotron radio emissions are known to originate in the X mode from regions of locally depleted plasma in the Earths polar magnetosphere. These emissions are commonly referred to as the Auroral Kilometric Radiation (AKR). AKR occurs naturally in these polar regions where electrons are accelerated by electric fields into the increasing planetary magnetic dipole. Here conservation of the magnetic moment converts axial to rotational momentum forming a horseshoe distribution in velocity phase space. This distribution is unstable to cyclotron emission with radiation emitted in the X-mode. Initial studies were conducted in the form of 2D PiC code simulations [1] and a scaled laboratory experiment that was constructed to reproduce the mechanism of AKR. As studies progressed, 3D PiC code simulations were conducted to enable complete investigation of the complex interaction dimensions. A maximum efficiency of 1.25% is predicted from these simulations in the same mode and frequency as measured in the experiment. This is also consistent with geophysical observations and the predictions of theory.

Gillespie, K. M.; McConville, S. L.; Speirs, D. C.; Ronald, K.; Phelps, A. D. R.; Bingham, R.; Cross, A. W.; Robertson, C. W.; Whyte, C. G.; He, W.; Vorgul, I.; Cairns, R. A.; Kellett, B. J.



Investigation of NOTCH4 coding region polymorphisms in sporadic inclusion body myositis.  


The NOTCH4 gene, located within the MHC region, is involved in cellular differentiation and has varying effects dependent on tissue type. Coding region polymorphisms haplotypic of the sIBM-associated 8.1 ancestral haplotype were identified in NOTCH4 and genotyped in two different Caucasian sIBM cohorts. In both cohorts the frequency of the minor allele of rs422951 and the 12-repeat variation for rs72555375 was increased and was higher than the frequency of the sIBM-associated allele HLA-DRB1*0301. These NOTCH4 polymorphisms can be considered to be markers for sIBM susceptibility, but require further investigation to determine whether they are directly involved in the disease pathogenesis. PMID:22732452

Scott, Adrian P; Laing, Nigel G; Mastaglia, Frank; Dalakas, Marinos; Needham, Merrilee; Allcock, Richard J N



Coding Variants at Hexa-allelic Amino Acid 13 of HLA-DRB1 Explain Independent SNP Associations with Follicular Lymphoma Risk  

PubMed Central

Non-Hodgkin lymphoma represents a diverse group of blood malignancies, of which follicular lymphoma (FL) is a common subtype. Previous genome-wide association studies (GWASs) have identified in the human leukocyte antigen (HLA) class II region multiple independent SNPs that are significantly associated with FL risk. To dissect these signals and determine whether coding variants in HLA genes are responsible for the associations, we conducted imputation, HLA typing, and sequencing in three independent populations for a total of 689 cases and 2,446 controls. We identified a hexa-allelic amino acid polymorphism at position 13 of the HLA-DR beta chain that showed the strongest association with FL within the major histocompatibility complex (MHC) region (multiallelic p = 2.3 × 10?15). Out of six possible amino acids that occurred at that position within the population, we classified two as high risk (Tyr and Phe), two as low risk (Ser and Arg), and two as moderate risk (His and Gly). There was a 4.2-fold difference in risk (95% confidence interval = 2.9–6.1) between subjects carrying two alleles encoding high-risk amino acids and those carrying two alleles encoding low-risk amino acids (p = 1.01 × 10?14). This coding variant might explain the complex SNP associations identified by GWASs and suggests a common HLA-DR antigen-driven mechanism for the pathogenesis of FL and rheumatoid arthritis. PMID:23791106

Foo, Jia Nee; Smedby, Karin E.; Akers, Nicholas K.; Berglund, Mattias; Irwan, Ishak D.; Jia, Xiaoming; Li, Yi; Conde, Lucia; Darabi, Hatef; Bracci, Paige M.; Melbye, Mads; Adami, Hans-Olov; Glimelius, Bengt; Khor, Chiea Chuen; Hjalgrim, Henrik; Padyukov, Leonid; Humphreys, Keith; Enblad, Gunilla; Skibola, Christine F.; de Bakker, Paul I.W.; Liu, Jianjun



The City Coding Project : an investigation into some presumed maxims for residential design in Hong Kong  

E-print Network

Formal expressions of architecture in a city are largely dictated by how the city is 'coded' ... re-coding - is capable of making fundamental changes in building forms that would proliferate across the entire city. Therefore, ...

Wong, Chit Kin Dickson



What shape are the neural response functions underlying opponent coding in face space? A psychophysical investigation.  


Recent evidence has shown that face space represents facial identity information using two-pool opponent coding. Here we ask whether the shape of the monotonic neural response functions underlying such coding is linear (i.e. face space codes all equal-sized physical changes with equal sensitivity) or nonlinear (e.g. face space shows greater coding sensitivity around the average face). Using adaptation aftereffects and pairwise discrimination tasks, our results for face attributes of eye height and mouth height demonstrate linear shape; including for bizarre faces far outside the normal range. We discuss how linear coding explains some results in the previous literature, including failures to find that adaptation enhances face discrimination, and suggest possible reasons why face space can maintain detailed coding of values far outside the normal range. We also discuss specific nonlinear coding models needed to explain other findings, and conclude face space appears to use a mixture of linear and nonlinear representations. PMID:19944116

Susilo, Tirta; McKone, Elinor; Edwards, Mark



Further Investigation of Acoustic Propagation Codes for Three-Dimensional Geometries  

NASA Technical Reports Server (NTRS)

The ability to predict fan noise within complex three-dimensional aircraft engine nacelle geometries is a valuable tool in designing and assessing low-noise concepts. This work begins a systematic study to identify the areas of the design space in which propagation codes of varying fidelity may be used effectively to provide efficient design and assessment. An efficient lower-fidelity code is used in conjunction with two higher-fidelity, more computationally intensive methods to solve benchmark problems of increasing complexity. The codes represent a small sampling of the current propagation codes available or under development. Results of this initial study indicate that the lower-fidelity code provides satisfactory results for cases involving low to moderate attenuation rates, whereas, the two higher-fidelity codes perform well across the range of problems.

Nark, Douglas M.; Watson, Willie R.; Jones, Michael G.



Investigation of independence in inter-animal tumor-type occurrences within the NTP rodent-bioassay database  

SciTech Connect

Statistically significant elevation in tumor incidence at multiple histologically distinct sites is occasionally observed among rodent bioassays of chemically induced carcinogenesis. If such data are to be relied on (as they have, e.g., by the US EPA) for quantitative cancer potency assessment, their proper analysis requires a knowledge of the extent to which multiple tumor-type occurrences are independent or uncorrelated within individual bioassay animals. Although difficult to assess in a statistically rigorous fashion, a few significant associations among tumor-type occurrences in rodent bioassays have been reported. However, no comprehensive studies of animal-specific tumor-type occurrences at death or sacrifice have been conducted using the extensive set of available NTP rodent-bioassay data, on which most cancer-potency assessment for environmental chemicals is currently based. This report presents the results of such an analysis conducted on behalf of the National Research Council`s Committee on Risk Assessment for Hazardous Air Pollutants. Tumor-type associations among individual animals were examined for {approximately}2500 to 3000 control and {approximately}200 to 600 treated animals using pathology data from 62 B6C3F1 mouse studies and 61 F/344N rat studies obtained from a readily available subset of the NTP carcinogenesis bioassay database. No evidence was found for any large correlation in either the onset probability or the prevalence-at-death or sacrifice of any tumor-type pair investigated in control and treated rats and niece, although a few of the small correlations present were statistically significant. Tumor-type occurrences were in most cases nearly independent, and departures from independence, where they did occur, were small. This finding is qualified in that tumor-type onset correlations were measured only indirectly, given the limited nature of the data analyzed.

Bogen, K.T. [Lawrence Livermore National Lab., CA (United States); Seilkop, S. [Analytical Sciences, Inc., Durham, NC (United States)



Investigation of Different Constituent Encoders in a Turbo-code Scheme for Reduced Decoder Complexity  

NASA Technical Reports Server (NTRS)

A large number of papers have been published attempting to give some analytical basis for the performance of Turbo-codes. It has been shown that performance improves with increased interleaver length. Also procedures have been given to pick the best constituent recursive systematic convolutional codes (RSCC's). However testing by computer simulation is still required to verify these results. This thesis begins by describing the encoding and decoding schemes used. Next simulation results on several memory 4 RSCC's are shown. It is found that the best BER performance at low E(sub b)/N(sub o) is not given by the RSCC's that were found using the analytic techniques given so far. Next the results are given from simulations using a smaller memory RSCC for one of the constituent encoders. Significant reduction in decoding complexity is obtained with minimal loss in performance. Simulation results are then given for a rate 1/3 Turbo-code with the result that this code performed as well as a rate 1/2 Turbo-code as measured by the distance from their respective Shannon limits. Finally the results of simulations where an inaccurate noise variance measurement was used are given. From this it was observed that Turbo-decoding is fairly stable with regard to noise variance measurement.

Kwatra, S. C.



Simulative Investigation on Spectral Efficiency of Unipolar Codes based OCDMA System using Importance Sampling Technique  

NASA Astrophysics Data System (ADS)

This paper analyses the spectral efficiency of Optical Code Division Multiple Access (OCDMA) system using Importance Sampling (IS) technique. We consider three configurations of OCDMA system namely Direct Sequence (DS), Spectral Amplitude Coding (SAC) and Fast Frequency Hopping (FFH) that exploits the Fiber Bragg Gratings (FBG) based encoder/decoder. We evaluate the spectral efficiency of the considered system by taking into consideration the effect of different families of unipolar codes for both coherent and incoherent sources. The results show that the spectral efficiency of OCDMA system with coherent source is higher than the incoherent case. We demonstrate also that DS-OCDMA outperforms both others in terms of spectral efficiency in all conditions.

Farhat, A.; Menif, M.; Rezig, H.



An investigation of aerodynamic characteristics of wings having vortex flow using different numerical codes  

NASA Technical Reports Server (NTRS)

Three different numerical codes are employed to determine the aerodynamic characteristics of wings with separation induced vortex flows. Both flat as well as cambered wings of various planform shapes are studied. The effects of wing thickness, fuselage, notch ratio and multiple vortex modeling on aerodynamic performance of the wing are also examined. The theoretically predicted results are compared with experimental results to validate the various computer codes used in this study. An analytical procedure for designing aerodynamically effective leading edge extension (LEE) for a thick delta wing is also presented.

Chaturvedi, S.; Ghaffari, F.




PubMed Central

Scholars have long argued that inmate behaviors stem in part from cultural belief systems that they “import” with them into incarcerative settings. Even so, few empirical assessments have tested this argument directly. Drawing on theoretical accounts of one such set of beliefs—the code of the street—and on importation theory, we hypothesize that individuals who adhere more strongly to the street code will be more likely, once incarcerated, to engage in violent behavior and that this effect will be amplified by such incarceration experiences as disciplinary sanctions and gang involvement, as well as the lack of educational programming, religious programming, and family support. We test these hypotheses using unique data that include measures of the street code belief system and incarceration experiences. The results support the argument that the code of the street belief system affects inmate violence and that the effect is more pronounced among inmates who lack family support, experience disciplinary sanctions, and are gang involved. Implications of these findings are discussed. PMID:24068837




The computer code for investigation of the multipactor discharge in RF cavities  

Microsoft Academic Search

The special code has been developed for numerical simulation of the multipactor discharge during long-time INR activity in the construction of accelerating structures. It simulates the secondary emission electron trajectories at different levels of rf field using a real electromagnetic field distribution (calculated with modern 2D or 3D software) in the cavity that has a complicated boundary shape. Special implementations

L. V. Kravchuk; G. V. Romanov; S. G. Tarasov; V. V. Paramonov



Investigating the impact of the cielo cray XE6 architecture on scientific application codes.  

SciTech Connect

Cielo, a Cray XE6, is the Department of Energy NNSA Advanced Simulation and Computing (ASC) campaign's newest capability machine. Rated at 1.37 PFLOPS, it consists of 8,944 dual-socket oct-core AMD Magny-Cours compute nodes, linked using Cray's Gemini interconnect. Its primary mission objective is to enable a suite of the ASC applications implemented using MPI to scale to tens of thousands of cores. Cielo is an evolutionary improvement to a successful architecture previously available to many of our codes, thus enabling a basis for understanding the capabilities of this new architecture. Using three codes strategically important to the ASC campaign, and supplemented with some micro-benchmarks that expose the fundamental capabilities of the XE6, we report on the performance characteristics and capabilities of Cielo.

Rajan, Mahesh; Barrett, Richard; Pedretti, Kevin Thomas Tauke; Doerfler, Douglas W.; Vaughan, Courtenay Thomas



Preliminary investigation of acoustic bar codes for short-range underwater communications  

NASA Astrophysics Data System (ADS)

In March 2005, underwater acoustic communications experiments were carried out from the DRDC Atlantic research vessel CFAV QUEST. A battery-operated BATS20 transmitter and a broadband barrel-stave flextensional transducer were used to broadcast noise containing acoustic bar code (ABC) information. The ABCs are silent frequency bands of fixed duration that resemble retail bar codes when viewed in a spectrogram. Two sites were selected for the experiments. The first was a shallow-water area west of the Berry Islands in the Bahamas, and the second was a deep-water site south of the Western Bank on the Scotian Shelf. Two receiver systems were deployed; autonomous, variable-buoyancy Stealth Buoys resting on the bottom at the shallow site, and drifting AN/SSQ-53F sonobuoys fitted with GPS at the deep site. Results from these experiments will be presented and future work will be discussed.

Jones, Dennis F.



An Investigation of Two Acoustic Propagation Codes for Three-Dimensional Geometries  

NASA Technical Reports Server (NTRS)

The ability to predict fan noise within complex three-dimensional aircraft engine nacelle geometries is a valuable tool in studying low-noise designs. Recent years have seen the development of aeroacoustic propagation codes using various levels of approximation to obtain such a capability. In light of this, it is beneficial to pursue a design paradigm that incorporates the strengths of the various tools. The development of a quasi-3D methodology (Q3D-FEM) at NASA Langley has brought these ideas to mind in relation to the framework of the CDUCT-LaRC acoustic propagation and radiation tool. As more extensive three dimensional codes become available, it would seem appropriate to incorporate these tools into a framework similar to CDUCT-LaRC and use them in a complementary manner. This work focuses on such an approach in beginning the steps toward a systematic assessment of the errors, and hence the trade-offs, involved in the use of these codes. To illustrate this point, CDUCT-LaRC was used to study benchmark hardwall duct problems to quantify errors caused by wave propagation in directions far removed from that defined by the parabolic approximation. Configurations incorporating acoustic treatment were also studied with CDUCT-LaRC and Q3D-FEM. The cases presented show that acoustic treatment diminishes the effects of CDUCT-LaRC phase error as the solutions are attenuated. The results of the Q3D-FEM were very promising and matched the analytic solution very well. Overall, these tests were meant to serve as a step toward the systematic study of errors inherent in the propagation module of CDUCT-LaRC, as well as an initial test of the higher fidelity Q3D-FEM code.

Nark, D. M.; Watson, W. R.; Jones, M. G.



Versatile code DLAYZ for investigating population kinetics and radiative properties of plasmas in non-local thermodynamic equilibrium  

NASA Astrophysics Data System (ADS)

A versatile code DLAYZ based on collisional-radiative model is developed for investigating the population kinetics and radiative properties of plasmas in non-local thermodynamic equilibrium. DLAYZ is implemented on the detailed level accounting (DLA) approach and can be extended to detailed configuration accounting (DCA) and hybrid DLA/DCA approaches. The code can treat both steady state and time-dependent problems. The implementation of the main modules of DLAYZ is discussed in detail including atomic data, rates, population distributions and radiative properties modules. The complete set of basic atomic data is obtained using relativistic quantum mechanics. For dense plasmas, the basic atomic data with plasma screening effects can be obtained. The populations are obtained by solving the coupled rate equations, which are used to calculate the radiative properties. A parallelized version is implemented in the code to treat the large-scale rate equations. Two illustrative examples of a steady state case for carbon plasmas and a time-dependent case for the relaxation of a K-shell excited argon are employed to show the main features of the present code.

Gao, Cheng; Zeng, Jiaolong; Li, Yongqiang; Jin, Fengtao; Yuan, Jianmin



Safety Related Investigations of the VVER-1000 Reactor Type by the Coupled Code System TRACE/PARCS  

NASA Astrophysics Data System (ADS)

This study was performed at the Institute of Reactor Safety at the Forschungszentrum Karlsruhe. It is embedded in the ongoing investigations of the international code assessment and maintenance program (CAMP) for qualification and validation of system codes like TRACE(1) and PARCS(2). The chosen reactor type used to validate these two codes was the Russian designed VVER-1000 because the OECD/NEA VVER-1000 Coolant Transient Benchmark Phase 2(3) includes detailed information of the Bulgarian nuclear power plant (NPP) Kozloduy unit 6. The post-test investigations of a coolant mixing experiment have shown that the predicted parameters (coolant temperature, pressure drop, etc.) are in good agreement with the measured data. The coolant mixing pattern, especially in the downcomer, has been also reproduced quiet well by TRACE. The coupled code system TRACE/PARCS which was applied on a postulated main steam line break (MSLB) provided good results compared to reference values and the ones of other participants of the benchmark. The results show that the developed three-dimensional nodalization of the reactor pressure vessel (RPV) is appropriate to describe the coolant mixing phenomena in the downcomer and the lower plenum of a VVER-1000 reactor. This phenomenon is a key issue for investigations of MSLB transient where the thermal hydraulics and the core neutronics are strongly linked. The simulation of the RPV and core behavior for postulated transients using the validated 3D TRACE RPV model, taking into account boundary conditions at vessel in- and outlet, indicates that the results are physically sound and in good agreement to other participant's results.

Jaeger, Wadim; Espinoza, Victor Hugo Sánchez; Lischke, Wolfgang


Bacterial toxin RelE mediates frequent codon-independent mRNA cleavage from the 5' end of coding regions in vivo.  


The enzymatic activity of the RelE bacterial toxin component of the Escherichia coli RelBE toxin-antitoxin system has been extensively studied in vitro and to a lesser extent in vivo. These earlier reports revealed that 1) RelE alone does not exhibit mRNA cleavage activity, 2) RelE mediates mRNA cleavage through its association with the ribosome, 3) RelE-mediated mRNA cleavage occurs at the ribosomal A site and, 4) Cleavage of mRNA by RelE exhibits high codon specificity. More specifically, RelE exhibits a preference for the stop codons UAG and UGA and sense codons CAG and UCG in vitro. In this study, we used a comprehensive primer extension approach to map the frequency and codon specificity of RelE cleavage activity in vivo. We found extensive cleavage at the beginning of the coding region of five transcripts, ompA, lpp, ompF, rpsA, and tufA. We then mapped RelE cleavage sites across one short transcript (lpp) and two long transcripts (ompF and ompA). RelE cut all of these transcripts frequently and efficiently within the first ?100 codons, only occasionally cut beyond this point, and rarely cut at sites in proximity to the 3' end. Among 196 RelE sites in these five transcripts, there was no preference for CAG or UCG sense codons. In fact, bioinformatic analysis of the RelE cleavage sites failed to identify any sequence preferences. These results suggest a model of RelE function distinct from those proposed previously, because RelE directed frequent codon-independent mRNA cleavage coincident with the commencement of translation elongation. PMID:21324908

Hurley, Jennifer M; Cruz, Jonathan W; Ouyang, Ming; Woychik, Nancy A



Investigation of the Fission Product Release From Molten Pools Under Oxidizing Conditions With the Code RELOS  

SciTech Connect

With the purpose of modeling and calculating the core behavior during severe accidents in nuclear power plants system codes are under development worldwide. Modeling of radionuclide release and transport in the case of beyond design basis accidents is an integrated feature of the deterministic safety analysis of nuclear power plants. Following a hypothetical, uncontrolled temperature escalation in the core of light water reactors, significant parts of the core structures may degrade and melt down under formation of molten pools, leading to an accumulation of large amounts of radioactive materials. The possible release of radionuclides from the molten pool provides a potential contribution to the aerosol source term in the late phase of core degradation accidents. The relevance of the amount of transferred oxygen from the gas atmosphere into the molten pool on the specification of a radionuclide and its release depends strongly on the initial oxygen inventory. Particularly for a low oxygen potential in the melt as it is the case for stratification when a metallic phase forms the upper layer and, respectively, when the oxidation has proceeded so far so that zirconium was completely oxidized, a significant influence of atmospheric oxygen on the specification and the release of some radionuclides has to be anticipated. The code RELOS (Release of Low Volatile Fission Products from Molten Surfaces) is under development at the Department of Energy Systems and Energy Economics (formerly Department of Nuclear and New Energy Systems) of the Ruhr-University Bochum. It is based on a mechanistic model to describe the diffusive and convective transport of fission products from the surface of a molten pool into a cooler gas atmosphere. This paper presents the code RELOS, i. e. the features and abilities of the latest code version V2.3 and the new model improvements of V2.4 and the calculated results evaluating the implemented models which deal with the oxygen transfer from the liquid side of the phase boundary to the bulk of the melt by diffusion or by taking into account natural convection. Both models help to estimate the amount of oxygen entering into the liquid upper pool volume and being available for the oxidation reaction. For both models the metallic, the oxidic and a mixture phase can be taken into account when defining the composition of the upper pool volume. The influence of crust formation, i. e. the decrease of the liquid pool surface area is taken care of because it yields the relevant amount of fission products released into the atmosphere. The difference of the partial density between the gas side of the phase boundary and the bulk of the gas phase is the driving force of mass transport. (authors)

Kleinhietpass, Ingo D.; Unger, Hermann; Wagner, Hermann-Josef; Koch, Marco K. [Ruhr-University Bochum, Postfach 10 21 48, 44721 Bochum, (Germany)



Investigating the Relationship between Field Independence-Dependence and Conditional Reasoning Performance within an Effective Instructional System.  

ERIC Educational Resources Information Center

College undergraduates (n=44) were given a measure of field independence-dependence prior to receiving instruction in conditional reasoning. Instruction incorporated various aspects of an earlier effective instructional system that was adapted for present learners and was incorporated into a larger, more typical classroom group. Results indicated…

Lane, David S., Jr.; Newman, Dianna L.


Flight investigation of cockpit-displayed traffic information utilizing coded symbology in an advanced operational environment  

NASA Technical Reports Server (NTRS)

Traffic symbology was encoded to provide additional information concerning the traffic, which was displayed on the pilot's electronic horizontal situation indicators (EHSI). A research airplane representing an advanced operational environment was used to assess the benefit of coded traffic symbology in a realistic work-load environment. Traffic scenarios, involving both conflict-free and conflict situations, were employed. Subjective pilot commentary was obtained through the use of a questionnaire and extensive pilot debriefings. These results grouped conveniently under two categories: display factors and task performance. A major item under the display factor category was the problem of display clutter. The primary contributors to clutter were the use of large map-scale factors, the use of traffic data blocks, and the presentation of more than a few airplanes. In terms of task performance, the cockpit-displayed traffic information was found to provide excellent overall situation awareness. Additionally, mile separation prescribed during these tests.

Abbott, T. S.; Moen, G. C.; Person, L. H., Jr.; Keyser, G. L., Jr.; Yenni, K. R.; Garren, J. F., Jr.



National evaluation of the benefits and risks of greater structuring and coding of the electronic health record: exploratory qualitative investigation  

PubMed Central

Objective We aimed to explore stakeholder views, attitudes, needs, and expectations regarding likely benefits and risks resulting from increased structuring and coding of clinical information within electronic health records (EHRs). Materials and methods Qualitative investigation in primary and secondary care and research settings throughout the UK. Data were derived from interviews, expert discussion groups, observations, and relevant documents. Participants (n=70) included patients, healthcare professionals, health service commissioners, policy makers, managers, administrators, systems developers, researchers, and academics. Results Four main themes arose from our data: variations in documentation practice; patient care benefits; secondary uses of information; and informing and involving patients. We observed a lack of guidelines, co-ordination, and dissemination of best practice relating to the design and use of information structures. While we identified immediate benefits for direct care and secondary analysis, many healthcare professionals did not see the relevance of structured and/or coded data to clinical practice. The potential for structured information to increase patient understanding of their diagnosis and treatment contrasted with concerns regarding the appropriateness of coded information for patients. Conclusions The design and development of EHRs requires the capture of narrative information to reflect patient/clinician communication and computable data for administration and research purposes. Increased structuring and/or coding of EHRs therefore offers both benefits and risks. Documentation standards within clinical guidelines are likely to encourage comprehensive, accurate processing of data. As data structures may impact upon clinician/patient interactions, new models of documentation may be necessary if EHRs are to be read and authored by patients. PMID:24186957

Morrison, Zoe; Fernando, Bernard; Kalra, Dipak; Cresswell, Kathrin; Sheikh, Aziz



The second deficit: An investigation of the independence of phonological and naming-speed deficits in developmental dyslexia  

Microsoft Academic Search

An increasing body of dyslexia researchdemonstrates, in addition to phonologicaldeficits, a second core deficit in theprocesses underlying naming speed. Thehypothesized independence of phonologicalawareness and naming-speed variables inpredicting variance in three aspects of readingperformance was studied in a group of 144severely-impaired readers in Grades 2 and 3. Stepwise regression analyses were conducted onthese variables, controlling for the effects ofSES, age, and

Maryanne Wolf; Alyssa Goldberg O'Rourke; Calvin Gidney; Maureen Lovett; Paul Cirino; Robin Morris



Investigation into the flow field around a maneuvering submarine using a Reynolds-averaged Navier-Stokes code  

NASA Astrophysics Data System (ADS)

The accurate and efficient prediction of hydrodynamic forces and moments on a maneuvering submarine has been achieved by investigating the flow physics involving the interaction of the vortical flow shed from the sail and the cross-flow boundary layer of the hull. In this investigation, a Reynolds-Averaged Navier-Stokes (RANS) computer code is used to simulate the most important physical effects related to maneuvering. It is applied to a generic axisymmetric body with the relatively simple case of the flow around an unappended hull at an angle of attack. After the code is validated for this simple case, it is validated for the case of a submarine with various appendages attached to the hull moving at an angle of drift. All six components of predicted forces and moments for various drift angles are compared with experimental data. Calculated pressure coefficients along the azimuthal angle are compared with experimental data and discussed to show the effect of the sail and the stern appendages. To understand the main flow features for a submarine in a straight flight, the RANS code is applied to simulate SUBOFF axisymmetric body at zero angle of attack in a straight-line basin. Pressure coefficient, skin friction coefficient, mean velocity components and the Reynolds shear stresses are compared with experimental data and discussed. The physical aspects of the interaction between the vortical flow shed by the sail and the cross-flow boundary layer on the hull are explained in greater detail. The understanding of this interaction is very important to characterize accurately the hydrodynamic behavior of a maneuvering submarine.

Rhee, Bong


Investigation of wellbore cooling by circulation and fluid penetration into the formation using a wellbore thermal simulator computer code  

SciTech Connect

The high temperatures of geothermal wells present severe problems for drilling, logging, and developing these reservoirs. Cooling the wellbore is perhaps the most common method to solve these problems. However, it is usually not clear what may be the most effective wellbore cooling mechanism for a given well. In this paper, wellbore cooling by the use of circulation or by fluid injection into the surrounding rock is investigated using a wellbore thermal simulator computer code. Short circulation times offer no prolonged cooling of fluid in the wellbore, but long circulation times (greater than ten or twenty days) greatly reduce the warming rate after shut-in. The dependence of the warming rate on the penetration distance of cooler temperatures into the rock formation (as by fluid injection) is investigated. Penetration distances of greater than 0.6 m appear to offer a substantial reduction in the warming rate. Several plots are shown which demonstrate these effects. 16 refs., 6 figs.

Duda, L.E.



Polyphasic Study of the Spatial Distribution of Microorganisms in Mexican Pozol, a Fermented Maize Dough, Demonstrates the Need for Cultivation-Independent Methods To Investigate Traditional Fermentations  

Microsoft Academic Search

The distribution of microorganisms in pozol balls, a fermented maize dough, was investigated by a polypha- sic approach in which we used both culture-dependent and culture-independent methods, including microbial enumeration, fermentation product analysis, quantification of microbial taxa with 16S rRNA-targeted oligo- nucleotide probes, determination of microbial fingerprints by denaturing gradient gel electrophoresis (DGGE), and 16S ribosomal DNA gene sequencing. Our




Investigation of Nuclear Data Libraries with TRIPOLI-4 Monte Carlo Code for Sodium-cooled Fast Reactors  

NASA Astrophysics Data System (ADS)

The Sodium-cooled fast neutron reactor ASTRID is currently under design and development in France. Traditional ECCO/ERANOS fast reactor code system used for ASTRID core design calculations relies on multi-group JEFF-3.1.1 data library. To gauge the use of ENDF/B-VII.0 and JEFF-3.1.1 nuclear data libraries in the fast reactor applications, two recent OECD/NEA computational benchmarks specified by Argonne National Laboratory were calculated. Using the continuous-energy TRIPOLI-4 Monte Carlo transport code, both ABR-1000 MWth MOX core and metallic (U-Pu) core were investigated. Under two different fast neutron spectra and two data libraries, ENDF/B-VII.0 and JEFF-3.1.1, reactivity impact studies were performed. Using JEFF-3.1.1 library under the BOEC (Beginning of equilibrium cycle) condition, high reactivity effects of 808 ± 17 pcm and 1208 ± 17 pcm were observed for ABR-1000 MOX core and metallic core respectively. To analyze the causes of these differences in reactivity, several TRIPOLI-4 runs using mixed data libraries feature allow us to identify the nuclides and the nuclear data accounting for the major part of the observed reactivity discrepancies.

Lee, Y.-K.; Brun, E.



Experimental investigation on security of temporal phase coding OCDMA system with code-shift keying and differential phase-shift keying  

Microsoft Academic Search

We experimentally demonstrate the security vulnerability in the temporal phase coding single-user differential phase-shift keying (DPSK) and code-shift keying (CSK) OCDMA systems with a DPSK demodulator. In the experiment, we build up the 2.5Gbit\\/s DPSK- and CSK-OCDMA systems. In the systems, we use two 31-chip 640 Gchip\\/s superstructured fiber Bragg grating encoders for the signal encoding. In the receiving side,

Bo Dai; Zhensen Gao; Xu Wang; Nobuyuki Kataoka; Naoya Wada



Manipulation of independent synthesis and degradation of polyphosphate in Escherichia coli for investigation of phosphate secretion from the cell.  

PubMed Central

The genes involved in polyphosphate metabolism in Escherichia coli were cloned behind different inducible promoters on separate plasmids. The gene coding for polyphosphate kinase (PPK), the enzyme responsible for polyphosphate synthesis, was placed behind the Ptac promoter. Polyphosphatase, a polyphosphate depolymerase, was similarly expressed by using the arabinose-inducible PBAD promoter. The ability of cells containing these constructs to produce active enzymes only when induced was confirmed by polyphosphate extraction, enzyme assays, and RNA analysis. The inducer concentrations giving optimal expression of each enzyme were determined. Experiments were performed in which ppk was induced early in growth, overproducing PPK and allowing large amounts of polyphosphate to accumulate (80 mumol in phosphate monomer units per g of dry cell weight). The ppx gene was subsequently induced, and polyphosphate was degraded to inorganic phosphate. Approximately half of this polyphosphate was depleted in 210 min. The phosphate released from polyphosphate allowed the growth of phosphate-starved cells and was secreted into the medium, leading to a down-regulation of the phosphate-starvation response. In addition, the steady-state polyphosphate level was precisely controlled by manipulating the degree of ppx induction. The polyphosphate content varied from 98 to 12 mumol in phosphate monomer units per g of dry cell weight as the arabinose concentration was increased from 0 to 0.02% by weight. PMID:9143103

Van Dien, S J; Keyhani, S; Yang, C; Keasling, J D



Utilization of a Photon Transport Code to Investigate Radiation Therapy Treatment Planning Quantities and Techniques.  

NASA Astrophysics Data System (ADS)

A versatile computer program MORSE, based on neutron and photon transport theory has been utilized to investigate radiation therapy treatment planning quantities and techniques. A multi-energy group representation of transport equation provides a concise approach in utilizing Monte Carlo numerical techniques to multiple radiation therapy treatment planning problems. A general three dimensional geometry is used to simulate radiation therapy treatment planning problems in configurations of an actual clinical setting. Central axis total and scattered dose distributions for homogeneous and inhomogeneous water phantoms are calculated and the correction factor for lung and bone inhomogeneities are also evaluated. Results show that Monte Carlo calculations based on multi-energy group transport theory predict the depth dose distributions that are in good agreement with available experimental data. Improved correction factors based on the concepts of lung-air-ratio and bone-air-ratio are proposed in lieu of the presently used correction factors that are based on tissue-air-ratio power law method for inhomogeneity corrections. Central axis depth dose distributions for a bremsstrahlung spectrum from a linear accelerator is also calculated to exhibit the versatility of the computer program in handling multiple radiation therapy problems. A novel approach is undertaken to study the dosimetric properties of brachytherapy sources. Dose rate constants for various radionuclides are calculated from the numerically generated dose rate versus source energy curves. Dose rates can also be generated for any point brachytherapy source with any arbitrary energy spectrum at various radial distances from this family of curves.

Palta, Jatinder Raj


Error-correction coding  

NASA Technical Reports Server (NTRS)

This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

Hinds, Erold W. (Principal Investigator)



Independent Technical Investigation of the Puna Geothermal Venture Unplanned Steam Release, June 12 and 13, 1991, Puna, Hawaii  

SciTech Connect

On June 24, 1991, a third-party investigation team consisting of Richard P. Thomas, Duey E. Milner, James L. Moore, and Dick Whiting began an investigation into the blowout of well KS-8, which occurred at the Puna Geothermal Venture (PGV) site on June 12, 1991, and caused the unabated release of steam for a period of 31 hours before PGV succeeded in closing in the well. The scope of the investigation was to: (a) determine the cause(s) of the incident; (b) evaluate the adequacy of PGVs drilling and blowout prevention equipment and procedures; and (c) make recommendations for any appropriate changes in equipment and/or procedures. This report finds that the blowout occurred because of inadequacies in PGVs drilling plan and procedures and not as a result of unusual or unmanageable subsurface geologic or hydrologic conditions. While the geothermal resource in the area being drilled is relatively hot, the temperatures are not excessive for modem technology and methods to control. Fluid pressures encountered are also manageable if proper procedures are followed and the appropriate equipment is utilized. A previous blowout of short duration occurred on February 21, 1991, at the KS-7 injection well being drilled by PGV at a depth of approximately 1600'. This unexpected incident alerted PGV to the possibility of encountering a high temperature, fractured zone at a relatively shallow depth. The experience at KS-7 prompted PGV to refine its hydrological model; however, the drilling plan utilized for KS-8 was not changed. Not only did PGV fail to modify its drilling program following the KS-7 blowout, but they also failed to heed numerous ''red flags'' (warning signals) in the five days preceding the KS-8 blowout, which included a continuous 1-inch flow of drilling mud out of the wellbore, gains in mud volume while pulling stands, and gas entries while circulating muds bottoms up, in addition to lost circulation that had occurred earlier below the shoe of the 13-3/8-hch casing.

Thomas, Richard; Whiting, Dick; Moore, James; Milner, Duey



Experimental investigation on security of temporal phase coding OCDMA system with code-shift keying and differential phase-shift keying  

Microsoft Academic Search

We experimentally demonstrate security vulnerability in the temporal phase coding single-user DPSK- and CSK-OCDMA systems with a DPSK demodulator. In the systems without proper decoding, error free BER performance and clear open eye diagrams indicate the eavesdropping possibility for both systems. We also discuss the principle of DPSK demodulation attack.

Bo Dai; Zhensen Gao; Xu Wang; Nobuyuki Kataoka; Naoya Wada



Experimental investigation on security of temporal phase coding OCDMA system with code-shift keying and differential phase-shift keying  

NASA Astrophysics Data System (ADS)

We experimentally demonstrate the security vulnerability in the temporal phase coding single-user differential phase-shift keying (DPSK) and code-shift keying (CSK) OCDMA systems with a DPSK demodulator. In the experiment, we build up the 2.5Gbit/s DPSK- and CSK-OCDMA systems. In the systems, we use two 31-chip 640 Gchip/s superstructured fiber Bragg grating encoders for the signal encoding. In the receiving side, we remove the decoders and utilize the DPSK demodulator to detect the encoded signals directly. We successfully achieve the error-free BER performance and obtaine the clear open eye diagrams using the detection without the proper decoding. It indicates the existence of the eavesdropping vulnerability in the both systems. Furthermore, we also discuss the principle of DPSK demodulation attack.

Dai, Bo; Gao, Zhensen; Wang, Xu; Kataoka, Nobuyuki; Wada, Naoya



Investigating mitochondrial metabolism in contracting HL-1 cardiomyocytes following hypoxia and pharmacological HIF activation identifies HIF-dependent and independent mechanisms of regulation.  


Hypoxia is a consequence of cardiac disease and downregulates mitochondrial metabolism, yet the molecular mechanisms through which this occurs in the heart are incompletely characterized. Therefore, we aimed to use a contracting HL-1 cardiomyocyte model to investigate the effects of hypoxia on mitochondrial metabolism. Cells were exposed to hypoxia (2% O2) for 6, 12, 24, and 48 hours to characterize the metabolic response. Cells were subsequently treated with the hypoxia inducible factor (HIF)-activating compound, dimethyloxalylglycine (DMOG), to determine whether hypoxia-induced mitochondrial changes were HIF dependent or independent, and to assess the suitability of this cultured cardiac cell line for cardiovascular pharmacological studies. Hypoxic cells had increased glycolysis after 24 hours, with glucose transporter 1 and lactate levels increased 5-fold and 15-fold, respectively. After 24 hours of hypoxia, mitochondrial networks were more fragmented but there was no change in citrate synthase activity, indicating that mitochondrial content was unchanged. Cellular oxygen consumption was 30% lower, accompanied by decreases in the enzymatic activities of electron transport chain (ETC) complexes I and IV, and aconitase by 81%, 96%, and 72%, relative to controls. Pharmacological HIF activation with DMOG decreased cellular oxygen consumption by 43%, coincident with decreases in the activities of aconitase and complex I by 26% and 30%, indicating that these adaptations were HIF mediated. In contrast, the hypoxia-mediated decrease in complex IV activity was not replicated by DMOG treatment, suggesting HIF-independent regulation of this complex. In conclusion, 24 hours of hypoxia increased anaerobic glycolysis and decreased mitochondrial respiration, which was associated with changes in ETC and tricarboxylic acid cycle enzyme activities in contracting HL-1 cells. Pharmacological HIF activation in this cardiac cell line allowed both HIF-dependent and independent mitochondrial metabolic changes to be identified. PMID:24607765

Ambrose, Lucy J A; Abd-Jamil, Amira H; Gomes, Renata S M; Carter, Emma E; Carr, Carolyn A; Clarke, Kieran; Heather, Lisa C



Polyphasic Study of the Spatial Distribution of Microorganisms in Mexican Pozol, a Fermented Maize Dough, Demonstrates the Need for Cultivation-Independent Methods To Investigate Traditional Fermentations  

PubMed Central

The distribution of microorganisms in pozol balls, a fermented maize dough, was investigated by a polyphasic approach in which we used both culture-dependent and culture-independent methods, including microbial enumeration, fermentation product analysis, quantification of microbial taxa with 16S rRNA-targeted oligonucleotide probes, determination of microbial fingerprints by denaturing gradient gel electrophoresis (DGGE), and 16S ribosomal DNA gene sequencing. Our results demonstrate that DGGE fingerprinting and rRNA quantification should allow workers to precisely and rapidly characterize the microbial assemblage in a spontaneous lactic acid fermented food. Lactic acid bacteria (LAB) accounted for 90 to 97% of the total active microflora; no streptococci were isolated, although members of the genus Streptococcus accounted for 25 to 50% of the microflora. Lactobacillus plantarum and Lactobacillus fermentum, together with members of the genera Leuconostoc and Weissella, were the other dominant organisms. The overall activity was more important at the periphery of a ball, where eucaryotes, enterobacteria, and bacterial exopolysacharide producers developed. Our results also showed that the metabolism of heterofermentative LAB was influenced in situ by the distribution of the LAB in the pozol ball, whereas homolactic fermentation was controlled primarily by sugar limitation. We propose that starch is first degraded by amylases from LAB and that the resulting sugars, together with the lactate produced, allow a secondary flora to develop in the presence of oxygen. Our results strongly suggest that cultivation-independent methods should be used to study traditional fermented foods. PMID:10584005

Ampe, Frédéric; ben Omar, Nabil; Moizan, Claire; Wacher, Carmen; Guyot, Jean-Pierre



Validation and application of the WABE code: Investigations of constitutive laws and 2D effects on debris coolability  

Microsoft Academic Search

The WABE-2D model aims at the problem of coolability of degraded core material during a severe accident in a light water reactor (LWR) and describes the transient boil-off and quenching behavior of debris beds. It is being developed in the frame of the KESS code system, which is considered to describe the processes of core heatup, melting, degradation and relocation

Manfred Bürger; Michael Buck; Werner Schmidt; Walter Widmann



Supporting the cybercrime investigation process: Effective discrimination of source code authors based on byte-level information  

Microsoft Academic Search

Source code authorship analysis is the particular field that attempts to identify the author of a computer program by treating each program as a linguistically analyzable entity. This is usually based on other undisputed program samples from the same author. There are several cases where the application of such a method could be of a major benefit, such as tracing

Georgia Frantzeskou; Efstathios Stamatatos; Stefanos Gritzalis



Supporting the Cybercrime Investigation Process: Effective Discrimination of Source Code Authors Based on Byte-Level Information  

NASA Astrophysics Data System (ADS)

Source code authorship analysis is the particular field that attempts to identify the author of a computer program by treating each program as a linguistically analyzable entity. This is usually based on other undisputed program samples from the same author. There are several cases where the application of such a method could be of a major benefit, such as tracing the source of code left in the system after a cyber attack, authorship disputes, proof of authorship in court, etc. In this paper, we present our approach which is based on byte-level n-gram profiles and is an extension of a method that has been successfully applied to natural language text authorship attribution. We propose a simplified profile and a new similarity measure which is less complicated than the algorithm followed in text authorship attribution and it seems more suitable for source code identification since is better able to deal with very small training sets. Experiments were performed on two different data sets, one with programs written in C++ and the second with programs written in Java. Unlike the traditional language-dependent metrics used by previous studies, our approach can be applied to any programming language with no additional cost. The presented accuracy rates are much better than the best reported results for the same data sets.

Frantzeskou, Georgia; Stamatatos, Efstathios; Gritzalis, Stefanos



Microsoft Academic Search

We organised a two-day workshop to promote our multi-expertise collaborative project on Chiral Symmetry Breaking in the strong interactions. Including the six investigators in our group, there were twenty-eight participants. Twelve talks were presented, reporting on novel insights and methods: AdS\\/CFT duality of large-Nc QCD to a gravity on AdS5, lattice-QCD and Dyson-Schwinger approaches. The possibility of a separation between

Craig Roberts; Donald Sinclair; Jeff Harvey; David Kutasov; Sophia Domokos; Carlos Wagner



The Utility of CBM Written Language Indices: An Investigation of Production-Dependent, Production-Independent, and Accurate-Production Scores  

ERIC Educational Resources Information Center

This study examined the utility of three categories of CBM written language indices including production-dependent indices (Total Words Written, Words Spelled Correctly, and Correct Writing Sequences), production-independent indices (Percentage of Words Spelled Correctly and Percentage of Correct Writing Sequences), and an accurate-production…

Jewell, Jennifer; Malecki, Christine Kerres



fMRI Investigation of Working Memory for Faces in Autism: Visual Coding and Underconnectivity with Frontal Areas  

Microsoft Academic Search

Brain activation and functional connectivity were investigated in high functioning autism using functional magnetic resonance imaging in an n-back working memory task involving photographic face stimuli. The autism group showed reliably lower activation compared with controls in the inferior left prefrontal area (involved in verbal processing and working memory maintenance) and the right posterior temporal area (associated with theory of

Hideya Koshino; Rajesh K. Kana; Timothy A. Keller; Nancy J. Minshew; Marcel Adam Just


Residential wire codes: reproducibility and relation with measured magnetic fields  

PubMed Central

OBJECTIVES: To investigate the reproducibility of wire codes to characterise residential power line configurations and to determine the extent to which wire codes provide a proxy measure of residential magnetic field strength in a case-control study of childhood leukaemia conducted in nine states within the United States. METHODS: Misclassification of wire codes was assessed with independent measurements by two technicians for 187 residences. The association between categories of wire code and measured level of magnetic field was evaluated in 858 residences with both a wire code measurement and a 24 hour measurement of the magnetic field in the bedroom. The strength of the association between category of wire code and risk of leukaemia was examined in two regions with different average levels of magnetic field in homes with high categories of wire code. RESULTS: The reproducibility of any of three different classifications of wire codes was excellent (kappa > or = 0.89). Mean and median magnetic fields, and the percentage of homes with high magnetic fields increased with increasing category for each of the wire code classification schemes. The size of the odds ratios for risk of leukaemia and high categories of wire code did not reflect the mean levels of the magnetic field in those categories in two study regions. CONCLUSION: Misclassification of categories of wire code is not a major source of bias in the study. Wire codes provide a proxy measure of exposure to residential magnetic fields. If magnetic fields were a risk factor for leukaemia, however, there would be some attenuation of risk estimates based on wire codes because of misclassification of exposure to magnetic fields at both extremes of the wire code range. The lack of an association between high categories of wire code and risk of leukaemia cannot be explained by a failure of the wire code classification schemes to estimate exposure to magnetic fields in the study area.   PMID:9764111

Tarone, R. E.; Kaune, W. T.; Linet, M. S.; Hatch, E. E.; Kleinerman, R. A.; Robison, L. L.; Boice, J. D.; Wacholder, S.



Tokamak Systems Code  

SciTech Connect

The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the imput to or output from the module remains unchanged.

Reid, R.L.; Barrett, R.J.; Brown, T.G.; Gorker, G.E.; Hooper, R.J.; Kalsi, S.S.; Metzler, D.H.; Peng, Y.K.M.; Roth, K.E.; Spampinato, P.T.



MORSE Monte Carlo code  

SciTech Connect

The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.

Cramer, S.N.



Codes of ethics  

Microsoft Academic Search

Partly as a result of much recent evidence of business and government crime, a large proportion of major corporations have adopted codes of ethics; government service is also making more use of them. The electrical manufacturing anti-trust conspiracy and 1973–1976 investigation of foreign and domestic bribery were immediate prods. There are also government codes of which the ASPA code is

George C. S. Benson



System and method for investigating sub-surface features of a rock formation with acoustic sources generating coded signals  


A system and a method for investigating rock formations includes generating, by a first acoustic source, a first acoustic signal comprising a first plurality of pulses, each pulse including a first modulated signal at a central frequency; and generating, by a second acoustic source, a second acoustic signal comprising a second plurality of pulses. A receiver arranged within the borehole receives a detected signal including a signal being generated by a non-linear mixing process from the first-and-second acoustic signal in a non-linear mixing zone within the intersection volume. The method also includes-processing the received signal to extract the signal generated by the non-linear mixing process over noise or over signals generated by a linear interaction process, or both.

Vu, Cung Khac; Nihei, Kurt; Johnson, Paul A; Guyer, Robert; Ten Cate, James A; Le Bas, Pierre-Yves; Larmat, Carene S



Investigating the role of rare coding variability in Mendelian dementia genes (APP, PSEN1, PSEN2, GRN, MAPT, and PRNP) in late-onset Alzheimer's disease.  


The overlapping clinical and neuropathologic features between late-onset apparently sporadic Alzheimer's disease (LOAD), familial Alzheimer's disease (FAD), and other neurodegenerative dementias (frontotemporal dementia, corticobasal degeneration, progressive supranuclear palsy, and Creutzfeldt-Jakob disease) raise the question of whether shared genetic risk factors may explain the similar phenotype among these disparate disorders. To investigate this intriguing hypothesis, we analyzed rare coding variability in 6 Mendelian dementia genes (APP, PSEN1, PSEN2, GRN, MAPT, and PRNP), in 141 LOAD patients and 179 elderly controls, neuropathologically proven, from the UK. In our cohort, 14 LOAD cases (10%) and 11 controls (6%) carry at least 1 rare variant in the genes studied. We report a novel variant in PSEN1 (p.I168T) and a rare variant in PSEN2 (p.A237V), absent in controls and both likely pathogenic. Our findings support previous studies, suggesting that (1) rare coding variability in PSEN1 and PSEN2 may influence the susceptibility for LOAD and (2) GRN, MAPT, and PRNP are not major contributors to LOAD. Thus, genetic screening is pivotal for the clinical differential diagnosis of these neurodegenerative dementias. PMID:25104557

Sassi, Celeste; Guerreiro, Rita; Gibbs, Raphael; Ding, Jinhui; Lupton, Michelle K; Troakes, Claire; Al-Sarraj, Safa; Niblock, Michael; Gallo, Jean-Marc; Adnan, Jihad; Killick, Richard; Brown, Kristelle S; Medway, Christopher; Lord, Jenny; Turton, James; Bras, Jose; Morgan, Kevin; Powell, John F; Singleton, Andrew; Hardy, John



Material-dependent and material-independent selection processes in the frontal and parietal lobes: an event-related fMRI investigation of response competition  

NASA Technical Reports Server (NTRS)

The present study used the flanker task [Percept. Psychophys. 16 (1974) 143] to identify neural structures that support response selection processes, and to determine which of these structures respond differently depending on the type of stimulus material associated with the response. Participants performed two versions of the flanker task while undergoing event-related functional magnetic resonance imaging (fMRI). Both versions of the task required participants to respond to a central stimulus regardless of the responses associated with simultaneously presented flanking stimuli, but one used colored circle stimuli and the other used letter stimuli. Competition-related activation was identified by comparing Incongruent trials, in which the flanker stimuli indicated a different response than the central stimulus, to Neutral stimuli, in which the flanker stimuli indicated no response. A region within the right inferior frontal gyrus exhibited significantly more competition-related activation for the color stimuli, whereas regions within the middle frontal gyri of both hemispheres exhibited more competition-related activation for the letter stimuli. The border of the right middle frontal and inferior frontal gyri and the anterior cingulate cortex (ACC) were significantly activated by competition for both types of stimulus materials. Posterior foci demonstrated a similar pattern: left inferior parietal cortex showed greater competition-related activation for the letters, whereas right parietal cortex was significantly activated by competition for both materials. These findings indicate that the resolution of response competition invokes both material-dependent and material-independent processes.

Hazeltine, Eliot; Bunge, Silvia A.; Scanlon, Michael D.; Gabrieli, John D E.



Is ADHD a Risk Factor Independent of Conduct Disorder for Illicit Substance Use? A Meta-Analysis and Meta-Regression Investigation  

ERIC Educational Resources Information Center

Objective: To investigate meta-analytically if the association between ADHD and illicit substance use (ISU) is maintained when controlling for conduct disorder/oppositional-defiant disorder (CD/ODD). Method: A systematic literature review was conducted through Medline from 1980 to 2008. Data extracted and selections made by one author were…

Serra-Pinheiro, Maria Antonia; Coutinho, Evandro S. F.; Souza, Isabella S.; Pinna, Camilla; Fortes, Didia; Araujo, Catia; Szobot, Claudia M.; Rohde, Luis A.; Mattos, Paulo



Statistical mediation analysis with a multicategorical independent variable.  


Virtually all discussions and applications of statistical mediation analysis have been based on the condition that the independent variable is dichotomous or continuous, even though investigators frequently are interested in testing mediation hypotheses involving a multicategorical independent variable (such as two or more experimental conditions relative to a control group). We provide a tutorial illustrating an approach to estimation of and inference about direct, indirect, and total effects in statistical mediation analysis with a multicategorical independent variable. The approach is mathematically equivalent to analysis of (co)variance and reproduces the observed and adjusted group means while also generating effects having simple interpretations. Supplementary material available online includes extensions to this approach and Mplus, SPSS, and SAS code that implements it. PMID:24188158

Hayes, Andrew F; Preacher, Kristopher J



Laplacian sparse coding, Hypergraph Laplacian sparse coding, and applications.  


Sparse coding exhibits good performance in many computer vision applications. However, due to the overcomplete codebook and the independent coding process, the locality and the similarity among the instances to be encoded are lost. To preserve such locality and similarity information, we propose a Laplacian sparse coding (LSc) framework. By incorporating the similarity preserving term into the objective of sparse coding, our proposed Laplacian sparse coding can alleviate the instability of sparse codes. Furthermore, we propose a Hypergraph Laplacian sparse coding (HLSc), which extends our Laplacian sparse coding to the case where the similarity among the instances defined by a hypergraph. Specifically, this HLSc captures the similarity among the instances within the same hyperedge simultaneously, and also makes the sparse codes of them be similar to each other. Both Laplacian sparse coding and Hypergraph Laplacian sparse coding enhance the robustness of sparse coding. We apply the Laplacian sparse coding to feature quantization in Bag-of-Words image representation, and it outperforms sparse coding and achieves good performance in solving the image classification problem. The Hypergraph Laplacian sparse coding is also successfully used to solve the semi-auto image tagging problem. The good performance of these applications demonstrates the effectiveness of our proposed formulations in locality and similarity preservation. PMID:22392702

Gao, Shenghua; Tsang, Ivor Wai-Hung; Chia, Liang-Tien



Proceedings of the SMBE Tri-National Young Investigators' Workshop 2005. Relaxation of functional constraint on light-independent protochlorophyllide oxidoreductase in Thuja.  


The light-independent protochlorophyllide oxidoreductase (DPOR) plays a key role in the ability of nonflowering plants and algae to synthesize chlorophyll in darkness. This enzyme consists of three subunits encoded by the chlB, chlL, and chlN genes in the plastid genome. Previously, we found a high nonsynonymous substitution rate (dN) of the chlL gene in the lineage of Thuja standishii, a conifer belonging to the Cupressaceae. Here we revealed that the acceleration of dN in the chlL occurred as well in other species of Thuja, Thuja occidentalis and Thuja plicata. In addition, dark-grown seedlings of T. occidentalis were found to exhibit a pale yellowish color, and their chlorophyll concentration was much lower than that of other species of Cupressaceae. The results suggested that the species of Thuja have lost the ability to synthesize chlorophyll in darkness, and the functional constraint on the DPOR would thus be expected to be relaxed in this genus. Therefore, we expected to find that the evolutionary rates of all subunits of DPOR would in this case be accelerated. Sequence analyses of the chlN and chlB (encoding the other subunits of DPOR) in 18 species of Cupressaceae revealed that the dN of the chlN gene was accelerated in Thuja as was the dN of the chlL gene, but the dN of the chlB gene did not appear to differ significantly among the species of Cupressaceae. Sequencing of reverse transcription-polymerase chain reaction (RT-PCR) products of these genes showed that RNA editing was rare and unlikely to have contributed to the acceleration. Moreover, the RT-PCR analysis indicated that all chl genes were still transcriptionally active in T. occidentalis. Based on these results, it appears that species of Thuja still bear the DPOR protein, although the enzyme has lost its activity because of nonsynonymous mutations of some of the chl genes. The lack of acceleration of the dN of the chlB gene might be accounted for by various unknown functions of its gene product. PMID:16428257

Kusumi, Junko; Sato, Aya; Tachida, Hidenori



An investigation of the potential for the use of a high resolution adaptive coded aperture system in the mid-wave infrared  

NASA Astrophysics Data System (ADS)

Previous applications of coded aperture imaging (CAI) have been mainly in the energetic parts of the electro-magnetic spectrum, such as gamma ray astronomy, where few viable imaging alternatives exist. In addition, resolution requirements have typically been low (~ mrad). This paper investigates the prospects for and advantages of using CAI at longer wavelengths (visible, infrared) and at higher resolutions, and also considers the benefits of adaptive CAI techniques. The latter enable CAI to achieve reconfigurable modes of imaging, as well as improving system performance in other ways, such as enhanced image quality. It is shown that adaptive CAI has several potential advantages over more traditional optical systems for some applications in these wavebands. The merits include low mass, volume and moments of inertia, potentially lower costs, graceful failure modes, steerable fields of regard with no macroscopic moving parts and inherently encrypted data streams. Among the challenges associated with this new imaging approach are the effects of diffraction, interference, photon absorption at the mask and the low scene contrasts in the infrared wavebands. The paper analyzes some of these and presents the results of some of the tradeoffs in optical performance, using radiometric calculations to illustrate the consequences in a mid-infrared application. A CAI system requires a decoding algorithm in order to form an image and the paper discusses novel approaches, tailored to longer wavelength operation. The paper concludes by presenting initial experimental results.

Slinger, Chris; Eismann, Michael; Gordon, Neil; Lewis, Keith; McDonald, Gregor; McNie, Mark; Payne, Doug; Ridley, Kevin; Strens, Malcolm; De Villiers, Geoff; Wilson, Rebecca



An investigative study of multispectral data compression for remotely-sensed images using vector quantization and difference-mapped shift-coding  

NASA Technical Reports Server (NTRS)

A study is conducted to investigate the effects and advantages of data compression techniques on multispectral imagery data acquired by NASA's airborne scanners at the Stennis Space Center. The first technique used was vector quantization. The vector is defined in the multispectral imagery context as an array of pixels from the same location from each channel. The error obtained in substituting the reconstructed images for the original set is compared for different compression ratios. Also, the eigenvalues of the covariance matrix obtained from the reconstructed data set are compared with the eigenvalues of the original set. The effects of varying the size of the vector codebook on the quality of the compression and on subsequent classification are also presented. The output data from the Vector Quantization algorithm was further compressed by a lossless technique called Difference-mapped Shift-extended Huffman coding. The overall compression for 7 channels of data acquired by the Calibrated Airborne Multispectral Scanner (CAMS), with an RMS error of 15.8 pixels was 195:1 (0.41 bpp) and with an RMS error of 3.6 pixels was 18:1 (.447 bpp). The algorithms were implemented in software and interfaced with the help of dedicated image processing boards to an 80386 PC compatible computer. Modules were developed for the task of image compression and image analysis. Also, supporting software to perform image processing for visual display and interpretation of the compressed/classified images was developed.

Jaggi, S.



Permutation codes for sources.  

NASA Technical Reports Server (NTRS)

Source encoding techniques based on permutation codes are investigated. For a broad class of distortion measures it is shown that optimum encoding of a source permutation code is easy to instrument even for very long block lengths. Also, the nonparametric nature of permutation encoding is well suited to situations involving unknown source statistics. For the squared-error distortion measure a procedure for generating good permutation codes of a given rate and block length is described. The performance of such codes for a memoryless Gaussian source is compared both with the rate-distortion function bound and with the performance of various quantization schemes. The comparison reveals that permutation codes are asymptotically ideal for small rates and perform as well as the best entropy-coded quantizers presently known for intermediate rates. They can be made to compare favorably at high rates, too, provided the coding delay associated with extremely long block lengths is tolerable.

Berger, T.; Jelinek, F.; Wolf, J. K.



Comet assay in reconstructed 3D human epidermal skin models—investigation of intra- and inter-laboratory reproducibility with coded chemicals  

PubMed Central

Reconstructed 3D human epidermal skin models are being used increasingly for safety testing of chemicals. Based on EpiDerm™ tissues, an assay was developed in which the tissues were topically exposed to test chemicals for 3h followed by cell isolation and assessment of DNA damage using the comet assay. Inter-laboratory reproducibility of the 3D skin comet assay was initially demonstrated using two model genotoxic carcinogens, methyl methane sulfonate (MMS) and 4-nitroquinoline-n-oxide, and the results showed good concordance among three different laboratories and with in vivo data. In Phase 2 of the project, intra- and inter-laboratory reproducibility was investigated with five coded compounds with different genotoxicity liability tested at three different laboratories. For the genotoxic carcinogens MMS and N-ethyl-N-nitrosourea, all laboratories reported a dose-related and statistically significant increase (P < 0.05) in DNA damage in every experiment. For the genotoxic carcinogen, 2,4-diaminotoluene, the overall result from all laboratories showed a smaller, but significant genotoxic response (P < 0.05). For cyclohexanone (CHN) (non-genotoxic in vitro and in vivo, and non-carcinogenic), an increase compared to the solvent control acetone was observed only in one laboratory. However, the response was not dose related and CHN was judged negative overall, as was p-nitrophenol (p-NP) (genotoxic in vitro but not in vivo and non-carcinogenic), which was the only compound showing clear cytotoxic effects. For p-NP, significant DNA damage generally occurred only at doses that were substantially cytotoxic (>30% cell loss), and the overall response was comparable in all laboratories despite some differences in doses tested. The results of the collaborative study for the coded compounds were generally reproducible among the laboratories involved and intra-laboratory reproducibility was also good. These data indicate that the comet assay in EpiDerm™ skin models is a promising model for the safety assessment of compounds with a dermal route of exposure. PMID:24150594

Pfuhler, Stefan



Synthesizing Certified Code  

NASA Technical Reports Server (NTRS)

Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding theses annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

Whalen, Michael; Schumann, Johann; Fischer, Bernd



Behavioral correlates for the Minnesota Multiphasic Personality Inventory 4-9, 9-4 code types: A case of the emperor's new clothes?  

Microsoft Academic Search

Attempted to demonstrate behavioral correlates of 2 related MMPI code types (4-9, 9-4) with 2 large independent samples of inpatients from state psychiatric facilities (N = 2,869). Sample size permitted subanalyses of the effects of sex, race, and separate code type on results. While earlier investigations have typically found that this code type is given by sociopaths who are irritable,

Malcolm D. Gynther; Harold Altman; Robert W. Warbin



Extension of the supercritical carbon dioxide brayton cycle to low reactor power operation: investigations using the coupled anl plant dynamics code-SAS4A/SASSYS-1 liquid metal reactor code system.  

SciTech Connect

Significant progress has been made on the development of a control strategy for the supercritical carbon dioxide (S-CO{sub 2}) Brayton cycle enabling removal of power from an autonomous load following Sodium-Cooled Fast Reactor (SFR) down to decay heat levels such that the S-CO{sub 2} cycle can be used to cool the reactor until decay heat can be removed by the normal shutdown heat removal system or a passive decay heat removal system such as Direct Reactor Auxiliary Cooling System (DRACS) loops with DRACS in-vessel heat exchangers. This capability of the new control strategy eliminates the need for use of a separate shutdown heat removal system which might also use supercritical CO{sub 2}. It has been found that this capability can be achieved by introducing a new control mechanism involving shaft speed control for the common shaft joining the turbine and two compressors following reduction of the load demand from the electrical grid to zero. Following disconnection of the generator from the electrical grid, heat is removed from the intermediate sodium circuit through the sodium-to-CO{sub 2} heat exchanger, the turbine solely drives the two compressors, and heat is rejected from the cycle through the CO{sub 2}-to-water cooler. To investigate the effectiveness of shaft speed control, calculations are carried out using the coupled Plant Dynamics Code-SAS4A/SASSYS-1 code for a linear load reduction transient for a 1000 MWt metallic-fueled SFR with autonomous load following. No deliberate motion of control rods or adjustment of sodium pump speeds is assumed to take place. It is assumed that the S-CO{sub 2} turbomachinery shaft speed linearly decreases from 100 to 20% nominal following reduction of grid load to zero. The reactor power is calculated to autonomously decrease down to 3% nominal providing a lengthy window in time for the switchover to the normal shutdown heat removal system or for a passive decay heat removal system to become effective. However, the calculations reveal that the compressor conditions are calculated to approach surge such that the need for a surge control system for each compressor is identified. Thus, it is demonstrated that the S-CO{sub 2} cycle can operate in the initial decay heat removal mode even with autonomous reactor control. Because external power is not needed to drive the compressors, the results show that the S-CO{sub 2} cycle can be used for initial decay heat removal for a lengthy interval in time in the absence of any off-site electrical power. The turbine provides sufficient power to drive the compressors. Combined with autonomous reactor control, this represents a significant safety advantage of the S-CO{sub 2} cycle by maintaining removal of the reactor power until the core decay heat falls to levels well below those for which the passive decay heat removal system is designed. The new control strategy is an alternative to a split-shaft layout involving separate power and compressor turbines which had previously been identified as a promising approach enabling heat removal from a SFR at low power levels. The current results indicate that the split-shaft configuration does not provide any significant benefits for the S-CO{sub 2} cycle over the current single-shaft layout with shaft speed control. It has been demonstrated that when connected to the grid the single-shaft cycle can effectively follow the load over the entire range. No compressor speed variation is needed while power is delivered to the grid. When the system is disconnected from the grid, the shaft speed can be changed as effectively as it would be with the split-shaft arrangement. In the split-shaft configuration, zero generator power means disconnection of the power turbine, such that the resulting system will be almost identical to the single-shaft arrangement. Without this advantage of the split-shaft configuration, the economic benefits of the single-shaft arrangement, provided by just one turbine and lower losses at the design point, are more important to the overall cycle performance. Therefore, the single-shaft

Moisseytsev, A.; Sienicki, J. J. (Nuclear Engineering Division)



Independent component analysis of temporal sequences forms place cells  

Microsoft Academic Search

It has been suggested that sensory information processing makes use of a factorial code. It has been shown that the major components of the hippocampal-entorhinal loop can be derived by conjecturing that the task of this loop is forming and encoding independent components (ICs), one type of factorial codes. However, continuously changing environment poses additional requirements on the coding that

András Lörincz; Gábor Szirtes; Bálint Takács; György Buzsáki



INVESTIGATION Independent FLC Mutations as Causes  

E-print Network

thaliana and Capsella rubella Ya-Long Guo,*, Marco Todesco,* Jörg Hagmann,* Sandip Das,*,1 and Detlef, Chinese Academy of Sciences, Beijing 100093, China ABSTRACT Capsella rubella is an inbreeding annual forb different conditions in 20 C. rubella accessions from across the species' range. Similar to A. thaliana

Weigel, Detlef


How do we code the letters of a word when we have to write it? Investigating double letter representation in French.  


How do we code the letters of a word when we have to write it? We examined whether the orthographic representations that the writing system activates have a specific coding for letters when these are doubled in a word. French participants wrote words on a digitizer. The word pairs shared the initial letters and differed on the presence of a double letter (e.g., LISSER/LISTER). The results on latencies, letter and inter-letter interval durations revealed that L and I are slower to write when followed by a doublet (SS) than when not (ST). Doublet processing constitutes a supplementary cognitive load that delays word production. This suggests that word representations code letter identity and quantity separately. The data also revealed that the central processes that are involved in spelling representation cascade into the peripheral processes that regulate movement execution. PMID:24486807

Kandel, Sonia; Peereman, Ronald; Ghimenton, Anna



Application of a multi-block CFD code to investigate the impact of geometry modeling on centrifugal compressor flow field predictions  

Microsoft Academic Search

CFD codes capable of utilizing multi-block grids provide the capability to analyze the complete geometry of centrifugal compressors. Attendant with this increased capability is potentially increased grid setup time and more computational overhead with the resultant increase in wall clock time to obtain a solution. If the increase in difficulty of obtaining a solution significantly improves the solution from that

M. D. Hathaway; J. R. Wood



Phonological coding during reading.  


The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. PMID:25150679

Leinenger, Mallorie



Independent Component Analysis Segmentation Algorithm  

Microsoft Academic Search

In this paper we propose and investigate a new segmentation algorithm called the ICA (independent component analysis) segmentation algorithm and compare it against other existing overlapping strokes segmentation algorithms. The ICA segmentation algorithm converts the original touching or overlapping word components into a blind source matrix and then calculates the weighted value matrix before the values are re-evaluated using a

Yan Chen; Graham Leedham



Coordinated design of coding and modulation systems  

NASA Technical Reports Server (NTRS)

The joint optimization of the coding and modulation systems employed in telemetry systems was investigated. Emphasis was placed on formulating inner and outer coding standards used by the Goddard Spaceflight Center. Convolutional codes were found that are nearly optimum for use with Viterbi decoding in the inner coding of concatenated coding systems. A convolutional code, the unit-memory code, was discovered and is ideal for inner system usage because of its byte-oriented structure. Simulations of sequential decoding on the deep-space channel were carried out to compare directly various convolutional codes that are proposed for use in deep-space systems.

Massey, J. L.; Ancheta, T.; Johannesson, R.; Lauer, G.; Lee, L.



Procedure Codes for SEER-Medicare Analyses

The tables below contain codes for procedures that are frequently included in SEER-Medicare analyses. Please note that NCI provides these codes to assist researchers in analyses. Codes may change or may not be complete. NCI does not accept responsibility for the completeness or currency of the information below. Investigators should check that all relevant codes are included in their analysis.


Sharing code  

PubMed Central

Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing. PMID:25165519

Kubilius, Jonas



GABBA Codes: Generalized Full-Rate Orthogonally Decodable Space-Time Block Codes  

Microsoft Academic Search

Constellation independent, systematically con- structed, orthogonally decodable, full-rate and full-diversity space time block codes (STBCs) generalized to any number of transmit antennas are considered. The proposed codes generalize the ABBA STBC, also known as quasi-orthogonal STBC (QO- STBC) and are referred to as generalized ABBA (GABBA) codes. The construction of GABBA codes is systematic in which the encoding matrix for

G. T. Freitas de Abreu; Freitas de Abreu



CONTAIN independent peer review  

SciTech Connect

The CONTAIN code was developed by Sandia National Laboratories under the sponsorship of the US Nuclear Regulatory Commission (NRC) to provide integrated analyses of containment phenomena. It is used to predict nuclear reactor containment loads, radiological source terms, and associated physical phenomena for a range of accident conditions encompassing both design-basis and severe accidents. The code`s targeted applications include support for containment-related experimental programs, light water and advanced light water reactor plant analysis, and analytical support for resolution of specific technical issues such as direct containment heating. The NRC decided that a broad technical review of the code should be performed by technical experts to determine its overall technical adequacy. For this purpose, a six-member CONTAIN Peer Review Committee was organized and a peer review as conducted. While the review was in progress, the NRC issued a draft ``Revised Severe Accident Code Strategy`` that incorporated revised design objectives and targeted applications for the CONTAIN code. The committee continued its effort to develop findings relative to the original NRC statement of design objectives and targeted applications. However, the revised CONTAIN design objectives and targeted applications. However, the revised CONTAIN design objectives and targeted applications were considered by the Committee in assigning priorities to the Committee`s recommendations. The Committee determined some improvements are warranted and provided recommendations in five code-related areas: (1) documentation, (2) user guidance, (3) modeling capability, (4) code assessment, and (5) technical assessment.

Boyack, B.E. [Los Alamos National Lab., NM (United States); Corradini, M.L. [Univ. of Wisconsin, Madison, WI (United States). Nuclear Engineering Dept.; Denning, R.S. [Battelle Memorial Inst., Columbus, OH (United States); Khatib-Rahbar, M. [Energy Research Inc., Rockville, MD (United States); Loyalka, S.K. [Univ. of Missouri, Columbia, MO (United States); Smith, P.N. [AEA Technology, Dorchester (United Kingdom). Winfrith Technology Center



Using the local gyrokinetic code, GS2, to investigate global ITG modes in tokamaks. (I) s-${\\alpha}$ model with profile and flow shear effects  

E-print Network

This paper combines results from a local gyrokinetic code with analytical theory to reconstruct the global eigenmode structure of the linearly unstable ion-temperature-gradient (ITG) mode with adiabatic electrons. The simulations presented here employ the s-${\\alpha}$ tokamak equilibrium model. Local gyrokinetic calculations, using GS2 have been performed over a range of radial surfaces, x, and for ballooning phase angle, p, in the range -${\\pi} {\\leq} p {\\leq\\pi}$, to map out the complex local mode frequency, ${\\Omega_0(x, p) = \\omega_0(x, p) + i\\gamma_0(x, p)}$. Assuming a quadratic radial profile for the drive, namely ${\\eta_i = L_n/L_T}$, (holding constant all other equilibrium profiles such as safety factor, magnetic shear etc.), ${\\Omega_0(x, p)}$ has a stationary point. The reconstructed global mode then sits on the outboard mid plane of the tokamak plasma, and is known as a conventional or isolated mode, with global growth rate, ${\\gamma}$ ~ Max[${\\gamma_0(x, p)}$], where ${\\gamma_0(x, p)}$ is the loc...

Abdoul, P A; Roach, C M; Wilson, H R



Nevada Nuclear Waste Storage Investigations Project: Unit evaluation at Yucca Mountain, Nevada Test Site: Near-field thermal and mechanical calculations using the SANDIA-ADINA code  

SciTech Connect

Presented in this report are the results of a comparative study of two candidate horizons, the welded, devitrified Topopah Spring Member ofthe Paintbrush Tuff, and the nonwelded, zeolitized Tuffaceous Beds of Calico Hills. The mechanical and thermomechanical response these two horizons was assessed by conducting thermal and thermomechanical calculations using a two-dimensional room and pillar geometry of the vertical waste emplacement option using average and limit properties for each. A modified version of the computer code ADINA (SANDIA-ADINA) containing a material model for rock masses with ubiquitous jointing was used in the calculations. Results of the calculations are presented as the units` capacity for storage of nuclear waste and stability of the emplacement room and pillar due to excavation and long-term heating. A comparison is made with a similar underground opening geometry sited in Grouse Canyon Tuff, using properties obtained from G-Tunnel - a horizon of known excavation characteristics. Long-term stability of the excavated rooms was predicted for all units, as determined by evaluating regions of predicted joint slip as the result of excavation and subsequent thermal loading, evaluating regions of predicted rock matrix failure as the result of excavation and subsequent thermal loading, and evaluating safety factors against rock matrix failure. These results were derived through considering a wide range in material properties and in situ stresses. 21 refs., 21 figs., 5 tabs.

Johnson, R.L.; Bauer, S.J.



The Independence of Reduced Subgroup-State  

NASA Astrophysics Data System (ADS)

Quantum hidden problem being one of the most important quantum computation problems has been widely investigated. Our purpose in this paper is to prove the independent or partial independent of the reduced state derived from the quantum query with the oracle implementation. We prove that if without bias on implementation functions the subgroup state is independent of evaluation functions using the group representation. This result is also used to improve the quantum query success probability.

Luo, Ming-Xing; Deng, Yun



Code constructions and code families for nonbinary quantum stabilizer code  

E-print Network

. . . . . . . . . . . . 6 II CONNECTION BETWEEN CLASSICAL SELF-ORTHOGONAL CODES AND QUANTUM STABILIZER CODES : : : : : : : : 9 A. Error Basis . . . . . . . . . . . . . . . . . . . . . . . . . . 10 B. From Quantum Stabilizer Codes to Classical Codes . . . . 12 1. Connecting... codes) [4, 9, 11]. BCH codes form an extremely important class of error correcting codes. Lemma 16. Let ! be a primitive nth root of unity over Fq and let g(X) be a monic polynomial over Fq of smallest degree that has the 1 numbers !b;!b+1;:::;!b+ 2...

Ketkar, Avanti Ulhas



Polarization independent microphotonic circuits  

E-print Network

Microphotonic circuits have been proposed for applications ranging from optical switching and routing to optical logic circuits. However many applications require microphotonic circuits to be polarization independent, a ...

Watts, Michael Robert, 1974-



Vector coding for partial response channels  

Microsoft Academic Search

A linear technique for combining equalization and coset codes on partial response channels with additive white Gaussian noise is developed. The technique, vector coding, uses a set of transmit filters or `vectors' to partition the channel into an independent set of parallel intersymbol interference (ISI)-free channels for any given finite (or infinite) block length. The optimal transmit vectors for such

Sanjay Kasturia; James T. Aslanis; John M. Cioffi



Profile guided code positioning  

Microsoft Academic Search

This paper presents the results of our investigation of code positioning techniques using execution profile data as input into the compilation process. The primary objective of the positioning is to reduce the overhead of the instruction memory hierarchy.After initial investigation in the literature, we decided to implement two prototypes for the Hewlett-Packard Precision Architecture (PA-RISC). The first, built on top

Karl Pettis; Robert C. Hansen; Jack W. Davidson



Nature's Code  

NASA Astrophysics Data System (ADS)

We propose that the mathematical structures related to the `universal rewrite system' define a universal process applicable to Nature, which we may describe as `Nature's code'. We draw attention here to such concepts as 4 basic units, 64- and 20-unit structures, symmetry-breaking and 5-fold symmetry, chirality, double 3-dimensionality, the double helix, the Van der Waals force and the harmonic oscillator mechanism, and our explanation of how they necessarily lead to self-aggregation, complexity and emergence in higher-order systems. Biological concepts, such as translation, transcription, replication, the genetic code and the grouping of amino acids appear to be driven by fundamental processes of this kind, and it would seem that the Platonic solids, pentagonal symmetry and Fibonacci numbers have significant roles in organizing `Nature's code'.

Hill, Vanessa J.; Rowlands, Peter



Free Code  

NSDL National Science Digital Library

Free Code, a service of Andover.Net, is a large index of Internet-related software tool source code. The tools are written in C/C++, Perl, Java, or Visual Basic, and are free for personal and commercial use. They range from handy Perl CGI scripts to Java-based graphics packages. Each tool in the index is briefly described, characterized by language and operating system, and linked to both the home page for the tool and the source code. The total lack of documentation for the search engine makes useful queries hard to create, but the tools are still easy-to-find. This is a very useful index for anyone building Internet or Web-based applications.


Bursts in m-metric array codes  

Microsoft Academic Search

Fire [P. Fire, A class of multiple-error-correcting binary codes for non-independent errors, Sylvania Reports RSL-E-2, Sylvania Reconnaissance Systems, Mountain View, California, 1959] introduced the concept of bursts for classical codes where codes are subsets\\/subspaces of the space Fqn, the space of all n-tuples with entries from a finite field Fq. In this paper, we introduce the notion of bursts for

Sapna Jain



Linear independence over tropical semirings and beyond  

Microsoft Academic Search

We investigate different notions of linear independence and of matrix rank that are relevant for max-plus or tropical semirings. The factor rank and tropical rank have already received attention, we compare them with the ranks defined in terms of signed tropical determinants or arising from a notion of linear independence introduced by Gondran and Minoux. To do this, we revisit

Marianne Akian; Stephane Gaubert; Alexander Guterman



Investigation of plant control strategies for the supercritical C0Brayton cycle for a sodium-cooled fast reactor using the plant dynamics code  

Microsoft Academic Search

The development of a control strategy for the supercritical CO (S-CO) Brayton cycle has been extended to the investigation of alternate control strategies for a Sodium-Cooled Fast Reactor (SFR) nuclear power plant incorporating a S-CO Brayton cycle power converter. The SFR assumed is the 400 MWe (1000 MWt) ABR-1000 preconceptual design incorporating metallic fuel. Three alternative idealized schemes for controlling

A. Moisseytsev; J. Sienicki



American Independence. Fifth Grade.  

ERIC Educational Resources Information Center

This fifth grade teaching unit covers early conflicts between the American colonies and Britain, battles of the American Revolutionary War, and the Declaration of Independence. Knowledge goals address the pre-revolutionary acts enforced by the British, the concepts of conflict and independence, and the major events and significant people from the…

Crosby, Annette


A preliminary investigation of Large Eddy Simulation (LES) of the flow around a cylinder at ReD = 3900 using a commercial CFD code  

SciTech Connect

Engineering fluid mechanics simulations at high Reynolds numbers have traditionally been performed using the Reynolds-Averaged Navier Stokes (RANS) equations and a turbulence model. The RANS methodology has well-documented shortcomings in the modeling of separated or bluff body wake flows that are characterized by unsteady vortex shedding. The resulting turbulence statistics are strongly influenced by the detailed structure and dynamics of the large eddies, which are poorly captured using RANS models (Rodi 1997; Krishnan et al. 2004). The Large Eddy Simulation (LES) methodology offers the potential to more accurately simulate these flows as it resolves the large-scale unsteady motions and entails modeling of only the smallest-scale turbulence structures. Commercial computational fluid dynamics products are beginning to offer LES capability, allowing practicing engineers an opportunity to apply this turbulence modeling technique to much wider array of problems than in dedicated research codes. Here, we present a preliminary evaluation of the LES capability in the commercial CFD solver StarCD by simulating the flow around a cylinder at a Reynolds number based on the cylinder diameter, D, of 3900 using the constant coefficient Smagorinsky LES model. The results are compared to both the experimental and computational results provided in Kravchenko & Moin (2000). We find that StarCD provides predictions of lift and drag coefficients that are within 15% of the experimental values. Reasonable agreement is obtained between the time-averaged velocity statistics and the published data. The differences in these metrics may be due to the use of a truncated domain in the spanwise direction and the short time-averaging period used for the statistics presented here. The instantaneous flow field visualizations show a coarser, larger-scale structure than the study of Kravchenko & Moin (2000), which may be a product of the LES implementation or of the domain and resolution used. Based on this preliminary study, we conclude that StarCD's LES implementation may useful for low Reynolds number LES computations if proper care is used in the problem and mesh definition.

Paschkewitz, J S



Implementation issues in source coding  

NASA Technical Reports Server (NTRS)

An edge preserving image coding scheme which can be operated in both a lossy and a lossless manner was developed. The technique is an extension of the lossless encoding algorithm developed for the Mars observer spectral data. It can also be viewed as a modification of the DPCM algorithm. A packet video simulator was also developed from an existing modified packet network simulator. The coding scheme for this system is a modification of the mixture block coding (MBC) scheme described in the last report. Coding algorithms for packet video were also investigated.

Sayood, Khalid; Chen, Yun-Chung; Hadenfeldt, A. C.



QR Codes  

ERIC Educational Resources Information Center

This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…

Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien



Seals Code Development Workshop  

NASA Technical Reports Server (NTRS)

Seals Workshop of 1995 industrial code (INDSEAL) release include ICYL, GCYLT, IFACE, GFACE, SPIRALG, SPIRALI, DYSEAL, and KTK. The scientific code (SCISEAL) release includes conjugate heat transfer and multidomain with rotordynamic capability. Several seals and bearings codes (e.g., HYDROFLEX, HYDROTRAN, HYDROB3D, FLOWCON1, FLOWCON2) are presented and results compared. Current computational and experimental emphasis includes multiple connected cavity flows with goals of reducing parasitic losses and gas ingestion. Labyrinth seals continue to play a significant role in sealing with face, honeycomb, and new sealing concepts under investigation for advanced engine concepts in view of strict environmental constraints. The clean sheet approach to engine design is advocated with program directions and anticipated percentage SFC reductions cited. Future activities center on engine applications with coupled seal/power/secondary flow streams.

Hendricks, Robert C. (Compiler); Liang, Anita D. (Compiler)



Investigating the Influence of the Added Mass Effect to Marine Hydrokinetic Horizontal-Axis Turbines Using a General Dynamic Wake Wind Turbine Code: Preprint  

SciTech Connect

This paper describes a recent study to investigate the applicability of a horizontal-axis wind turbine (HAWT) structural dynamics and unsteady aerodynamics analysis program (FAST and AeroDyn respectively) to modeling the forces on marine hydrokinetic (MHK) turbines. It summarizes the added mass model that has been added to AeroDyn. The added mass model only includes flow acceleration perpendicular to the rotor disc, and ignores added mass forces caused by blade deflection. A model of the National Renewable Energy Laboratory's (NREL) Unsteady Aerodynamics Experiment (UAE) Phase VI wind turbine was analyzed using FAST and AeroDyn with sea water conditions and the new added mass model. The results of this analysis exhibited a 3.6% change in thrust for a rapid pitch case and a slight change in amplitude and phase of thrust for a case with 30 degrees of yaw.

Maniaci, D. C.; Li, Y.



Investigating the Influence of the Added Mass Effect to Marine Hydrokinetic Horizontal-Axis Turbines Using a General Dynamic Wake Wind Turbine Code  

SciTech Connect

This paper describes a recent study to investigate the applicability of a horizontal-axis wind turbine (HAWT) structural dynamics and unsteady aerodynamics analysis program (FAST and AeroDyn respectively) to modeling the forces on marine hydrokinetic (MHK) turbines. This paper summarizes the added mass model that has been added to AeroDyn. The added mass model only includes flow acceleration perpendicular to the rotor disc, and ignores added mass forces caused by blade deflection. A model of the National Renewable Energy Laboratory's (NREL) Unsteady Aerodynamics Experiment (UAE) Phase VI wind turbine was analyzed using FAST and AeroDyn with sea water conditions and the new added mass model. The results of this analysis exhibited a 3.6% change in thrust for a rapid pitch case and a slight change in amplitude and phase of thrust for a case with 30{sup o} of yaw.

Maniaci, D. C.; Li, Y.



The role of coding in the choice between routing and coding for wireless unicast  

E-print Network

coding, for example, using backpressure routing, or using some centralized flow scheduler that is aware the throughput of flooding to backpressure via simulations for a layered network assuming independent losses

Boyer, Edmond


On automatic differentiation of codes with COMPLEX arithmetic with respect to real variables  

SciTech Connect

We explore what it means to apply automatic differentiation with respect to a set of real variables to codes containing complex arithmetic. That is, both dependent and independent variables with respect to differentiation are real variables, but in order to exploit features of complex mathematics, part of the code is expressed by employing complex arithmetic. We investigate how one can apply automatic differentiation to complex variables if one exploits the homomorphism of the complex numbers C onto R{sup 2}. It turns out that, by and large, the usual rules of differentiation apply, but subtle differences in special cases arise for sqrt (), abs (), and the power operator.

Pusch, G.D.; Bischof, C. [Argonne National Lab., IL (United States); Carle, A. [Rice Univ., St. Houston, TX (United States)



Pulsed Inductive Thruster (PIT): Modeling and Validation Using the MACH2 Code  

NASA Technical Reports Server (NTRS)

Numerical modeling of the Pulsed Inductive Thruster exercising the magnetohydrodynamics code, MACH2 aims to provide bilateral validation of the thruster's measured performance and the code's capability of capturing the pertinent physical processes. Computed impulse values for helium and argon propellants demonstrate excellent correlation to the experimental data for a range of energy levels and propellant-mass values. The effects of the vacuum tank wall and massinjection scheme were investigated to show trivial changes in the overall performance. An idealized model for these energy levels and propellants deduces that the energy expended to the internal energy modes and plasma dissipation processes is independent of the propellant type, mass, and energy level.

Schneider, Steven (Technical Monitor); Mikellides, Pavlos G.



Independent GIS Synthesis Project  

NSDL National Science Digital Library

Brian Hynek, University of Colorado Summary A capstone project consisting of independent research and communication of scientific results. Context Type and level of course capstone project for an entry-level GIS ...

Hynek, Brian


Media independent interface  

NASA Technical Reports Server (NTRS)

The work done on the Media Independent Interface (MII) Interface Control Document (ICD) program is described and recommendations based on it were made. Explanations and rationale for the content of the ICD itself are presented.



Compound Independent Events  

NSDL National Science Digital Library

Compare the theoretical and experimental probabilities of compound independent events by drawing colored marbles from a bag. Record the results of successive draws with or without replacement of marbles to calculate the experimental probability.



Code division multiple access using Hermitean codes  

Microsoft Academic Search

Reed-Solomon (RS) codes present some desirable properties that make them useful in the generation of hopping sequences, for frequency hopping code division multiple access (FH CDMA). The algebraic geometric codes include the RS codes as a special case, therefore it is natural to propose the former as a candidate to FH CDMA. In this article, a description of such codes,

Francisco M. Assis; M. S. Alencar



Error-correcting two-dimensional modulation codes  

Microsoft Academic Search

Modulation coding, to limit the number of consecutive zeros in a data stream, is essential in digital magnetic recording\\/playback systems. Additionally, such systems require error-correction coding to ensure that the decoded output matches the recorder input, even if noise is present. Typically, these two coding steps have been performed independently, although various methods of combining them into one step have

Wayne H. Erxleben; Michael W. Marcellin



On generalized low-density parity-check codes based on Hamming component codes  

Microsoft Academic Search

In this paper we investigate a generalization of Gallager's (1963) low-density (LD) parity-check codes, where as component codes single error correcting Hamming codes are used instead of single error detecting parity-check codes. It is proved that there exist such generalized low-density (GLD) codes for which the minimum distance is growing linearly with the block length, and a lower bound of

M. Lentmaier; K. Sh. Zigangirov



Code-Copying in Irano-Turkic.  

ERIC Educational Resources Information Center

Different types of Persian impact on Irano-Turkic language varieties are surveyed and classified according to the Code-Copying model, which implies that copies of elements from foreign codes are inserted, globally or selectively, into a basic code that provides the morphosyntactic frame for the insertion. The study investigates under what…

Johanson, Lars



Approaches to Network Coding for Multiple Unicasts  

Microsoft Academic Search

In this paper, we survey the application of linear network coding to a multiple unicasts scenario in directed graphs. We discuss related work concerning the complexity of the construction of capacity-achieving linear network codes. We briefly review the algebraic formulation of the problem and in the remainder of the paper, we investigate two approaches to construct network codes. One is

Niranjan Ratnakar; Danail Traskov; Ralf Koetter



Thyroid Coding Guidelines

Coding Guidelines THYROID GLAND C739 Coding Hormone Therapy Code Hormone Therapy as 01 for follicular and/or papillary thyroid cancer when thyroid hormone therapy is given. Do not code replacement therapy as treatment unless the tumor is papillary


Lymphoma Coding Guidelines

Coding Guidelines LYMPHOMA M9590/3-M9738/3 See the Hematopoietic and Lymphoid Neoplasm Case Reportability and Coding Manual and the Hematopoietic Database (DB) for more information and coding instructions. First Course of Therapy Do not code


Coding for Satellite Communication  

Microsoft Academic Search

This paper discusses a number of coding techniques for future satellite communication; they include Reed-Solomon error decoding for message blocks, probabilistic decoding techniques for punctured convolutional codes, and planar Euclidean geometry difference set codes for random multiple access applications. The provision of code concatenation, helical interleaving, and simulation results of new punctured convolutional codes are included. A number of coded

William W. Wu; David Haccoun; Robert Peile; Yasuo Hirata



Investigation of compound-independent calibration and partial molecular formula determination by gas chromatography-atomic-emission detection for characterisation of organophosphorus and organosulfur agents related to the chemical weapons convention.  


Atomic-emission detection (AED) is a technique particularly-well suited to screening complex samples for multiple compounds containing heteroatoms such as phosphorus, sulfur, or nitrogen, which are especially relevant in verification of chemical disarmament. Among other GC detectors, AED has unique characteristics such as compound-independent calibration and possible raw-formula determination. Because contradictory results have been reported on these points, we set up a study with the objectives not only of applying these techniques to chemical weapons convention-related chemicals but of determining under which conditions they would yield satisfactory results. The extensive data collected in this study are evidence that the response of the detector, particularly for the phosphorus line, is very dependent on the molecular mass and concentration of the chemicals analysed whereas molecular structure seems to have less effect on the AED signal. Most interestingly, compound-independent calibration and subsequent partial molecular formula determination usually seem satisfactory when the reference compounds used to calibrate the system have GC retention times and molecular masses close to those of the unknown analytes (whose molecular mass may be determined by GC-CI-MS). We therefore suggest the use of a reference set of compounds covering a large chromatographic window, which enables the selection, within this set, of the most appropriate reference compound for calibration and for determination of the raw formula of an unknown analyte. For optimal performance, the use of a new discharge tube is also recommended. PMID:16240110

Juillet, Yannick; Gibert, Edmond; Bégos, Arlette; Bellier, Bruno



Kernel Independent Component Analysis  

Microsoft Academic Search

Abstract We present a class of algorithms for Independent Component Analysis (ICA) which use contrast functions based on canonical correlations in a reproducing kernel Hilbert space. On the one hand, we show that our contrast functions are related to mutual information and have desirable mathematical properties as measures of statistical de- pendence. On the other hand, building on recent developments

Francis R. Bach; Michael I. Jordan



Touchstones of Independence.  

ERIC Educational Resources Information Center

Foundations affiliated with public higher education institutions can avoid having to open records for public scrutiny, by having independent boards of directors, occupying leased office space or paying market value for university space, using only foundation personnel, retaining legal counsel, being forthcoming with information and use of public…

Roha, Thomas Arden



Postcard from Independence, Mo.  

ERIC Educational Resources Information Center

This article reports results showing that the Independence, Missori school district failed to meet almost every one of its improvement goals under the No Child Left Behind Act. The state accreditation system stresses improvement over past scores, while the federal law demands specified amounts of annual progress toward the ultimate goal of 100…

Archer, Jeff



Independence of Velocity  

NSDL National Science Digital Library

This inquiry activity should be completed before discussing with students that a projectile's motion in the vertical direction is independent of its motion in the horizontal direction. As long as students use their apparatus carefully and don't flip coins

Michael Horton



Caring about Independent Lives  

ERIC Educational Resources Information Center

With the rhetoric of independence, new cash for care systems were introduced in many developed welfare states at the end of the 20th century. These systems allow local authorities to pay people who are eligible for community care services directly, to enable them to employ their own careworkers. Despite the obvious importance of the careworker's…

Christensen, Karen



IEAB Independent Analysis Board  

E-print Network

­Effectiveness of Improved Irrigation Efficiency and Water Transactions for Instream Flow for Fish1 Independent Economic for their support and helpful comments. #12;IEAB: Irrigation Efficiency and Water Transactions December 2011 1 Table...................... 11 2.1.4 Aquifers and the Surface Water ­ Groundwater Link


Independent Video in Britain.  

ERIC Educational Resources Information Center

Maintaining the status quo as well as the attitude toward cultural funding and development that it imposes on video are detrimental to the formation of a thriving video network, and also out of key with the present social and political situation in Britain. Independent video has some quite specific advantages as a medium for cultural production…

Stewart, David



Microsoft Academic Search

Speech coding has been and still is a major issue in the area of digital speech processing in which speech compression is needed for storing digital voice and it requires fixed amount of available memory and compression makes it possible to store longer messages. Several techniques of speech coding such as Linear Predictive Coding (LPC), Waveform Coding and Sub band



Concatenated Quantum Codes  

E-print Network

One of the main problems for the future of practical quantum computing is to stabilize the computation against unwanted interactions with the environment and imperfections in the applied operations. Existing proposals for quantum memories and quantum channels require gates with asymptotically zero error to store or transmit an input quantum state for arbitrarily long times or distances with fixed error. In this report a method is given which has the property that to store or transmit a qubit with maximum error $\\epsilon$ requires gates with error at most $c\\epsilon$ and storage or channel elements with error at most $\\epsilon$, independent of how long we wish to store the state or how far we wish to transmit it. The method relies on using concatenated quantum codes with hierarchically implemented recovery operations. The overhead of the method is polynomial in the time of storage or the distance of the transmission. Rigorous and heuristic lower bounds for the constant $c$ are given.

Emanuel Knill; Raymond Laflamme



Concatenated quantum codes  

SciTech Connect

One main problem for the future of practial quantum computing is to stabilize the computation against unwanted interactions with the environment and imperfections in the applied operations. Existing proposals for quantum memories and quantum channels require gates with asymptotically zero error to store or transmit an input quantum state for arbitrarily long times or distances with fixed error. This report gives a method which has the property that to store or transmit a qubit with maximum error {epsilon} requires gates with errors at most {ital c}{epsilon} and storage or channel elements with error at most {epsilon}, independent of how long we wish to store the state or how far we wish to transmit it. The method relies on using concatenated quantum codes and hierarchically implemented recovery operations. The overhead of the method is polynomial in the time of storage or the distance of the transmission. Rigorous and heuristic lower bounds for the constant {ital c} are given.

Knill, E.; Laflamme, R.



Neuronal Adaptation Translates Stimulus Gaps into a Population Code  

PubMed Central

Neurons in sensory pathways exhibit a vast multitude of adaptation behaviors, which are assumed to aid the encoding of temporal stimulus features and provide the basis for a population code in higher brain areas. Here we study the transition to a population code for auditory gap stimuli both in neurophysiological recordings and in a computational network model. Independent component analysis (ICA) of experimental data from the inferior colliculus of Mongolian gerbils reveals that the network encodes different gap sizes primarily with its population firing rate within 30 ms after the presentation of the gap, where longer gap size evokes higher network activity. We then developed a computational model to investigate possible mechanisms of how to generate the population code for gaps. Phenomenological (ICA) and functional (discrimination performance) analyses of our simulated networks show that the experimentally observed patterns may result from heterogeneous adaptation, where adaptation provides gap detection at the single neuron level and neuronal heterogeneity ensures discriminable population codes for the whole range of gap sizes in the input. Furthermore, our work suggests that network recurrence additionally enhances the network's ability to provide discriminable population patterns. PMID:24759970

Yuan, Chun-Wei; Khouri, Leila; Grothe, Benedikt; Leibold, Christian



An introduction to QR Codes: linking libraries and mobile patrons.  


QR codes, or "Quick Response" codes, are two-dimensional barcodes that can be scanned by mobile smartphone cameras. These codes can be used to provide fast access to URLs, telephone numbers, and short passages of text. With the rapid adoption of smartphones, librarians are able to use QR codes to promote services and help library users find materials quickly and independently. This article will explain what QR codes are, discuss how they can be used in the library, and describe issues surrounding their use. A list of resources for generating and scanning QR codes is also provided. PMID:21800986

Hoy, Matthew B



Minimizing correlation effect using zero cross correlation code in spectral amplitude coding optical code division multiple access  

NASA Astrophysics Data System (ADS)

The use of minimal multiple access interference (MAI) in code design is investigated. Applying a projection and mapping techniques, a code that has a zero cross correlation (ZCC) between users in optical code division multiple access (OCDMA) is presented in this paper. The system is based on an incoherent light source—LED, spectral amplitude coding (SAC), and direct detection techniques at the receiver. Using power spectral density (PSD) function and Gaussian approximation, we obtain the signal-to-noise ratio (SNR) and the bit-error rate (BER) to measure the code performance. Making a comparison with other existing codes, e.g., Hadamard, MFH and MDW codes, we show that our code performs better at BER 10-9 in terms of number of simultaneous users. We also demonstrate the comparison between the theoretical and simulation analyses, where the results are close to one another.

Safar, Anuar Mat; Aljunid, Syed Alwee; Arief, Amir Razif; Nordin, Junita; Saad, Naufal



Neural representation of objects in space: a dual coding account.  

PubMed Central

I present evidence on the nature of object coding in the brain and discuss the implications of this coding for models of visual selective attention. Neuropsychological studies of task-based constraints on: (i) visual neglect; and (ii) reading and counting, reveal the existence of parallel forms of spatial representation for objects: within-object representations, where elements are coded as parts of objects, and between-object representations, where elements are coded as independent objects. Aside from these spatial codes for objects, however, the coding of visual space is limited. We are extremely poor at remembering small spatial displacements across eye movements, indicating (at best) impoverished coding of spatial position per se. Also, effects of element separation on spatial extinction can be eliminated by filling the space with an occluding object, indicating that spatial effects on visual selection are moderated by object coding. Overall, there are separate limits on visual processing reflecting: (i) the competition to code parts within objects; (ii) the small number of independent objects that can be coded in parallel; and (iii) task-based selection of whether within- or between-object codes determine behaviour. Between-object coding may be linked to the dorsal visual system while parallel coding of parts within objects takes place in the ventral system, although there may additionally be some dorsal involvement either when attention must be shifted within objects or when explicit spatial coding of parts is necessary for object identification. PMID:9770227

Humphreys, G W



Utilizing sequence intrinsic composition to classify protein-coding and long non-coding transcripts.  


It is a challenge to classify protein-coding or non-coding transcripts, especially those re-constructed from high-throughput sequencing data of poorly annotated species. This study developed and evaluated a powerful signature tool, Coding-Non-Coding Index (CNCI), by profiling adjoining nucleotide triplets to effectively distinguish protein-coding and non-coding sequences independent of known annotations. CNCI is effective for classifying incomplete transcripts and sense-antisense pairs. The implementation of CNCI offered highly accurate classification of transcripts assembled from whole-transcriptome sequencing data in a cross-species manner, that demonstrated gene evolutionary divergence between vertebrates, and invertebrates, or between plants, and provided a long non-coding RNA catalog of orangutan. CNCI software is available at PMID:23892401

Sun, Liang; Luo, Haitao; Bu, Dechao; Zhao, Guoguang; Yu, Kuntao; Zhang, Changhai; Liu, Yuanning; Chen, Runsheng; Zhao, Yi



Independent Lens: Interactive  

NSDL National Science Digital Library

Over the past few years, Independent Lens has produced a number of well-received documentaries that have aired on PBS and other places. They have also created some very nice websites in an attempt to enhance the viewing experience of their programs. The Independent Lens: Interactive site offers some additional web-original projects for the interested public. Some of these features include Beyond the Fire, which introduces visitors to the stories of fifteen teenagers living in the US, who have survived war in seven different regions. One very compelling highlight of the site is the Off the Map feature. Here visitors can learn about the visionary art produced by a selection of persons working in various media, such as bottle caps, matchsticks, and chewing gum. For those looking for something with a unique perspective on the world and its inhabitants, this website will definitely bring a smile to their eyes.



Agent independent task planning  

NASA Technical Reports Server (NTRS)

Agent-Independent Planning is a technique that allows the construction of activity plans without regard to the agent that will perform them. Once generated, a plan is then validated and translated into instructions for a particular agent, whether a robot, crewmember, or software-based control system. Because Space Station Freedom (SSF) is planned for orbital operations for approximately thirty years, it will almost certainly experience numerous enhancements and upgrades, including upgrades in robotic manipulators. Agent-Independent Planning provides the capability to construct plans for SSF operations, independent of specific robotic systems, by combining techniques of object oriented modeling, nonlinear planning and temporal logic. Since a plan is validated using the physical and functional models of a particular agent, new robotic systems can be developed and integrated with existing operations in a robust manner. This technique also provides the capability to generate plans for crewmembers with varying skill levels, and later apply these same plans to more sophisticated robotic manipulators made available by evolutions in technology.

Davis, William S.



Coding for Cooperative Communications  

E-print Network

methods using Raptor codes, which performs within 1.1 dB of the performance limit. Finally, we consider a CRC and develop a practical multi-level dirty-paper coding strategy using LDPC codes for channel coding and trellis-coded quantization for source... . . . . . . . . . . . . . . . . . . . . 48 viii CHAPTER Page 5. CF coding using simplified SWCNSQ . . . . . . . . . 52 D. Practical CF Code Design . . . . . . . . . . . . . . . . . . 53 1. Encoding at the source . . . . . . . . . . . . . . . . . 55 2. DJSCC at the relay...

Uppal, Momin Ayub



Homological stabilizer codes  

SciTech Connect

In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. - Highlights: Black-Right-Pointing-Pointer We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. Black-Right-Pointing-Pointer We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. Black-Right-Pointing-Pointer We find and classify all 2D homological stabilizer codes. Black-Right-Pointing-Pointer We find optimal codes among the homological stabilizer codes.

Anderson, Jonas T., E-mail:



On Joint Source and Channel Coding Using Trellis Coded CPM: Analytical Bounds on the Channel Distortion  

Microsoft Academic Search

Joint source and channel coding (JSCC) using trellis coded quantization (TCQ) in conjunction with trellis coded continuous phase modulation (CPM) is studied. The channel is assumed to be the additive white gaussian noise (AWGN) channel. Analytical bounds on the channel distortion for the investigated systems with maximum-likelihood sequence detection (MLSD) are developed. The bounds are based on the transfer function

Zihuai Lin; Tor Aulin



Characterizing History Independent Data Structures  

E-print Network

Characterizing History Independent Data Structures Jason D. Hartline 1 , Edwin S. Hong 1 history independent data structures as proposed for study by Teague and Naor [2]. In a history independent is available from the abstract data structure. We show that for the most part, strong history independent data

Bustamante, Fabián E.


Characterizing History Independent Data Structures  

E-print Network

Characterizing History Independent Data Structures Jason D. Hartline1 , Edwin S. Hong1 , Alexander history independent data structures as proposed for study by Teague and Naor [2]. In a history independent is available from the abstract data structure. We show that for the most part, strong history independent data

Bustamante, Fabián E.


Model Children's Code.  

ERIC Educational Resources Information Center

The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…

New Mexico Univ., Albuquerque. American Indian Law Center.


The Cosmic Code Comparison Project  

E-print Network

Current and upcoming cosmological observations allow us to probe structures on smaller and smaller scales, entering highly nonlinear regimes. In order to obtain theoretical predictions in these regimes, large cosmological simulations have to be carried out. The promised high accuracy from observations make the simulation task very demanding: the simulations have to be at least as accurate as the observations. This requirement can only be fulfilled by carrying out an extensive code validation program. The first step of such a program is the comparison of different cosmology codes including gravitation interactions only. In this paper we extend a recently carried out code comparison project to include five more simulation codes. We restrict our analysis to a small cosmological volume which allows us to investigate properties of halos. For the matter power spectrum and the mass function, the previous results hold, with the codes agreeing at the 10% level over wide dynamic ranges. We extend our analysis to the comparison of halo profiles and investigate the halo count as a function of local density. We introduce and discuss ParaView as a flexible analysis tool for cosmological simulations, the use of which immensely simplifies the code comparison task.

Katrin Heitmann; Zarija Lukic; Patricia Fasel; Salman Habib; Michael S. Warren; Martin White; James Ahrens; Lee Ankeny; Ryan Armstrong; Brian O'Shea; Paul M. Ricker; Volker Springel; Joachim Stadel; Hy Trac



Consistent Comparison of the Codes RELAP5/PARCS and TRAC-M/PARCS for the OECD MSLB Coupled Code Benchmark  

SciTech Connect

A generalized interface module was developed for coupling any thermal-hydraulic code to any spatial kinetic code. In the design used here the thermal-hydraulic and spatial kinetic codes function as independent processes and communicate using the Parallel Virtual Machine software. This approach helps maximize flexibility while minimizing modifications to the respective codes. Using this interface, the U.S. Nuclear Regulatory Commission (NRC) three-dimensional neutron kinetic code, Purdue Advanced Reactor Core Simulator (PARCS), has been coupled to the NRC system analysis codes RELAP5 and Modernized Transient Reactor Analysis Code (TRAC-M). Consistent comparison of code results for the Organization for Economic Cooperation and Development/Nuclear Energy Agency main steam line break benchmark problem using RELAP5/PARCS and TRAC-M/PARCS was made to assess code performance.

Kozlowski, Tomasz [Purdue University (United States); Miller, R. Matthew [Purdue University (United States); Downar, Thomas J. [Purdue University (United States); Barber, Douglas A. [Information Systems Laboratories (United States); Joo, Han Gyu [Korea Atomic Energy Research Institute (Korea, Republic of)



Independent Lens: Butte, America  

NSDL National Science Digital Library

Butte, Montana was a hard rock mining town that supplied the United States with much-needed copper, due to the electrification of the nation. The documentary created by Independent Lens of PBS shows the hardship the miners and their families encountered. The Independent Lens website has a multitude of interactive features that adds depth and increased understanding to the film. To find when and on what PBS station the film is playing, visitors can click the link "Check Local Listings". Under the "The Film" tab, three clips of the film are available, and under "The Making of " tab, visitors can find details the difficulties of the film crew in filming the underground mining tunnels. The filmmaker also addresses the challenges of working in 16mm film, and the painful decisions of what scenes to cut. "Related Links" can also be found at the bottom of "The Film" link and provides links to several articles on the town of Butte, as well as to the filmmaker's website.


Certifying Auto-Generated Flight Code  

NASA Technical Reports Server (NTRS)

Model-based design and automated code generation are being used increasingly at NASA. Many NASA projects now use MathWorks Simulink and Real-Time Workshop for at least some of their modeling and code development. However, there are substantial obstacles to more widespread adoption of code generators in safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. Moreover, the regeneration of code can require complete recertification, which offsets many of the advantages of using a generator. Indeed, manual review of autocode can be more challenging than for hand-written code. Since the direct V&V of code generators is too laborious and complicated due to their complex (and often proprietary) nature, we have developed a generator plug-in to support the certification of the auto-generated code. Specifically, the AutoCert tool supports certification by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews. The generated documentation also contains substantial tracing information, allowing users to trace between model, code, documentation, and V&V artifacts. This enables missions to obtain assurance about the safety and reliability of the code without excessive manual V&V effort and, as a consequence, eases the acceptance of code generators in safety-critical contexts. The generation of explicit certificates and textual reports is particularly well-suited to supporting independent V&V. The primary contribution of this approach is the combination of human-friendly documentation with formal analysis. The key technical idea is to exploit the idiomatic nature of auto-generated code in order to automatically infer logical annotations. The annotation inference algorithm itself is generic, and parametrized with respect to a library of coding patterns that depend on the safety policies and the code generator. The patterns characterize the notions of definitions and uses that are specific to the given safety property. For example, for initialization safety, definitions correspond to variable initializations while uses are statements which read a variable, whereas for array bounds safety, definitions are the array declarations, while uses are statements which access an array variable. The inferred annotations are thus highly dependent on the actual program and the properties being proven. The annotations, themselves, need not be trusted, but are crucial to obtain the automatic formal verification of the safety properties without requiring access to the internals of the code generator. The approach has been applied to both in-house and commercial code generators, but is independent of the particular generator used. It is currently being adapted to flight code generated using MathWorks Real-Time Workshop, an automatic code generator that translates from Simulink/Stateflow models into embedded C code.

Denney, Ewen



Generalized Concatenated Quantum Codes  

E-print Network

We introduce the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematical way for constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of new single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length, but also asymptotically achieve the quantum Hamming bound for large block length.

Markus Grassl; Peter Shor; Graeme Smith; John Smolin; Bei Zeng



Codeword Stabilized Quantum Codes  

E-print Network

We present a unifying approach to quantum error correcting code design that encompasses additive (stabilizer) codes, as well as all known examples of nonadditive codes with good parameters. We use this framework to generate new codes with superior parameters to any previously known. In particular, we find ((10,18,3)) and ((10,20,3)) codes. We also show how to construct encoding circuits for all codes within our framework.

Andrew Cross; Graeme Smith; John A. Smolin; Bei Zeng



Melanism in Peromyscus Is Caused by Independent Mutations in Agouti  

PubMed Central

Identifying the molecular basis of phenotypes that have evolved independently can provide insight into the ways genetic and developmental constraints influence the maintenance of phenotypic diversity. Melanic (darkly pigmented) phenotypes in mammals provide a potent system in which to study the genetic basis of naturally occurring mutant phenotypes because melanism occurs in many mammals, and the mammalian pigmentation pathway is well understood. Spontaneous alleles of a few key pigmentation loci are known to cause melanism in domestic or laboratory populations of mammals, but in natural populations, mutations at one gene, the melanocortin-1 receptor (Mc1r), have been implicated in the vast majority of cases, possibly due to its minimal pleiotropic effects. To investigate whether mutations in this or other genes cause melanism in the wild, we investigated the genetic basis of melanism in the rodent genus Peromyscus, in which melanic mice have been reported in several populations. We focused on two genes known to cause melanism in other taxa, Mc1r and its antagonist, the agouti signaling protein (Agouti). While variation in the Mc1r coding region does not correlate with melanism in any population, in a New Hampshire population, we find that a 125-kb deletion, which includes the upstream regulatory region and exons 1 and 2 of Agouti, results in a loss of Agouti expression and is perfectly associated with melanic color. In a second population from Alaska, we find that a premature stop codon in exon 3 of Agouti is associated with a similar melanic phenotype. These results show that melanism has evolved independently in these populations through mutations in the same gene, and suggest that melanism produced by mutations in genes other than Mc1r may be more common than previously thought. PMID:19649329

Kingsley, Evan P.; Manceau, Marie; Wiley, Christopher D.; Hoekstra, Hopi E.



Investigation of the Performance of Various CVD Diamond Crystal Qualities for the Measurement of Radiation Doses from a Low Energy Mammography X-Ray Beam, Compared with MC Code (PENELOPE) Calculations  

NASA Astrophysics Data System (ADS)

The tissue equivalence of diamond allows for accurate radiation dose determination without large corrections for different attenuation values in biological tissue, but its low Z value limits this advantage however to the lower energy photons such as for example in Mammography X-ray beams. This paper assays the performance of nine Chemical Vapour Deposition (CVD) diamonds for use as radiation sensing material. The specimens fabricated in wafer form are classified as detector grade, optical grade and single crystals. It is well known that the presence of defects in diamonds, including CVD specimens, not only dictates but also affects the responds of diamond to radiation in different ways. In this investigation, tools such as electron spin resonance (ESR), thermoluminescence (TL) Raman spectroscopy and ultra violet (UV) spectroscopy were used to probe each of the samples. The linearity, sensitivity and other characteristics of the detector to photon interaction was analyzed, and from the I-V characteristics. The diamonds categorized into four each, of the so called Detector and Optical grades, and a single crystal CVD were exposed to low X-ray peak voltage range (22 to 27 KVp) with a trans-crystal polarizing fields of 0.4, 0.66 and 0.8 The presentation discusses the presence of defects identifiable by the techniques used and correlates the radiation performance of the three types of crystals to their presence. The choice of a wafer as either a spectrometer or as X-ray dosimeter within the selected energy range was made. The analyses was validated with Monte-Carlo code (PENELOPE)

Zakari, Y. I.; Mavunda, R. D.; Nam, T. L.; Keddy, R. J.


Frame independent cosmological perturbations  

SciTech Connect

We compute the third order gauge invariant action for scalar-graviton interactions in the Jordan frame. We demonstrate that the gauge invariant action for scalar and tensor perturbations on one physical hypersurface only differs from that on another physical hypersurface via terms proportional to the equation of motion and boundary terms, such that the evolution of non-Gaussianity may be called unique. Moreover, we demonstrate that the gauge invariant curvature perturbation and graviton on uniform field hypersurfaces in the Jordan frame are equal to their counterparts in the Einstein frame. These frame independent perturbations are therefore particularly useful in relating results in different frames at the perturbative level. On the other hand, the field perturbation and graviton on uniform curvature hypersurfaces in the Jordan and Einstein frame are non-linearly related, as are their corresponding actions and n-point functions.

Prokopec, Tomislav; Weenink, Jan, E-mail:, E-mail: [Institute for Theoretical Physics and Spinoza Institute, Utrecht University, Leuvenlaan 4, 3585 CE Utrecht (Netherlands)



The Comparative Performance of Conditional Independence Indices  

ERIC Educational Resources Information Center

To realize the benefits of item response theory (IRT), one must have model-data fit. One facet of a model-data fit investigation involves assessing the tenability of the conditional item independence (CII) assumption. In this Monte Carlo study, the comparative performance of 10 indices for identifying conditional item dependence is assessed. The…

Kim, Doyoung; De Ayala, R. J.; Ferdous, Abdullah A.; Nering, Michael L.



Animal Behavior: An Independent Research Project  

NSDL National Science Digital Library

This is an independent research project intended for second semester high school biology students. It could easily be modified for any age life science students. The purpose of the project is to allow students to conduct their own animal behavior research investigation, from beginning to end. The process models the way animal research is conducted by research scientists.

Ms. Jeannie Wenndorf (Lindbergh High School)



A planar metamaterial: Polarization independent fishnet structure  

E-print Network

A planar metamaterial: Polarization independent fishnet structure Kamil Boratay Alici a,b,*, Ekmel and experimentally investigate a planar metamaterial that is composed of connected cut-wire pairs and continuous, shorted cut-wire pairs, composite metamaterial, and shorted composite metamaterial. # 2008 Elsevier B

Ozbay, Ekmel


Accumulate repeat accumulate codes  

NASA Technical Reports Server (NTRS)

In this paper we propose an innovative channel coding scheme called 'Accumulate Repeat Accumulate codes' (ARA). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, thus belief propagation can be used for iterative decoding of ARA codes on a graph. The structure of encoder for this class can be viewed as precoded Repeat Accumulate (RA) code or as precoded Irregular Repeat Accumulate (IRA) code, where simply an accumulator is chosen as a precoder. Thus ARA codes have simple, and very fast encoder structure when they representing LDPC codes. Based on density evolution for LDPC codes through some examples for ARA codes, we show that for maximum variable node degree 5 a minimum bit SNR as low as 0.08 dB from channel capacity for rate 1/2 can be achieved as the block size goes to infinity. Thus based on fixed low maximum variable node degree, its threshold outperforms not only the RA and IRA codes but also the best known LDPC codes with the dame maximum node degree. Furthermore by puncturing the accumulators any desired high rate codes close to code rate 1 can be obtained with thresholds that stay close to the channel capacity thresholds uniformly. Iterative decoding simulation results are provided. The ARA codes also have projected graph or protograph representation that allows for high speed decoder implementation.

Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung



An interactive morse code emulation management system  

Microsoft Academic Search

Assistive technology (AT) is becoming increasingly important in improving mobility, language, and learning capabilities of persons who have disabilities enabling them to function independently and to improve their social opportunities. Morse code has been shown to be a valuable tool in Assistive technology, augmentative and alternative communication, rehabilitation, and education, as well as adapted computer access methods via special software

Cheng-Hong Yang



Effect of Color Coding on Cognitive Style.  

ERIC Educational Resources Information Center

The purpose of this study was to examine the effect that coding (black and white or color) has on the achievement of students categorized as field dependent (FD) and field independent (FI) learners and to determine if there was any interaction between these variables (field dependency and color) across both visually and verbally oriented tests…

Dwyer, Francis M.; Moore, David M.


Transionospheric Propagation Code (TIPC)  

SciTech Connect

The Transionospheric Propagation Code is a computer program developed at Los Alamos National Lab to perform certain tasks related to the detection of vhf signals following propagation through the ionosphere. The code is written in Fortran 77, runs interactively and was designed to be as machine independent as possible. A menu format in which the user is prompted to supply appropriate parameters for a given task has been adopted for the input while the output is primarily in the form of graphics. The user has the option of selecting from five basic tasks, namely transionospheric propagation, signal filtering, signal processing, DTOA study, and DTOA uncertainty study. For the first task a specified signal is convolved against the impulse response function of the ionosphere to obtain the transionospheric signal. The user is given a choice of four analytic forms for the input pulse or of supplying a tabular form. The option of adding Gaussian-distributed white noise of spectral noise to the input signal is also provided. The deterministic ionosphere is characterized to first order in terms of a total electron content (TEC) along the propagation path. In addition, a scattering model parameterized in terms of a frequency coherence bandwidth is also available. In the second task, detection is simulated by convolving a given filter response against the transionospheric signal. The user is given a choice of a wideband filter or a narrowband Gaussian filter. It is also possible to input a filter response. The third task provides for quadrature detection, envelope detection, and three different techniques for time-tagging the arrival of the transionospheric signal at specified receivers. The latter algorithms can be used to determine a TEC and thus take out the effects of the ionosphere to first order. Task four allows the user to construct a table of delta-times-of-arrival (DTOAs) vs TECs for a specified pair of receivers.

Roussel-Dupre, R.; Kelley, T.A.



Ideology Among Independent Voter Groups  

E-print Network

by leaning. I hypothesize that independent leaners tend to have a stronger political ideology which causes them to lean back toward a party. Through analysis of the data from the 2004 Annenberg National Election Survey, I conclude that an independent’s level...

Berry, Meagan



Reusable State Machine Code Generator  

NASA Astrophysics Data System (ADS)

The State Machine model is frequently used to represent the behaviour of a system, allowing one to express and execute this behaviour in a deterministic way. A graphical representation such as a UML State Chart diagram tames the complexity of the system, thus facilitating changes to the model and communication between developers and domain experts. We present a reusable state machine code generator, developed by the Universidad Técnica Federico Santa María and the European Southern Observatory. The generator itself is based on the open source project architecture, and uses UML State Chart models as input. This allows for a modular design and a clean separation between generator and generated code. The generated state machine code has well-defined interfaces that are independent of the implementation artefacts such as the middle-ware. This allows using the generator in the substantially different observatory software of the Atacama Large Millimeter Array and the ESO Very Large Telescope. A project-specific mapping layer for event and transition notification connects the state machine code to its environment, which can be the Common Software of these projects, or any other project. This approach even allows to automatically create tests for a generated state machine, using techniques from software testing, such as path-coverage.

Hoffstadt, A. A.; Reyes, C.; Sommer, H.; Andolfato, L.



Coset Codes Viewed as Terminated Convolutional Codes  

NASA Technical Reports Server (NTRS)

In this paper, coset codes are considered as terminated convolutional codes. Based on this approach, three new general results are presented. First, it is shown that the iterative squaring construction can equivalently be defined from a convolutional code whose trellis terminates. This convolutional code determines a simple encoder for the coset code considered, and the state and branch labelings of the associated trellis diagram become straightforward. Also, from the generator matrix of the code in its convolutional code form, much information about the trade-off between the state connectivity and complexity at each section, and the parallel structure of the trellis, is directly available. Based on this generator matrix, it is shown that the parallel branches in the trellis diagram of the convolutional code represent the same coset code C(sub 1), of smaller dimension and shorter length. Utilizing this fact, a two-stage optimum trellis decoding method is devised. The first stage decodes C(sub 1), while the second stage decodes the associated convolutional code, using the branch metrics delivered by stage 1. Finally, a bidirectional decoding of each received block starting at both ends is presented. If about the same number of computations is required, this approach remains very attractive from a practical point of view as it roughly doubles the decoding speed. This fact is particularly interesting whenever the second half of the trellis is the mirror image of the first half, since the same decoder can be implemented for both parts.

Fossorier, Marc P. C.; Lin, Shu



Concatenated Coding Using Trellis-Coded Modulation  

NASA Technical Reports Server (NTRS)

In the late seventies and early eighties a technique known as Trellis Coded Modulation (TCM) was developed for providing spectrally efficient error correction coding. Instead of adding redundant information in the form of parity bits, redundancy is added at the modulation stage thereby increasing bandwidth efficiency. A digital communications system can be designed to use bandwidth-efficient multilevel/phase modulation such as Amplitude Shift Keying (ASK), Phase Shift Keying (PSK), Differential Phase Shift Keying (DPSK) or Quadrature Amplitude Modulation (QAM). Performance gain can be achieved by increasing the number of signals over the corresponding uncoded system to compensate for the redundancy introduced by the code. A considerable amount of research and development has been devoted toward developing good TCM codes for severely bandlimited applications. More recently, the use of TCM for satellite and deep space communications applications has received increased attention. This report describes the general approach of using a concatenated coding scheme that features TCM and RS coding. Results have indicated that substantial (6-10 dB) performance gains can be achieved with this approach with comparatively little bandwidth expansion. Since all of the bandwidth expansion is due to the RS code we see that TCM based concatenated coding results in roughly 10-50% bandwidth expansion compared to 70-150% expansion for similar concatenated scheme which use convolution code. We stress that combined coding and modulation optimization is important for achieving performance gains while maintaining spectral efficiency.

Thompson, Michael W.



Bit-Wise Arithmetic Coding For Compression Of Data  

NASA Technical Reports Server (NTRS)

Bit-wise arithmetic coding is data-compression scheme intended especially for use with uniformly quantized data from source with Gaussian, Laplacian, or similar probability distribution function. Code words of fixed length, and bits treated as being independent. Scheme serves as means of progressive transmission or of overcoming buffer-overflow or rate constraint limitations sometimes arising when data compression used.

Kiely, Aaron



The Syntax and Psycholinguistics of Bilingual Code Mixing.  

ERIC Educational Resources Information Center

This paper challenges the characterization of bilingual behavior derived from the code-switching model, and especially the notion of linguistic independence on which psychological studies of bilingualism have focused almost exclusively. While linguists have concentrated on the situational determinants of code-switching, psychologists have focused…

Sridhar, S. N.; Sridhar, Kamal K.



Interval coding. II. Dendrite-dependent mechanisms.  


The rich temporal structure of neural spike trains provides multiple dimensions to code dynamic stimuli. Popular examples are spike trains from sensory cells where bursts and isolated spikes can serve distinct coding roles. In contrast to analyses of neural coding, the cellular mechanics of burst mechanisms are typically elucidated from the neural response to static input. Bridging the mechanics of bursting with coding of dynamic stimuli is an important step in establishing theories of neural coding. Electrosensory lateral line lobe (ELL) pyramidal neurons respond to static inputs with a complex dendrite-dependent burst mechanism. Here we show that in response to dynamic broadband stimuli, these bursts lack some of the electrophysiological characteristics observed in response to static inputs. A simple leaky integrate-and-fire (LIF)-style model with a dendrite-dependent depolarizing afterpotential (DAP) is sufficient to match both the output statistics and coding performance of experimental spike trains. We use this model to investigate a simplification of interval coding where the burst interspike interval (ISI) codes for the scale of a canonical upstroke rather than a multidimensional stimulus feature. Using this stimulus reduction, we compute a quantization of the burst ISIs and the upstroke scale to show that the mutual information rate of the interval code is maximized at a moderate DAP amplitude. The combination of a reduced description of ELL pyramidal cell bursting and a simplification of the interval code increases the generality of ELL burst codes to other sensory modalities. PMID:17409177

Doiron, Brent; Oswald, Anne-Marie M; Maler, Leonard



Independent Lens Strange Fruit  

NSDL National Science Digital Library

The accompanying website for the Independent Lens film "Strange Fruit", about the famous protest song, allows visitors to hear a clip, or the entire song, of a famous rendition sung Billie Holiday. Strange Fruit is a phrase that actually comes from a poem that was turned into a song, and the song became the most renowned protest song of the 1940s. Visitors unfamiliar with the song will find that the link, "The Film", on the homepage gives an informative several paragraph synopsis and history. It also explains the unusual turns the life of the poet/songwriter took. Visitors should not miss the "Protest Music Overview" link, which provides clips of other protest songs. These protest songs are grouped by time period and the topic of protest for the period. Visitors should start at the beginning with 1776 and slavery, and then just wander through the centuries of music. Some of the clips featured within the different time periods include "Fight The Power" by Public Enemy, "Ohio" by Neil Young, and "We Shall Overcome" sung by Mahalia Jackson.


Linear independence over tropical semirings and beyond  

E-print Network

We investigate different notions of linear independence and of matrix rank that are relevant for max-plus or tropical semirings. The factor rank and tropical rank have already received attention, we compare them with the ranks defined in terms of signed tropical determinants or arising from a notion of linear independence introduced by Gondran and Minoux. To do this, we revisit the symmetrization of the max-plus algebra, establishing properties of linear spaces, linear systems, and matrices over the symmetrized max-plus algebra. In parallel we develop some general technique to prove combinatorial and polynomial identities for matrices over semirings that we illustrate by a number of examples.

Akian, Marianne; Guterman, Alexander



Error-correcting two-dimensional modulation codes  

Microsoft Academic Search

Digital magnetic recording\\/playback systems usually require both runlength-limited (RLL) coding and error correction coding (ECC), and these two steps have typically been performed independently, although various methods of combining them have recently appeared. The recent development of two-dimensional modulation codes, which meet runlength constraints using several parallel recording tracks, has significantly increased the capacity of such channels. In this paper,

Wayne H. Erxleben; Michael W. Marcellin



Transparent self-healing communication networks via diversity coding  

Microsoft Academic Search

The authors present an error control based approach, called diversity coding, to provide nearly instantaneous self-healing digital communication networks. This is achieved by constructing an error-correcting code across logically independent channels and by treating link failures within the framework of an erasure channel model. Diversity coding is more efficient than the existing approaches to self-healing communication networks since it is

Chih-Lin I; Ender Ayanoglu; R. D. Gitlin; J. E. Mazo



Discussion on LDPC Codes and Uplink Coding  

NASA Technical Reports Server (NTRS)

This slide presentation reviews the progress that the workgroup on Low-Density Parity-Check (LDPC) for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio



ARA type protograph codes  

NASA Technical Reports Server (NTRS)

An apparatus and method for encoding low-density parity check codes. Together with a repeater, an interleaver and an accumulator, the apparatus comprises a precoder, thus forming accumulate-repeat-accumulate (ARA codes). Protographs representing various types of ARA codes, including AR3A, AR4A and ARJA codes, are described. High performance is obtained when compared to the performance of current repeat-accumulate (RA) or irregular-repeat-accumulate (IRA) codes.

Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)



QR Codes 101  

ERIC Educational Resources Information Center

A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark



Source Code Plagiarism--A Student Perspective  

ERIC Educational Resources Information Center

This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…

Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.



New construction of multiwavelength optical orthogonal codes  

Microsoft Academic Search

We investigate multiwavelength optical orthogonal codes (MWOOCs) for optical code-division multiple access. Particularly, we present a new construction method for (mn,?+2,?) MWOOCs with the number of available wavelengths m, codeword length n, and constant Hamming weight ?+2 that have autocorrelation and cross-correlation values not exceeding ?. In the proposed scheme, there is no constraint on the relationship between the number

Ssang-Soo Lee; Seung-Woo Seo



Overlapping codes within protein-coding sequences  

PubMed Central

Genomes encode multiple signals, raising the question of how these different codes are organized along the linear genome sequence. Within protein-coding regions, the redundancy of the genetic code can, in principle, allow for the overlapping encoding of signals in addition to the amino acid sequence, but it is not known to what extent genomes exploit this potential and, if so, for what purpose. Here, we systematically explore whether protein-coding regions accommodate overlapping codes, by comparing the number of occurrences of each possible short sequence within the protein-coding regions of over 700 species from viruses to plants, to the same number in randomizations that preserve amino acid sequence and codon bias. We find that coding regions across all phyla encode additional information, with bacteria carrying more information than eukaryotes. The detailed signals consist of both known and potentially novel codes, including position-dependent secondary RNA structure, bacteria-specific depletion of transcription and translation initiation signals, and eukaryote-specific enrichment of microRNA target sites. Our results suggest that genomes may have evolved to encode extensive overlapping information within protein-coding regions. PMID:20841429

Itzkovitz, Shalev; Hodis, Eran; Segal, Eran



Bit-wise arithmetic coding for data compression  

NASA Technical Reports Server (NTRS)

This article examines the problem of compressing a uniformly quantized independent and identically distributed (IID) source. We present a new compression technique, bit-wise arithmetic coding, that assigns fixed-length codewords to the quantizer output and uses arithmetic coding to compress the codewords, treating the codeword bits as independent. We examine the performance of this method and evaluate the overhead required when used block-adaptively. Simulation results are presented for Gaussian and Laplacian sources. This new technique could be used as the entropy coder in a transform or subband coding system.

Kiely, A. B.



Energy aware network coding in wireless networks  

E-print Network

Energy is one of the most important considerations in designing reliable low-power wireless communication networks. We focus on the problem of energy aware network coding. In particular, we investigate practical energy ...

Shi, Xiaomeng, Ph. D. Massachusetts Institute of Technology



A distributed code for color in natural scenes derived from center-surround filtered cone signals  

PubMed Central

In the retina of trichromatic primates, chromatic information is encoded in an opponent fashion and transmitted to the lateral geniculate nucleus (LGN) and visual cortex via parallel pathways. Chromatic selectivities of neurons in the LGN form two separate clusters, corresponding to two classes of cone opponency. In the visual cortex, however, the chromatic selectivities are more distributed, which is in accordance with a population code for color. Previous studies of cone signals in natural scenes typically found opponent codes with chromatic selectivities corresponding to two directions in color space. Here we investigated how the non-linear spatio-chromatic filtering in the retina influences the encoding of color signals. Cone signals were derived from hyper-spectral images of natural scenes and preprocessed by center-surround filtering and rectification, resulting in parallel ON and OFF channels. Independent Component Analysis (ICA) on these signals yielded a highly sparse code with basis functions that showed spatio-chromatic selectivities. In contrast to previous analyses of linear transformations of cone signals, chromatic selectivities were not restricted to two main chromatic axes, but were more continuously distributed in color space, similar to the population code of color in the early visual cortex. Our results indicate that spatio-chromatic processing in the retina leads to a more distributed and more efficient code for natural scenes. PMID:24098289

Kellner, Christian J.; Wachtler, Thomas



List of codes Language abbreviation codes  

E-print Network

EDU.3 Secondary school (general/vocational) EDU.4 Higher education institution EDU.5 AdultList of codes Language abbreviation codes DA Danish GR Greek DE German IT Italian EN English NL or continuing education provider ASS.1 Non-profit association (regional / national) ASS.2 Non-profit association


Click Coding Coding with Click Modular Router  

E-print Network

;Outline Coding Tools Writing custom elements The Click STL Packet Manipulation Timers and Tasks Handlers threshold, larger: drop packet Download the source code online to avoid copy errors at http The Click STL Packet Manipulation Timers and Tasks Handlers References Element header Necessary


QRishing: The Susceptibility of Smartphone Users to QR Code Phishing Attacks  

E-print Network

, Security, QR Code, Smartphone 1 Introduction A Quick Response code (QR code) is a two-dimensional matrix investigated the viability of QR- code-initiated phishing attacks, or QRishing, by conducting two exper- imentsQRishing: The Susceptibility of Smartphone Users to QR Code Phishing Attacks Timothy Vidas

Tague, Patrick


Growing Salt: An Independent Course Research Project Investigating Chemical Sediments  

NSDL National Science Digital Library

To prepare for this project, students read a journal article about the processes and products of chemical sedimentation and early diagenesis in saline pan environments (Lowenstein and Hardie, 1985). In class, students are given some handouts that tabluate various evaporite minerals and how water chemistry affects their formation and dissolution. A short slide show and video illustrate some different types of saline environments. Photos and samples guide a lecture on the formation of different types of evaporite minerals and how they form. For example, chevron halite crystals are generally large (cm-scale) and grow upward from the floor of a shallow (less than ~0.5 m) surface water body; cumulate halite crystals are smaller (typically mm-scale) and grow on the water-air interface and settle to the bottom, regardless of water depth. Randomly-oriented halite crystals can grow displacively from groundwater in mud or sand. The students learn that the specific sedimentology of halite can be used to trace past surface water depth and groundwater salinity. I also give examples of how past quantitative climate data, past chemical data and even past microbiologial data can be interpreted from evaporites. I emphasize how, in order to understand evaporites, one must think critically about sedimentology and geochemistry. The students are told, at the end of this lecture, that their next lab period will focus on designing and setting up a research project on growing salt. They are encouraged to start thinking about a research question they can pose about evaporite sedimentology. At this time, I also tell them what materials are available for their use (tap water, distilled water, seawater, various types of saline water I have collected during field trips, various types of store-bought table and road salt (including iodized, non-iodized, sea salt, etc.). A variety of table salts can be purchased cheaply (~$1 - $2/carton) at almost any grocery store. If you live in a cold climate, most grocery stores and hardware stores also sell several types of road salt (~$3-$4/bag). The table salts are mostly Na and Cl; some have lesser amopunts of Ca and SO4. Some road salts have Ca, Mg, Na, and Cl. In my experience, one carton and one bag of each type will provide more than enough salt for a class of 15 students. When it is time for lab to begin, I gather my students in my research lab (but could also be done in a classroom), where I show them the materials I have available to them: various types of salt, various types of water, and plastic, glass, and metal containers of various shapes (baby food glass jars, plastic take-out food containers, etc). My lab also contains a variety of other miscellaneous materials, such as sand, gravel, clay, morter and pestle, wooden sticks, metal stirring rods, string, plastic tubing, beakers, food coloring (shows fluid inclusion bands well and everyone loves playing with food coloring), etc. I remind the students that they have a microwave oven, a freezer, a lab hood, a windowsill with plenty of sunlight, and a heating vent that can be used, as well. I make available a few thermometers, pH strips (or pH meter), and a hand-held refractometer for measuring salinity. These analytical field instruments are not neccessary for this assignment to work. However, as instructor, I would encourage you to use anything available to you. I ask each student to tell me informally of their research question/hypothesis and then I try to help them find any materials they need for their experiments. Here are some examples of student research questions that have been tested with this assignment: (1) Does temperature of water affect rate of haite/gypsum growth?: (2) Will evaporite minerals grown from a complex saline fluid form a "bulls eye" pattern as their textbook claims?; (3) Will halite grow preferentially on glass substrates versus wooden and plastic substrates?; (4) Will evaporation of salt water make halite cement equally well in a gravel, a sand, a clay?; (5) What conditions best produce large halite crystals?; (6) Does pH of water influence halite and gypsum precipitation or dissolution? Students spend most of a lab period (2-3 hours) setting up their experiment. As part of this initial experimental set-up, they start to learn basic research skills such as labelling samples well, documenting starting conditions, and taking detailed notes. The students are allowed to leave their experiments on a windowsill in my lab or our classroom, on a radiator, in a lab fume hood, or in a lab refridgerator or freezer, depending upon the nature of the particular experiment. I encourage the students to check their samples on a daily basis and remind them to record their observations each time they check their experiment. I give the students an assignment sheet that details the final lab report requirements. Most students will have results in 2-3 weeks, but some experiments may last up to 4-5 weeks. For this reason, I plan for this lab assignment to be started in the middle of the semester (which works well if your syllabus, like mine, calls for weathering, physical sedimentology, siliciclastics, and carbonates to be covered in the first 6-8 weeks of class; evaporites follow well after carbonates). The final lab report is not due until the end of the semester so that all students have time to bring their expermient to completion, make interpretations, and write their lab report. At the end of the semester, depending on the number of students and time permitted, I ask the students to informally tell the class about their experiment and show the results. This has worked well for me. However, even in semesters in which we have not done this, the students still become familiar with each other's projects. On the initial experiment day, the students informally share their ideas. As students come to check on their own experiiments periodically, they usually look in on their classmates' experiments as well. Students tell me that this is one of their favorite lab exercises. It encourages critical thinking and shows the importance of experimentation in science. In addition, I feel as if the students leave my course knowing more about evaporites than the average geologist.

Benison, Kathy


The National Transport Code Collaboration Module Library  

NASA Astrophysics Data System (ADS)

This paper reports on the progress in developing a library of code modules under the auspices of the National Transport Code Collaboration (NTCC). Code modules are high quality, fully documented software packages with a clearly defined interface. The modules provide a variety of functions, such as implementing numerical physics models; performing ancillary functions such as I/O or graphics; or providing tools for dealing with common issues in scientific programming such as portability of Fortran codes. Researchers in the plasma community submit code modules, and a review procedure is followed to insure adherence to programming and documentation standards. The review process is designed to provide added confidence with regard to the use of the modules and to allow users and independent reviews to validate the claims of the modules' authors. All modules include source code; clear instructions for compilation of binaries on a variety of target architectures; and test cases with well-documented input and output. All the NTCC modules and ancillary information, such as current standards and documentation, are available from the NTCC Module Library Website The goal of the project is to develop a resource of value to builders of integrated modeling codes and to plasma physics researchers generally. Currently, there are more than 40 modules in the module library.

Kritz, A. H.; Bateman, G.; Kinsey, J.; Pankin, A.; Onjun, T.; Redd, A.; McCune, D.; Ludescher, C.; Pletzer, A.; Andre, R.; Zakharov, L.; Lodestro, L.; Pearlstein, L. D.; Jong, R.; Houlberg, W.; Strand, P.; Wiley, J.; Valanju, P.; John, H. St.; Waltz, R.; Mandrekas, J.; Mau, T. K.; Carlsson, J.; Braams, B.



A robust low-rate coding scheme for packet video  

NASA Technical Reports Server (NTRS)

Due to the rapidly evolving field of image processing and networking, video information promises to be an important part of telecommunication systems. Although up to now video transmission has been transported mainly over circuit-switched networks, it is likely that packet-switched networks will dominate the communication world in the near future. Asynchronous transfer mode (ATM) techniques in broadband-ISDN can provide a flexible, independent and high performance environment for video communication. For this paper, the network simulator was used only as a channel in this simulation. Mixture blocking coding with progressive transmission (MBCPT) has been investigated for use over packet networks and has been found to provide high compression rate with good visual performance, robustness to packet loss, tractable integration with network mechanics and simplicity in parallel implementation.

Chen, Y. C.; Sayood, Khalid; Nelson, D. J.; Arikan, E. (editor)



Coding with side information  

E-print Network

analogy between SWCQ and entropy coded quantization in classic source coding. Furthermore, a practical scheme of SWCQ using 1-D nested lattice quantization and LDPC is implemented. For GPC, since the actual design procedure relies on the more precise...

Cheng, Szeming



International Code Council  

NSDL National Science Digital Library

The International Code Council is �a membership association dedicated to building safety and fire prevention, develops the codes used to construct residential and commercial buildings, including homes and schools. Most U.S. cities, counties and states that adopt codes choose the International Codes developed by the International Code Council.� Although some sections of the site are reserved for members only (which requires a fee), there is a remarkable amount of material available to non-members. Available on the website are details about codes development, how to acquire an opinion on a code from multiple sources and how to reach a building code liaison for your locality. Under the �Certification and Testing� tab, users can find sample certification exam questions as well as outlines. The site also provides links to various periodicals, ICC meetings and also includes an event calendar to see dates for industry conferences and upcoming trade shows.


Cellulases and coding sequences  


The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

Li, Xin-Liang (Athens, GA); Ljungdahl, Lars G. (Athens, GA); Chen, Huizhong (Lawrenceville, GA)



Cellulases and coding sequences  


The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

Li, Xin-Liang (Athens, GA); Ljungdahl, Lars G. (Athens, GA); Chen, Huizhong (Lawrenceville, GA)



Honesty and Honor Codes.  

ERIC Educational Resources Information Center

Explores the rise in student cheating and evidence that students cheat less often at schools with an honor code. Discusses effective use of such codes and creation of a peer culture that condemns dishonesty. (EV)

McCabe, Donald; Trevino, Linda Klebe



P-code enhanced method for processing encrypted GPS signals without knowledge of the encryption code  

NASA Technical Reports Server (NTRS)

In the preferred embodiment, an encrypted GPS signal is down-converted from RF to baseband to generate two quadrature components for each RF signal (L1 and L2). Separately and independently for each RF signal and each quadrature component, the four down-converted signals are counter-rotated with a respective model phase, correlated with a respective model P code, and then successively summed and dumped over presum intervals substantially coincident with chips of the respective encryption code. Without knowledge of the encryption-code signs, the effect of encryption-code sign flips is then substantially reduced by selected combinations of the resulting presums between associated quadrature components for each RF signal, separately and independently for the L1 and L2 signals. The resulting combined presums are then summed and dumped over longer intervals and further processed to extract amplitude, phase and delay for each RF signal. Precision of the resulting phase and delay values is approximately four times better than that obtained from straight cross-correlation of L1 and L2. This improved method provides the following options: separate and independent tracking of the L1-Y and L2-Y channels; separate and independent measurement of amplitude, phase and delay L1-Y channel; and removal of the half-cycle ambiguity in L1-Y and L2-Y carrier phase.

Meehan, Thomas K. (Inventor); Thomas, Jr., Jess Brooks (Inventor); Young, Lawrence E. (Inventor)



Azerbaijani-Russian Code-Switching and Code-Mixing: Form, Function, and Identity  

ERIC Educational Resources Information Center

From incorporation into the Russian Empire in 1828, through the collapse of the U.S.S.R. in 1991 governmental language policies and other socio/political forces influenced the Turkic population of the Republic of Azerbaijan to speak Russian. Even with changes since independence Russian use--including various kinds of code-switching and…

Zuercher, Kenneth



Independent Learning Models: A Comparison.  

ERIC Educational Resources Information Center

Five models of independent learning are suitable for use in adult education programs. The common factor is a facilitator who works in some way with the student in the learning process. They display different characteristics, including the extent of independence in relation to content and/or process. Nondirective tutorial instruction and learning…

Wickett, R. E. Y.


Independent Study, an Annotated Bibliography.  

ERIC Educational Resources Information Center

This annotated bibliography on independent study lists 150 books, pamphlets, and articles published between 1929 and 1966, with most of the entries dated after 1960. Entries also cover independent study in relation to team teaching, nongraded schools, instructional materials centers, individualized instruction, flexible scheduling, curriculum…

Davis, Harold S.


Morse Code Activity Packet.  

ERIC Educational Resources Information Center

This activity packet offers simple directions for setting up a Morse Code system appropriate to interfacing with any of several personal computer systems. Worksheets are also included to facilitate teaching Morse Code to persons with visual or other disabilities including blindness, as it is argued that the code is best learned auditorily. (PB)

Clinton, Janeen S.


Density and Bounds for Grassmannian Codes with Chordal Distance  

E-print Network

Density and Bounds for Grassmannian Codes with Chordal Distance Renaud-Alexandre Pitaval, Olav. Abstract--We investigate the density of codes in the complex Grassmann manifolds GC n,p equipped with the chordal distance. The density of a code is defined as the fraction of the Grass- mannian covered

Blostein, Steven D.


Topological approach toward quantum codes with realistic physical constraints  

E-print Network

The following open problems, which concern a fundamental limit on coding properties of quantum codes with realistic physical constraints, are analyzed and partially answered here: (a) the upper bound on code distances of quantum error-correcting codes with geometrically local generators, (b) the feasibility of a self-correcting quantum memory. To investigate these problems, we study stabilizer codes supported by local interaction terms with translation and scale symmetries on a $D$-dimensional lattice. Our analysis uses the notion of topology emerging in geometric shapes of logical operators, which sheds a surprising new light on theory of quantum codes with physical constraints.

Beni Yoshida



Color code identification in coded structured light.  


Color code is widely employed in coded structured light to reconstruct the three-dimensional shape of objects. Before determining the correspondence, a very important step is to identify the color code. Until now, the lack of an effective evaluation standard has hindered the progress in this unsupervised classification. In this paper, we propose a framework based on the benchmark to explore the new frontier. Two basic facets of the color code identification are discussed, including color feature selection and clustering algorithm design. First, we adopt analysis methods to evaluate the performance of different color features, and the order of these color features in the discriminating power is concluded after a large number of experiments. Second, in order to overcome the drawback of K-means, a decision-directed method is introduced to find the initial centroids. Quantitative comparisons affirm that our method is robust with high accuracy, and it can find or closely approach the global peak. PMID:22859022

Zhang, Xu; Li, Youfu; Zhu, Limin



Coding for Electronic Mail  

NASA Technical Reports Server (NTRS)

Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.

Rice, R. F.; Lee, J. J.



Modular optimization code package: MOZAIK  

NASA Astrophysics Data System (ADS)

This dissertation addresses the development of a modular optimization code package, MOZAIK, for geometric shape optimization problems in nuclear engineering applications. MOZAIK's first mission, determining the optimal shape of the D2O moderator tank for the current and new beam tube configurations for the Penn State Breazeale Reactor's (PSBR) beam port facility, is used to demonstrate its capabilities and test its performance. MOZAIK was designed as a modular optimization sequence including three primary independent modules: the initializer, the physics and the optimizer, each having a specific task. By using fixed interface blocks among the modules, the code attains its two most important characteristics: generic form and modularity. The benefit of this modular structure is that the contents of the modules can be switched depending on the requirements of accuracy, computational efficiency, or compatibility with the other modules. Oak Ridge National Laboratory's discrete ordinates transport code TORT was selected as the transport solver in the physics module of MOZAIK, and two different optimizers, Min-max and Genetic Algorithms (GA), were implemented in the optimizer module of the code package. A distributed memory parallelism was also applied to MOZAIK via MPI (Message Passing Interface) to execute the physics module concurrently on a number of processors for various states in the same search. Moreover, dynamic scheduling was enabled to enhance load balance among the processors while running MOZAIK's physics module thus improving the parallel speedup and efficiency. In this way, the total computation time consumed by the physics module is reduced by a factor close to M, where M is the number of processors. This capability also encourages the use of MOZAIK for shape optimization problems in nuclear applications because many traditional codes related to radiation transport do not have parallel execution capability. A set of computational models based on the existing beam port configuration of the Penn State Breazeale Reactor (PSBR) was designed to test and validate the code package in its entirety, as well as its modules separately. The selected physics code, TORT, and the requisite data such as source distribution, cross-sections, and angular quadratures were comprehensively tested with these computational models. The modular feature and the parallel performance of the code package were also examined using these computational models. Another outcome of these computational models is to provide the necessary background information for determining the optimal shape of the D2O moderator tank for the new beam tube configurations for the PSBR's beam port facility. The first mission of the code package was completed successfully by determining the optimal tank shape which was sought for the current beam tube configuration and two new beam tube configurations for the PSBR's beam port facility. The performance of the new beam tube configurations and the current beam tube configuration were evaluated with the new optimal tank shapes determined by MOZAIK. Furthermore, the performance of the code package with the two different optimization strategies were analyzed showing that while GA is capable of achieving higher thermal beam intensity for a given beam tube setup, Min-max produces an optimal shape that is more amenable to machining and manufacturing. The optimal D2O moderator tank shape determined by MOZAIK with the current beam port configuration improves the thermal neutron beam intensity at the beam port exit end by 9.5%. Similarly, the new tangential beam port configuration (beam port near the core interface) with the optimal moderator tank shape determined by MOZAIK improves the thermal neutron beam intensity by a factor of 1.4 compared to the existing beam port configuration (with the existing D2O moderator tank). Another new beam port configuration, radial beam tube configuration, with the optimal moderator tank shape increases the thermal neutron beam intensity at the beam tube exit by a factor of 1.8. All these results

Bekar, Kursat B.


Active coded aperture neutron imaging  

Microsoft Academic Search

Because of their penetrating power, energetic neutrons and gamma rays (>~1 MeV) offer the best possibility of detecting highly shielded or distant special nuclear material (SNM). Of these, fast neutrons offer the greatest advantage due to their very low and well understood natural background. We are investigating a wholly new approach to fast-neutron imaging-an active coded-aperture system that uses a

Peter Marleau; James Brennan; Erik Brubaker; Nathan Hilton; John Steele



Sequence independent amplification of DNA  


The present invention is a rapid sequence-independent amplification procedure (SIA). Even minute amounts of DNA from various sources can be amplified independent of any sequence requirements of the DNA or any a priori knowledge of any sequence characteristics of the DNA to be amplified. This method allows, for example, the sequence independent amplification of microdissected chromosomal material and the reliable construction of high quality fluorescent in situ hybridization (FISH) probes from YACs or from other sources. These probes can be used to localize YACs on metaphase chromosomes but also--with high efficiency--in interphase nuclei. 25 figs.

Bohlander, S.K.



Sequence independent amplification of DNA  


The present invention is a rapid sequence-independent amplification procedure (SIA). Even minute amounts of DNA from various sources can be amplified independent of any sequence requirements of the DNA or any a priori knowledge of any sequence characteristics of the DNA to be amplified. This method allows, for example the sequence independent amplification of microdissected chromosomal material and the reliable construction of high quality fluorescent in situ hybridization (FISH) probes from YACs or from other sources. These probes can be used to localize YACs on metaphase chromosomes but also--with high efficiency--in interphase nuclei.

Bohlander, Stefan K. (Chicago, IL)



Generating code adapted for interlinking legacy scalar code and extended vector code  


Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

Gschwind, Michael K



Improved codes for space-time trellis-coded modulation  

Microsoft Academic Search

Space-Time Coded Modulation has been shown to eciently use transmit diversity to increase spectral efficiency. In this letter we propose new trellis codes found through systematic code search. These codes achieve the theoretically maximal diversity gain and improved coding gain compared to known codes.

Stephan Bäro; Gerhard Bauch; Axel Hansmann



Technology for Independent Living: Sourcebook.  

ERIC Educational Resources Information Center

This sourcebook provides information for the practical implementation of independent living technology in the everyday rehabilitation process. "Information Services and Resources" lists databases, clearinghouses, networks, research and development programs, toll-free telephone numbers, consumer protection caveats, selected publications, and…

Enders, Alexandra, Ed.


Compound Independent and Dependent Events  

NSDL National Science Digital Library

Compare the theoretical and experimental probability of a compound independent event by drawing colored marbles from a bag. Record the results of successive draws with or without replacement of marbles to calculate the experimental probability.



Spina Bifida Experience: Developing Independence  


... Independence Your browser does not support iFrames. Related Videos Spina Bifida Experience Managing Your Own Medical Care The Importance of Physical Activity More Videos Video Archive Find more NCBDDD videos to watch ...


Independent component analysis: recent advances  

PubMed Central

Independent component analysis is a probabilistic method for learning a linear transform of a random vector. The goal is to find components that are maximally independent and non-Gaussian (non-normal). Its fundamental difference to classical multi-variate statistical methods is in the assumption of non-Gaussianity, which enables the identification of original, underlying components, in contrast to classical methods. The basic theory of independent component analysis was mainly developed in the 1990s and summarized, for example, in our monograph in 2001. Here, we provide an overview of some recent developments in the theory since the year 2000. The main topics are: analysis of causal relations, testing independent components, analysing multiple datasets (three-way data), modelling dependencies between the components and improved methods for estimating the basic model. PMID:23277597

Hyvärinen, Aapo



Independent Schools: Landscape and Learnings.  

ERIC Educational Resources Information Center

Examines American independent schools (parochial, southern segregated, and private institutions) in terms of their funding, expenditures, changing enrollment patterns, teacher-student ratios, and societal functions. Journal available from Daedalus Subscription Department, 1172 Commonwealth Ave., Boston, MA 02132. (AM)

Oates, William A.



MHDust: A 3-fluid dusty plasma code  

NASA Astrophysics Data System (ADS)

MHDust is a next generation 3-fluid magnetized dusty plasma code, treating the inertial dynamics of both the dust and ion components. Coded in ANSI C, the numerical method employs Leap-Frog and Dufort-Frankel integration schemes. Features include: nonlinear collisional terms, quasi-neutrality or continuity based electron densities, and dynamical dust charge number. Tests of wave-mode propagation (Acoustic and Electromagnetic) allow a comparison to linear wave mode theory. Additional nonlinear phenomena are presented including magnetic reconnection and shear-flow instabilities. Relevant parameters for the space environment are considered, allowing a comparison to be made with previous dusty plasma codes (DENISIS). The utility of the code is expanded through the possibility of small dust mass. This allows MH- Dust to be used as a 2-ion plasma code. MHDust considerably expands the range of numerical investigations into nonlinear phenomena in the field of astrophysical dusty plasmas.

Lazerson, Samuel


Experimental interference of independent photons  

E-print Network

Interference of photons emerging from independent sources is essential for modern quantum information processing schemes, above all quantum repeaters and linear-optics quantum computers. We report an observation of non-classical interference of two single photons originating from two independent, separated sources, which were actively synchronized with an r.m.s. timing jitter of 260 fs. The resulting (two-photon) interference visibility was 83(+/-)4 %.

Rainer Kaltenbaek; Bibiane Blauensteiner; Marek Zukowski; Markus Aspelmeyer; Anton Zeilinger



Explaining post-independence conflict  

Microsoft Academic Search

Why do some successor states experience conflict with their predecessor states, while others do not? Are there any notable asymmetries between the two? How important are territorial claims and cross-border ethnic ties in this respect? Are treaties effective in ensuring post-independence peace? Examining 166 instances of independence in the 1816–2001 period, this study shows that territorial claims and cross-border ethnic

Uri Resnick



Unequal error protection of subband coded bits  

E-print Network

system l. Ease of regeneration of digital signals thus minimizing an increase of distortion with distance. 2. Use of Error Detection and Error Correction schemes leading to low error rates 3. Use of encryption for data security Digital Systems... considerations of the other. All the source coding algorithms designed for noiseiless channels are good examples of this approach. The big disadvantage of this independent design is that such implementations do not predict how complex an overall communications...

Devalla, Badarinath



Cartography from Code...? Barend Kbben  

E-print Network

Cartography from Code...? Barend Köbben ITC ­ Universiteit Twente ­ @barendkobben #12;Cartography from Code...? or "how I learned to stop worrying and love coding in cartography;possible...? CARTOGRAPHY FROM CODE #12;...or an oxymoron? CARTOGRAPHY FROM CODE #12;CARTOGRAPHY FROM CODE

Köbben, Barend


Population coding of affect across stimuli, modalities and individuals  

PubMed Central

It remains unclear how the brain represents external objective sensory events alongside our internal subjective impressions of them—affect. Representational mapping of population level activity evoked by complex scenes and basic tastes uncovered a neural code supporting a continuous axis of pleasant-to-unpleasant valence. This valence code was distinct from low-level physical and high-level object properties. While ventral temporal and anterior insular cortices supported valence codes specific to vision and taste, both the medial and lateral orbitofrontal cortices (OFC), maintained a valence code independent of sensory origin. Further only the OFC code could classify experienced affect across participants. The entire valence spectrum is represented as a collective pattern in regional neural activity as sensory-specific and abstract codes, whereby the subjective quality of affect can be objectively quantified across stimuli, modalities, and people. PMID:24952643

Chikazoe, Junichi; Lee, Daniel H.; Kriegeskorte, Nikolaus; Anderson, Adam K.



Research on Universal Combinatorial Coding  

PubMed Central

The conception of universal combinatorial coding is proposed. Relations exist more or less in many coding methods. It means that a kind of universal coding method is objectively existent. It can be a bridge connecting many coding methods. Universal combinatorial coding is lossless and it is based on the combinatorics theory. The combinational and exhaustive property make it closely related with the existing code methods. Universal combinatorial coding does not depend on the probability statistic characteristic of information source, and it has the characteristics across three coding branches. It has analyzed the relationship between the universal combinatorial coding and the variety of coding method and has researched many applications technologies of this coding method. In addition, the efficiency of universal combinatorial coding is analyzed theoretically. The multicharacteristic and multiapplication of universal combinatorial coding are unique in the existing coding methods. Universal combinatorial coding has theoretical research and practical application value. PMID:24772019

Lu, Jun; Zhang, Zhuo; Mo, Juan



PLUTO: a Numerical Code for Computational Astrophysics  

E-print Network

We present a new numerical code, PLUTO, for the solution of hypersonic flows in 1, 2 and 3 spatial dimensions and different systems of coordinates. The code provides a multi-physics, multi-algorithm modular environment particularly oriented towards the treatment of astrophysical flows in presence of discontinuities. Different hydrodynamic modules and algorithms may be independently selected to properly describe Newtonian, relativistic, MHD or relativistic MHD fluids. The modular structure exploits a general framework for integrating a system of conservation laws, built on modern Godunov-type shock-capturing schemes. Although a plethora of numerical methods has been successfully developed over the past two decades, the vast majority shares a common discretization recipe, involving three general steps: a piecewise polynomial reconstruction followed by the solution of Riemann problems at zone interfaces and a final evolution stage. We have checked and validated the code against several benchmarks available in literature. Test problems in 1, 2 and 3 dimensions are discussed.

A. Mignone; G. Bodo; S. Massaglia; T. Matsakos; O. Tesileanu; C. Zanni; A. Ferrari



Validation of the reactor dynamics code TRAB  

NASA Astrophysics Data System (ADS)

The validation of the one dimensional reactor dynamics code TRAB (Transient Analysis code for BWR's) is summarized. TRAB was validated with benchmark problems, comparative calculations against independent analyses, analyses of start up experiments of nuclear power plants, and real plant transients. The initial power excursion of the Chernobyl reactor accident was calculated with TRAB. TRAB was originally designed for BWR analyses, but it can in its present version be used for various modeling purposes. The core model of TRAB can be used separately for LWR calculations. For PWR modeling the core model of TRAB was coupled to circuit model SMABRE to form the SMATRA code. The versatile modeling capabilities of TRAB were used in analyses of e.g., the heating reactor SECURE and the RBMK type reactor (Chernobyl).

Raety, Hanna; Kyrki-Rajamaeki, Riitta; Rajamaeki, Markku



Theory of epigenetic coding.  


The logic of genetic control of development may be based on a binary epigenetic code. This paper revises the author's previous scheme dealing with the numerology of annelid metamerism in these terms. Certain features of the code had been deduced to be combinatorial, others not. This paradoxical contrast is resolved here by the interpretation that these features relate to different operations of the code; the combinatiorial to coding identity of units, the non-combinatorial to coding production of units. Consideration of a second paradox in the theory of epigenetic coding leads to a new solution which further provides a basis for epimorphic regeneration, and may in particular throw light on the "regeneration-duplication" phenomenon. A possible test of the model is also put forward. PMID:6748695

Elder, D



Codes of Ethics Online  

NSDL National Science Digital Library

The Center for the Study of Ethics in the Professions at the Illinois Institute of Technology maintains the Codes of Ethics Online Web site. The Center writes: "With the advent of the Internet, it seemed clear that digitizing the codes and making them accessible over the World-Wide Web would benefit researchers, students, and professionals alike." The science page contains links to over fifty organizations' ethical codes, including the American Institute of Chemists, the American Physical Society, the Water Quality Association, etc.


Industrial Computer Codes  

NASA Technical Reports Server (NTRS)

This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), KTK (knife to knife) Labyrinth Seal Code, and DYSEAL (dynamic seal analysis) are covered. CGYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia, but maintains the narrow groove theory. KTK labyrinth seal code handles straight or stepped seals. And DYSEAL provides dynamics for the seal geometry.

Shapiro, Wilbur



The Gli code  

PubMed Central

The Gli code hypothesis postulates that the three vertebrate Gli transcription factors act together in responding cells to integrate intercellular Hedgehog (Hh) and other signaling inputs, resulting in the regulation of tissue pattern, size and shape. Hh and other inputs are then just ways to modify the Gli code. Recent data confirm this idea and suggest that the Gli code regulates stemness and also tumor progression and metastatic growth, opening exciting possibilities for both regenerative medicine and novel anticancer therapies. PMID:17845852

Ruiz i Altaba, Ariel; Mas, Christophe; Stecca, Barbara



Benchmarking the democritus code  

Microsoft Academic Search

Summary form only given. The DEMOCRITUS code is a particle-based code for plasma-material interaction simulation. The code makes use of particle-in-cell (PIC) method to simulate each plasma species, the material, and their interaction. In this study, we concentrate on a dust particle immersed in a plasma. We start with the simplest case, in which the dust particle is not allowed

N. Arinaminpathy; C. Fichtl; G. Lapenta; G. L. Delzanno



Distributed Video Coding  

Microsoft Academic Search

Distributed coding is a new paradigm for video compression, based on Slepian and Wolf's and Wyner and Ziv's information-theoretic results from the 1970s. This paper reviews the recent development of practical distributed video coding schemes. Wyner-Ziv coding, i.e., lossy compression with receiver side information, enables low-complexity video encoding where the bulk of the computation is shifted to the decoder. Since

Bernd Girod; ANNE MARGOT AARON; Shantanu Rane; David Rebollo-Monedero



Scalable Hyperspectral Image Coding  

Microsoft Academic Search

Here we propose scalable Three-Dimensional Set Partitioned Embedded bloCK (3D-SPECK)–an embedded, block-based, wavelet transform coding algorithm of low complexity for hyperspectral image compression. Scalable 3D-SPECK supports both SNR and resolution progressive coding. After wavelet transform, 3D-SPECK treats each subband as a coding block. To generate SNR scalable bitstream, the stream is organized so that the same indexed bit planes are

Xiaoli Tang; William A. Pearlman



Building Codes and Standards  

NSDL National Science Digital Library

This brief document from David Cohan includes some information on building codes and standards. The purpose of building codes and standards is defined, and how they relate to energy and sustainability topics is also explored. This document would be useful for instructors looking for some notes on how to incorporate building codes and standards into their class work, or for students looking to learn more about the topic. This document may be downloaded in PDF file format.

Cohan, David


Resistor Color-Code  

NSDL National Science Digital Library

"Resistor manufactures implement the standard EIA color-code using three, four and five color bands to identify nominal resistor values. It is imperative that engineers and technicians know how to interpret the color markings on resistors in order to perform analysis and repairs on electronic products." On this page, visitors will find a key to the code for three, four, and five band resistors and exercises to check for understanding. A Resistor Color-Code chart can also be downloaded and printed from this site, as well as a Resistor Color-Code Converter.



Bar Code Labels  

NASA Technical Reports Server (NTRS)

American Bar Codes, Inc. developed special bar code labels for inventory control of space shuttle parts and other space system components. ABC labels are made in a company-developed anodizing aluminum process and consecutively marketed with bar code symbology and human readable numbers. They offer extreme abrasion resistance and indefinite resistance to ultraviolet radiation, capable of withstanding 700 degree temperatures without deterioration and up to 1400 degrees with special designs. They offer high resistance to salt spray, cleaning fluids and mild acids. ABC is now producing these bar code labels commercially or industrial customers who also need labels to resist harsh environments.



Grid-free tree-code simulations of the plasma-material interaction region  

NASA Astrophysics Data System (ADS)

A fully kinetic grid-free model based on a Barnes-Hut tree code is used to selfconsistently simulate a collisionless plasma bounded by two floating walls. The workhorse for simulating such plasma wall transition layers is currently the PIC method. However, the present grid-free formulation provides a powerful independent tool to test it [1] and to possibly extend particle simulations towards collisional regimes in a more internally consistent way. Here, we use the grid-free massively parallel Barnes-Hut tree-code PEPC - a well established tool for simulations of Laser-plasmas and astrophysical applications - to develop a 3D ab initio plasma target interaction model. With our approach an electrostatic sheath naturally builds up within the first couple of Debye lengths close to the wall rather than being imposed as a prescribed boundary condition. We verified the code using analytic results [2] as well as 1D PIC simulations [3]. The model was then used to investigate the influence of inclined magnetic fields on the plasma material interface. We used the code to study the correlation between the magnetic field angle and the angular distribution of incident particles.

Salmagne, C.; Reiter, D.; Gibbon, P.



Optimized interpolation filters for compatible pyramidal coding of TV and HDTV  

NASA Astrophysics Data System (ADS)

This paper deals with the question of optimizing the filters in the upsampling stage of a TV/HDTV compatible pyramidal coder. From a coding gain point of view, both the decimation and upsampling filters should be optimized. In the frame of compatible coding, not only the coding efficiency influences the choice of the decimation filter but also the compatible image quality. Therefore, assuming this filter has been fixed, we analyze the question of optimizing the upsampling filter in order to obtain the highest coding gain. This question is addressed for a mean squared error (MSE) criterion. In addition, assuming the base layer (TV) signal can be quantized, the influence on the quantization noise on the optimal interpolation filter is investigated and the problem is handled for the MSE criterion. As the statistical properties of pictures are required in the optimization, a model is then developed to compute these properties when there is motion. The model takes into account the processing of progressive sources and, concerning interlaced sequences, the independent processing of fields or the processing of merged fields. Results are then derived for the three types of processing.

Cuvelier, Laurent; Macq, Benoit M. M.; Maison, Benoit; Vandendorpe, Luc



The Sign Rule and Beyond: Boundary Effects, Flexibility, and Noise Correlations in Neural Population Codes  

PubMed Central

Over repeat presentations of the same stimulus, sensory neurons show variable responses. This “noise” is typically correlated between pairs of cells, and a question with rich history in neuroscience is how these noise correlations impact the population's ability to encode the stimulus. Here, we consider a very general setting for population coding, investigating how information varies as a function of noise correlations, with all other aspects of the problem – neural tuning curves, etc. – held fixed. This work yields unifying insights into the role of noise correlations. These are summarized in the form of theorems, and illustrated with numerical examples involving neurons with diverse tuning curves. Our main contributions are as follows. (1) We generalize previous results to prove a sign rule (SR) — if noise correlations between pairs of neurons have opposite signs vs. their signal correlations, then coding performance will improve compared to the independent case. This holds for three different metrics of coding performance, and for arbitrary tuning curves and levels of heterogeneity. This generality is true for our other results as well. (2) As also pointed out in the literature, the SR does not provide a necessary condition for good coding. We show that a diverse set of correlation structures can improve coding. Many of these violate the SR, as do experimentally observed correlations. There is structure to this diversity: we prove that the optimal correlation structures must lie on boundaries of the possible set of noise correlations. (3) We provide a novel set of necessary and sufficient conditions, under which the coding performance (in the presence of noise) will be as good as it would be if there were no noise present at all. PMID:24586128

Hu, Yu; Zylberberg, Joel; Shea-Brown, Eric



Axisymmetric generalized harmonic evolution code  

SciTech Connect

We describe the first axisymmetric numerical code based on the generalized harmonic formulation of the Einstein equations, which is regular at the axis. We test the code by investigating gravitational collapse of distributions of complex scalar field in a Kaluza-Klein spacetime. One of the key issues of the harmonic formulation is the choice of the gauge source functions, and we conclude that a damped-wave gauge is remarkably robust in this case. Our preliminary study indicates that evolution of regular initial data leads to formation both of black holes with spherical and cylindrical horizon topologies. Intriguingly, we find evidence that near threshold for black hole formation the number of outcomes proliferates. Specifically, the collapsing matter splits into individual pulses, two of which travel in the opposite directions along the compact dimension and one which is ejected radially from the axis. Depending on the initial conditions, a curvature singularity develops inside the pulses.

Sorkin, Evgeny [Max Planck Institute for Gravitational Physics (Albert Einstein Institute), Am Muehlenberg 1, D-14476, Golm (Germany)



Two-dimensional Fire codes  

Microsoft Academic Search

To improve the reliability of two-dimensional information, codes that can correct two-dimensional bursts (or spots) may be useful. In this paper a class of two-dimensional burst-correcting codes, called two-dimensional Fire codes, is proposed. The definition of these codes is a natural extension of that of the conventional Fire codes. The two-dimensional Fire code is a two-dimensional cyclic code designed for

H. Imai



Associations between children’s independent mobility and physical activity  

PubMed Central

Background Independent mobility describes the freedom of children to travel and play in public spaces without adult supervision. The potential benefits for children are significant such as social interactions with peers, spatial and traffic safety skills and increased physical activity. Yet, the health benefits of independent mobility, particularly on physical activity accumulation, are largely unexplored. This study aimed to investigate associations of children’s independent mobility with light, moderate-to-vigorous, and total physical activity accumulation. Methods In 2011 - 2012, 375 Australian children aged 8-13 years (62% girls) were recruited into a cross-sectional study. Children’s independent mobility (i.e. independent travel to school and non-school destinations, independent outdoor play) and socio-demographics were assessed through child and parent surveys. Physical activity intensity was measured objectively through an Actiheart monitor worn on four consecutive days. Associations between independent mobility and physical activity variables were analysed using generalized linear models, accounting for clustered sampling, Actiheart wear time, socio-demographics, and assessing interactions by sex. Results Independent travel (walking, cycling, public transport) to school and non-school destinations were not associated with light, moderate-to-vigorous and total physical activity. However, sub-analyses revealed a positive association between independent walking and cycling (excluding public transport) to school and total physical but only in boys (b?=?36.03, p?independent outdoor play (three or more days per week) was positively associated with light and total physical activity (b?=?29.76, p?independent outdoor play and moderate-to-vigorous physical activity. When assessing differences by sex, the observed significant associations of independent outdoor play with light and total physical activity remained in girls but not in boys. All other associations showed no significant differences by sex. Conclusions Independent outdoor play may boost children’s daily physical activity levels, predominantly at light intensity. Hence, facilitating independent outdoor play could be a viable intervention strategy to enhance physical activity in children, particularly in girls. Associations between independent travel and physical activity are inconsistent overall and require further investigation. PMID:24476363



Independent bilateral primary bronchial carcinomas  

PubMed Central

Independent bilateral primary bronchial carcinomas are not common. Since Beyreuther's description in 1924, 16 well-documented cases of independent primary bronchial carcinomas of different histology have been described. From 1965 to 1970, eight cases were seen at the London Chest Hospital. In order to make the diagnosis of a second primary bronchial carcinoma, each tumour should be malignant and neither should be a metastasis from the other. To meet this last criterion, the histopathological features of the two tumours must be different. Many cases have been described in the literature as double primary bronchial carcinomas where the second primary had the same histological features as the first. Images PMID:4327711

Chaudhuri, M. Ray



RFQ simulation code  

SciTech Connect

We have developed the RFQLIB simulation system to provide a means to systematically generate the new versions of radio-frequency quadrupole (RFQ) linac simulation codes that are required by the constantly changing needs of a research environment. This integrated system simplifies keeping track of the various versions of the simulation code and makes it practical to maintain complete and up-to-date documentation. In this scheme, there is a certain standard version of the simulation code that forms a library upon which new versions are built. To generate a new version of the simulation code, the routines to be modified or added are appended to a standard command file, which contains the commands to compile the new routines and link them to the routines in the library. The library itself is rarely changed. Whenever the library is modified, however, this modification is seen by all versions of the simulation code, which actually exist as different versions of the command file. All code is written according to the rules of structured programming. Modularity is enforced by not using COMMON statements, simplifying the relation of the data flow to a hierarchy diagram. Simulation results are similar to those of the PARMTEQ code, as expected, because of the similar physical model. Different capabilities, such as those for generating beams matched in detail to the structure, are available in the new code for help in testing new ideas in designing RFQ linacs.

Lysenko, W.P.



QR code security  

Microsoft Academic Search

This paper examines QR Codes and how they can be used to attack both human interaction and automated systems. As the encoded information is intended to be machine readable only, a human cannot distinguish between a valid and a maliciously manipulated QR code. While humans might fall for phishing attacks, automated readers are most likely vulnerable to SQL injections and

Peter Kieseberg; Manuel Leithner; Martin Mulazzani; Lindsay Munroe; Sebastian Schrittwieser; Mayank Sinha; Edgar Weippl



Lichenase and coding sequences  


The present invention provides a fungal lichenase, i.e., an endo-1,3-1,4-.beta.-D-glucanohydrolase, its coding sequence, recombinant DNA molecules comprising the lichenase coding sequences, recombinant host cells and methods for producing same. The present lichenase is from Orpinomyces PC-2.

Li, Xin-Liang (Athens, GA); Ljungdahl, Lars G. (Athens, GA); Chen, Huizhong (Lawrenceville, GA)



Code of Ethics  

ERIC Educational Resources Information Center

The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

Division for Early Childhood, Council for Exceptional Children, 2009



Computerized mega code recording.  


A system has been developed to facilitate recording of advanced cardiac life support mega code testing scenarios. By scanning a paper "keyboard" using a bar code wand attached to a portable microcomputer, the person assigned to record the scenario can easily generate an accurate, complete, timed, and typewritten record of the given situations and the obtained responses. PMID:3354937

Burt, T W; Bock, H C



Prostate Surgery Codes

Prostate C619 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Do not code an orchiectomy in this field. For prostate primaries, orchiectomies are coded in the data item “Hematologic Transplant and


Did You "Code"  

ERIC Educational Resources Information Center

The problem of solving mathematical equations can be quite tough for some students hence they face a great difficulty when applying ideas to the actual process. Students in algebra classes are taught coding in which they write down what they will need to do to solve the equation and this coding makes the students more adept at solving equations…

Clausen, Mary C.



Hadamard transform image coding  

Microsoft Academic Search

The introduction of the fast Fourier transform algorithm has led to the development of the Fourier transform image coding technique whereby the two-dimensional Fourier transform of an image is transmitted over a channel rather than the image itself. This devlopement has further led to a related image coding technique in which an image is transformed by a Hadamard matrix operator.

W. K. Pratt; J. Kane; H. C. Andrews



Corpus Uteri Surgery Codes

Corpus Uteri C540–C559 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) [SEER Note: Do not code dilation and curettage (D&C) as Surgery of Primary Site for invasive cancers] Codes 00 None; no surgery


Anus Surgery Codes

Anus C210–C218 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) [SEER Note: Do not code infrared coagulation as treatment.] Codes 00 None; no surgery of primary site; autopsy ONLY 10 Local tumor


Cervix Uteri Surgery Codes

Cervi x Uteri C530–C539 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) [SEER Note: Do not code dilation and curettage (D&C) as Surgery of Primary Site for invasive cancers] Codes 00 None; no surgery


Experimental measurement-device-independent entanglement detection.  


Entanglement is one of the most puzzling features of quantum theory and of great importance for the new field of quantum information. The determination whether a given state is entangled or not is one of the most challenging open problems of the field. Here we report on the experimental demonstration of measurement-device-independent (MDI) entanglement detection using witness method for general two qubits photon polarization systems. In the MDI settings, there is no requirement to assume perfect implementations or neither to trust the measurement devices. This experimental demonstration can be generalized for the investigation of properties of quantum systems and for the realization of cryptography and communication protocols. PMID:25649664

Nawareg, Mohamed; Muhammad, Sadiq; Amselem, Elias; Bourennane, Mohamed



Experimental Measurement-Device-Independent Entanglement Detection  

PubMed Central

Entanglement is one of the most puzzling features of quantum theory and of great importance for the new field of quantum information. The determination whether a given state is entangled or not is one of the most challenging open problems of the field. Here we report on the experimental demonstration of measurement-device-independent (MDI) entanglement detection using witness method for general two qubits photon polarization systems. In the MDI settings, there is no requirement to assume perfect implementations or neither to trust the measurement devices. This experimental demonstration can be generalized for the investigation of properties of quantum systems and for the realization of cryptography and communication protocols. PMID:25649664

Nawareg, Mohamed; Muhammad, Sadiq; Amselem, Elias; Bourennane, Mohamed



New quantum MDS codes derived from constacyclic codes  

NASA Astrophysics Data System (ADS)

Quantum maximal-distance-separable (MDS) codes form an important class of quantum codes. It is very hard to construct quantum MDS codes with relatively large minimum distance. In this paper, based on classical constacyclic codes, we construct two classes of quantum MDS codes with parameters $$[[\\lambda(q-1),\\lambda(q-1)-2d+2,d

Wang, Liqi; Zhu, Shixin



Evolving genetic code  

PubMed Central

In 1985, we reported that a bacterium, Mycoplasma capricolum, used a deviant genetic code, namely UGA, a “universal” stop codon, was read as tryptophan. This finding, together with the deviant nuclear genetic codes in not a few organisms and a number of mitochondria, shows that the genetic code is not universal, and is in a state of evolution. To account for the changes in codon meanings, we proposed the codon capture theory stating that all the code changes are non-disruptive without accompanied changes of amino acid sequences of proteins. Supporting evidence for the theory is presented in this review. A possible evolutionary process from the ancient to the present-day genetic code is also discussed. PMID:18941287

OHAMA, Takeshi; INAGAKI, Yuji; BESSHO, Yoshitaka; OSAWA, Syozo



Serial Code Optimization  

NSDL National Science Digital Library

This guide presents the main features of serial optimisation for computationally intensive codes with a focus on the HECToR computing resources. From a user point of view, two main avenues can be followed when trying to optimise an application. One type of optimisations DO NOT involve modifying the source code (modification may not be desirable); optimisation consists of searching for the best compiler, set of flags and libraries. Another type of optimisations DO involve modifying the source code; in the first instance the programmer must evaluate if a new algorithm is necessary, followed by writing or rewriting optimised code. According to the these choices this guide presents optimisation as a problem of compiler and library selection, followed by a presentation of the key factors that must be considered when writing numerically intensive code.


Astrophysics Source Code Library  

NASA Astrophysics Data System (ADS)

The Astrophysics Source Code Library (ASCL), founded in 1999, is a free on-line registry for source codes of interest to astronomers and astrophysicists. The library is housed on the discussion forum for Astronomy Picture of the Day (APOD) and can be accessed at The ASCL has a comprehensive listing that covers a significant number of the astrophysics source codes used to generate results published in or submitted to refereed journals and continues to grow. The ASCL currently has entries for over 500 codes; its records are citable and are indexed by ADS. The editors of the ASCL and members of its Advisory Committee were on hand at a demonstration table in the ADASS poster room to present the ASCL, accept code submissions, show how the ASCL is starting to be used by the astrophysics community, and take questions on and suggestions for improving the resource.

Allen, A.; DuPrie, K.; Berriman, B.; Hanisch, R. J.; Mink, J.; Teuben, P. J.



The MELCOR code: History, status, and plans  

SciTech Connect

In 1982, the U.S. Nuclear Regulatory Commission sponsored development of the MELCOR code at Sandia National Laboratories in response to the need for a calculational tool capable of performing integrated analyses of severe accidents in nuclear reactors. The intention was that all relevant phenomena would be modeled parametrically, with the level of detail for each depending both on its importance in an accident and on the consensus in the safety community concerning physical understanding of the phenomenon. MELCOR was to be `integrated` in the sense that various feedback effects would be modeled, including, but not limited to, the effects of containment conditions on in-vessel accident progression and the relocation of heat sources with the transport of radionuclides. The existing calculational tool [Source Term Code Package (STCP)] required manual transfer of data from one code to another. MELCOR eliminated this procedure and the potential error introduced. The consequences analysis was to be performed by a separate computer tool. MELCOR was peer reviewed in 1991. Most subsequent model development and improvement activities have been based on comments received in that review. All new models have been independently peer reviewed prior to incorporation into MELCOR. One of the peer review suggestions was the need for additional code assessment. This suggestion resulted in the formation of the MELCOR Cooperative Assessment Program (MCAP) as an ongoing activity to assess the code against experiments and in performing plant analyses.

Ridgely, J.N. [Nuclear Regulatory Commission, Washington, DC (United States); Cole, R.K. Jr.; Gauntt, R.O. [Sandia National Laboratories, Albuquerque, NM (United States)




E-print Network

PALM DESERT ENERGY INDEPENDENCE PROGRAM SUMMARY OF LOAN PROCESS Proiect Scopina The first step by the property owner. Applications and instructions are available online at the Palm Desert web site that the applicant is the property owner through a City of Palm Desert contract with a nationally-recognized title

Kammen, Daniel M.


The scale independence of evolution  

Microsoft Academic Search

SUMMARY In this paper, I argue that the ultimate causes of morphological, and hence developmental, evolution are scale independent. In other words, micro- and macroevolu- tionary patterns show fundamental similarities and therefore are most simply explained as being caused by the same kinds of evolutionary forces. I begin by examining the evolu- tion of single lineages and argue that dynamics

Armand M. Leroi



Towards speaker independent continuous speechreading  

Microsoft Academic Search

Abstract: This paper describes recent speechreading experimentsfor a speaker independent continuous digitrecognition task. Visual feature extraction is performedbyalip tracker which recovers informationabout the lip shape and information about the greylevelintensity around the mouth. These features areused to train visual word models using continuousdensity HMMs. Results show that the method generaliseswell to new speakers and that the recognitionrate is highly variable

Juergen Luettin



IEAB Independent Economic Analysis Board  

E-print Network

of Instream Water Supply Components of the Salmon Creek Project Independent Economic Analysis Board Northwest district to continue water delivery to its users. The Proposed Project includes (1) improved water control, (2) on-farm efficiency to reduce irrigation water use, (3) a new pump station on the Okanogan River


10 Questions about Independent Reading  

ERIC Educational Resources Information Center

Teachers know that establishing a robust independent reading program takes more than giving kids a little quiet time after lunch. But how do they set up a program that will maximize their students' gains? Teachers have to know their students' reading levels inside and out, help them find just-right books, and continue to guide them during…

Truby, Dana



An Eye-Tracking Study of How Color Coding Affects Multimedia Learning  

ERIC Educational Resources Information Center

Color coding has been proposed to promote more effective learning. However, insufficient evidence currently exists to show how color coding leads to better learning. The goal of this study was to investigate the underlying cause of the color coding effect by utilizing eye movement data. Fifty-two participants studied either a color-coded or…

Ozcelik, Erol; Karakus, Turkan; Kursun, Engin; Cagiltay, Kursat



Rateless Coding for Gaussian Channels  

E-print Network

A rateless code-i.e., a rate-compatible family of codes-has the property that codewords of the higher rate codes are prefixes of those of the lower rate ones. A perfect family of such codes is one in which each of the codes ...

Wornell, Gregory W.


EMI Independent Study Program Transcript Request Form  

E-print Network

EMI Independent Study Program Transcript Request Form A transcript of your Independent Study course Mail your request to: National Emergency Training Center EMI Independent Study Program OR Fax to: (301

Shyu, Mei-Ling


Independence of the unimodal tuning of firing rate from theta phase precession in hippocampal place cells.  


There are two prominent features for place cells in rat hippocampus. The firing rate remarkably increases when rat enters the cell's place field and reaches a maximum around the center of place field, and it decreases when the animal approaches the end of the place field. Simultaneously the spikes gradually and monotonically advance to earlier phase relative to hippocampal theta rhythm as the rat traverses along the cell's place field, known as temporal coding. In this paper, we investigate whether two main characteristics of place cell firing are independent or not by mainly focusing on the generation mechanism of the unimodal tuning of firing rate by using a reduced CA1 two-compartment neuron model. Based on recent evidences, we hypothesize that the coupling of dendritic with the somatic compartment is not constant but dynamically regulated as the animal moves further along the place field, in contrast to previous two-compartment modeling. Simulations show that the regulable coupling is critically responsible for the generation of unimodal firing rate profile in place cells, independent of phase precession. Predictions of our model accord well with recent observations like occurrence of phase precession with very low as well as high firing rate (Huxter et al. Nature 425:828-832, 2003) and persistency of phase precession after transient silence of hippocampus activity (Zugaro et al. Nat Neurosci 8:67-71, 2005. PMID:20041262

Wu, Zhihua; Yamaguchi, Yoko



Pyramid image codes  

NASA Technical Reports Server (NTRS)

All vision systems, both human and machine, transform the spatial image into a coded representation. Particular codes may be optimized for efficiency or to extract useful image features. Researchers explored image codes based on primary visual cortex in man and other primates. Understanding these codes will advance the art in image coding, autonomous vision, and computational human factors. In cortex, imagery is coded by features that vary in size, orientation, and position. Researchers have devised a mathematical model of this transformation, called the Hexagonal oriented Orthogonal quadrature Pyramid (HOP). In a pyramid code, features are segregated by size into layers, with fewer features in the layers devoted to large features. Pyramid schemes provide scale invariance, and are useful for coarse-to-fine searching and for progressive transmission of images. The HOP Pyramid is novel in three respects: (1) it uses a hexagonal pixel lattice, (2) it uses oriented features, and (3) it accurately models most of the prominent aspects of primary visual cortex. The transform uses seven basic features (kernels), which may be regarded as three oriented edges, three oriented bars, and one non-oriented blob. Application of these kernels to non-overlapping seven-pixel neighborhoods yields six oriented, high-pass pyramid layers, and one low-pass (blob) layer.

Watson, Andrew B.



Embedded foveation image coding.  


The human visual system (HVS) is highly space-variant in sampling, coding, processing, and understanding. The spatial resolution of the HVS is highest around the point of fixation (foveation point) and decreases rapidly with increasing eccentricity. By taking advantage of this fact, it is possible to remove considerable high-frequency information redundancy from the peripheral regions and still reconstruct a perceptually good quality image. Great success has been obtained previously by a class of embedded wavelet image coding algorithms, such as the embedded zerotree wavelet (EZW) and the set partitioning in hierarchical trees (SPIHT) algorithms. Embedded wavelet coding not only provides very good compression performance, but also has the property that the bitstream can be truncated at any point and still be decoded to recreate a reasonably good quality image. In this paper, we propose an embedded foveation image coding (EFIC) algorithm, which orders the encoded bitstream to optimize foveated visual quality at arbitrary bit-rates. A foveation-based image quality metric, namely, foveated wavelet image quality index (FWQI), plays an important role in the EFIC system. We also developed a modified SPIHT algorithm to improve the coding efficiency. Experiments show that EFIC integrates foveation filtering with foveated image coding and demonstrates very good coding performance and scalability in terms of foveated image quality measurement. PMID:18255485

Wang, Z; Bovik, A C



Quantum convolutional codes derived from constacyclic codes  

NASA Astrophysics Data System (ADS)

In this paper, three families of quantum convolutional codes are constructed. The first one and the second one can be regarded as a generalization of Theorems 3, 4, 7 and 8 [J. Chen, J. Li, F. Yang and Y. Huang, Int. J. Theor. Phys., doi:10.1007/s10773-014-2214-6 (2014)], in the sense that we drop the constraint q ? 1 (mod 4). Furthermore, the second one and the third one attain the quantum generalized Singleton bound.

Yan, Tingsu; Huang, Xinmei; Tang, Yuansheng



Banner Index Codes The Index code is a data-entry shortcut for the Fund code, Org code, and Program code in Banner  

E-print Network

fund number, your index code is the *same* as your fund number 1 . For older research projects, your index code is the same as the 6-digit ORS research project (the old FRS) number beginning with a 4Banner Index Codes The Index code is a data-entry shortcut for the Fund code, Org code, and Program


DECKARD: Scalable and Accurate Tree-based Detection of Code Clones Lingxiao Jiang Ghassan Misherghi Zhendong Su  

E-print Network

Cachan, France Abstract Detecting code clones has many software engineering applications. Existing approaches either do not scale to large code bases or are not robust against minor code modi is both scal- able and accurate. It is also language independent, appli- cable to any language

Su, Zhendong


Perceptually-Based Adaptive JPEG Coding  

NASA Technical Reports Server (NTRS)

An extension to the JPEG standard (ISO/IEC DIS 10918-3) allows spatial adaptive coding of still images. As with baseline JPEG coding, one quantization matrix applies to an entire image channel, but in addition the user may specify a multiplier for each 8 x 8 block, which multiplies the quantization matrix, yielding the new matrix for the block. MPEG 1 and 2 use much the same scheme, except there the multiplier changes only on macroblock boundaries. We propose a method for perceptual optimization of the set of multipliers. We compute the perceptual error for each block based upon DCT quantization error adjusted according to contrast sensitivity, light adaptation, and contrast masking, and pick the set of multipliers which yield maximally flat perceptual error over the blocks of the image. We investigate the bitrate savings due to this adaptive coding scheme and the relative importance of the different sorts of masking on adaptive coding.

Watson, Andrew B.; Rosenholtz, Ruth; Null, Cynthia H. (Technical Monitor)



Three-Dimensional Combined Diversity Coding and Error Control Coding: Code Design and  

E-print Network

-output communications, a general term for complex coding is signal space diversity [?]; 2) for two- dimensional channelsThree-Dimensional Combined Diversity Coding and Error Control Coding: Code Design and Diversity Lab of Information Coding and Transmission Southwest Jiaotong University, Chengdu 610031, China E

Blostein, Steven D.


Material-independent and size-independent tractor beams for dipole objects.  


A Bessel beam without an axial gradient can exert a pulling force on an object [A. Novitsky, C.?W. Qiu, and H. Wang, Phys. Rev. Lett. 107, 203601 (2011)]. However, it cannot be called a "tractor beam" per se, as long as the light pulling effect is ultrasensitive to the object's material and size, a perturbation of which will make the optical traction go away. In this Letter, we investigate and report on the universality for a Bessel beam to be either a material-independent or size-independent optical tractor beam within the dipolar regime. Moreover, a general condition for a nonparaxial laser to be simultaneously a material- and size-independent tractor beam is proposed. These universal pulling effects and conditions are discussed in association with insight on modified far-field scattering, scattering resonances, and induced polarizabilities. Interestingly, we find that the acoustic pulling force exhibits only size independence, owing to the acoustic scattering theory in contrast to the light scattering counterpart. The findings pave the way for the realistic engineering and application of universal tractor beams pulling a wide variety of objects. PMID:23030161

Novitsky, Andrey; Qiu, Cheng-Wei; Lavrinenko, Andrei



Material-Independent and Size-Independent Tractor Beams for Dipole Objects  

NASA Astrophysics Data System (ADS)

A Bessel beam without an axial gradient can exert a pulling force on an object [A. Novitsky, C. W. Qiu, and H. Wang, Phys. Rev. Lett. 107, 203601 (2011)PRLTAO0031-900710.1103/PhysRevLett.107.203601]. However, it cannot be called a “tractor beam” per se, as long as the light pulling effect is ultrasensitive to the object’s material and size, a perturbation of which will make the optical traction go away. In this Letter, we investigate and report on the universality for a Bessel beam to be either a material-independent or size-independent optical tractor beam within the dipolar regime. Moreover, a general condition for a nonparaxial laser to be simultaneously a material- and size-independent tractor beam is proposed. These universal pulling effects and conditions are discussed in association with insight on modified far-field scattering, scattering resonances, and induced polarizabilities. Interestingly, we find that the acoustic pulling force exhibits only size independence, owing to the acoustic scattering theory in contrast to the light scattering counterpart. The findings pave the way for the realistic engineering and application of universal tractor beams pulling a wide variety of objects.

Novitsky, Andrey; Qiu, Cheng-Wei; Lavrinenko, Andrei



Two-Dimensional Modulation Codes  

Microsoft Academic Search

A new class of run-length-limited codes in introduced. These codes are called two-dimensional or multitrack modulation codes. Two-dimensional modulation codes provide substantial data storage density increases for multitrack recording systems by operating on multiple tracks in parallel. Procedures for computing the capacity of these new codes are given along with fast algorithms for implementing these procedures. Examples of two-dimensional codes

Michael W. Marcellin; Harold J. Weber



What's coming in 2012 codes  

E-print Network

The 2012 IECC America?s Model Building Energy Code November 9, 2011 Presentation to Clean Air Through Energy Efficiency Conference Dallas, TX Eric Lacey Responsible Energy Codes Alliance Responsible Energy Codes Alliance ? A broad....S. EPA) RECA Resources Available ? 2012 IECC RECA Compliance Guides ? Available on ? Hard copies available directly from RECA 202-339-6366 Eric Lacey, Chairman Responsible Energy Codes Alliance 202...

Lacey, E



Rectosigmoid Junction Coding Guidelines

Coding Guidelines Rectosigmoid Junction C199 Primary Site A tumor is classified as rectosigmoid when differentiation between rectum and sigmoid is not possible. A tumor is classified as rectal if • lower margin lies less than 16 cm from the anal


Quantum error control codes  

E-print Network

, the classical Hamming bound can be stated as tX i=0 n i ¶ (q ¡1)i • qn¡k; (2.9) 16 where t = b(d¡1)=2c. Codes that attain Hamming bound with equality are classifled as perfect codes. Let every codeword be represented by a sphere of radius t. The interpretation..., the classical Hamming bound can be stated as tX i=0 n i ¶ (q ¡1)i • qn¡k; (2.9) 16 where t = b(d¡1)=2c. Codes that attain Hamming bound with equality are classifled as perfect codes. Let every codeword be represented by a sphere of radius t. The interpretation...

Abdelhamid Awad Aly Ahmed, Sala



Steps to Independent Living Series.  

ERIC Educational Resources Information Center

This set of six activity books and a teacher's guide is designed to help students from eighth grade to adulthood with special needs to learn independent living skills. The activity books have a reading level of 2.5 and address: (1) "How to Get Well When You're Sick or Hurt," including how to take a temperature, see a doctor, and use medicines…

Lobb, Nancy


An experimental investigation of clocking effects on turbine aerodynamics using a modern 3-D one and one-half stage high pressure turbine for code verification and flow model development  

NASA Astrophysics Data System (ADS)

This research uses a modern 1 and 1/2 stage high-pressure (HP) turbine operating at the proper design corrected speed, pressure ratio, and gas to metal temperature ratio to generate a detailed data set containing aerodynamic, heat-transfer and aero-performance information. The data was generated using the Ohio State University Gas Turbine Laboratory Turbine Test Facility (TTF), which is a short-duration shock tunnel facility. The research program utilizes an uncooled turbine stage for which all three airfoils are heavily instrumented at multiple spans and on the HPV and LPV endwalls and HPB platform and tips. Heat-flux and pressure data are obtained using the traditional shock-tube and blowdown facility operational modes. Detailed examination show that the aerodynamic (pressure) data obtained in the blowdown mode is the same as obtained in the shock-tube mode when the corrected conditions are matched. Various experimental conditions and configurations were performed, including LPV clocking positions, off-design corrected speed conditions, pressure ratio changes, and Reynolds number changes. The main research for this dissertation is concentrated on the LPV clocking experiments, where the LPV was clocked relative to the HPV at several different passage locations and at different Reynolds numbers. Various methods were used to evaluate the effect of clocking on both the aeroperformance (efficiency) and aerodynamics (pressure loading) on the LPV, including time-resolved measurements, time-averaged measurements and stage performance measurements. A general improvement in overall efficiency of approximately 2% is demonstrated and could be observed using a variety of independent methods. Maximum efficiency is obtained when the time-average pressures are highest on the LPV, and the time-resolved data both in the time domain and frequency domain show the least amount of variation. The gain in aeroperformance is obtained by integrating over the entire airfoil as the three-dimensional effects on the LPV surface are significant.

Haldeman, Charles Waldo, IV



Subband coding of images  

Microsoft Academic Search

Subband coding has become quite popular for the source encoding of speech. This paper presents a simple yet efficient extension of this concept to the source coding of images. We specify the constraints for a set of two-dimensional quadrature mirror filters (QMF's) for a particular frequency-domain partition, and show that these constraints are satisfied by a separable combination of one-dimensional





Microsoft Academic Search

The LORASR code is specialized on the beam dynamics design of Separate Function DTL's based on the 'Combined 0 Degree Structure (KONUS)' beam dynamics concept. The code has been used for the beam dynamics design of several linacs which are in operation (GSI-HLI, GSI-HSI, CERN Linac 3, TRIUMF ISAC-I) or are scheduled to start beam operation in the near future

R. Tiede; G. Clemente; H. Podlech; U. Ratzinger; A. Sauer; S. Minaev



Seals Flow Code Development  

NASA Technical Reports Server (NTRS)

In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.



Soil, An Environmental Investigation.  

ERIC Educational Resources Information Center

This environmental unit is one of a series designed for integration within an existing curriculum. The unit is self-contained and requires minimal teacher preparation. The philosophy of the series is based on an experience-oriented process that encourages self-paced independent student work. This particular unit investigates soil in relation to…

National Wildlife Federation, Washington, DC.


Investigating Mendelian Genetics  

NSDL National Science Digital Library

The genetics investigations in this 15-page booklet were designed for students to develop understanding of the following biological concepts and skills:- MendelÂs law of segregation and law of independent assortment- Inheritance of 2 traits- How genotypes influence phenotypes- Scientific inquiry, including interpretation of evidence

Program, The W.


Savannah River experience using a Cause Coding Tree to identify the root cause of an incident  

SciTech Connect

Incidents (or near misses) provide important information about plant performance and ways to improve that performance. Any particular incident may have several ''root causes'' that need to be addressed to prevent recurrence of the incident and thereby improve the safety of the plant. Also, by reviewing a large number of these incidents, one can identify trends in the root causes and generic concerns. A method has been developed at Savannah River Plant to systematically evaluate incidents, identify their root causes, record these root causes, and analyze the trends of these causes. By providing a systematic method to identify correctable root causes, the system helps the incident investigator to ask the right questions during the investigation. It also provides the independent safety analysis group and management with statistics that indicate existing and developing trouble sports. This paper describes the Savannah River Plant (SRP) Cause Coding Tree, and the differences between the SRP Tree and other systems used to analyze incidents. 2 refs., 14 figs.

Paradies, M.W.; Busch, D.A.



Priority subband image coding  

NASA Astrophysics Data System (ADS)

In this paper a new approach based on the subband decomposition (SBD) for compression and progressive transmission of images is presented. The novelty of this approach is that the transform coefficients of all image subbands are coded and transmitted in absolute magnitude order. The ordered-by-magnitude transmission is accomplished by using partition priority coding (PPC) a new source coding method that allows the simultaneous transmission of the coefficients of a data set in an ordered fashion along with position information. Using this approach, coding and transmission are adaptable to the characteristics of each individual image and therefore very efficient. Another advantage of this approach is its high progression effectiveness. Since the largest transform coefficients that capture the most important characteristics of images are coded and transmitted first this method is well suited for progressive image transmission (PIT). Further compression of the image data is achieved by adaptive multiple subsources distribution technique based on arithmetic coding. Experiments are presented where the new SBD approach was tested. It is shown that it compares favorably with previously reported discrete cosine transform (DCT) and subband image codecs and is also very effective for progressive image transmission.

Yu, Steve S.; Galatsanos, Nikolas P.; Huang, Yunming G.



Architecture independent performance characterization andbenchmarking for scientific applications  

SciTech Connect

A simple, tunable, synthetic benchmark with a performance directly related to applications would be of great benefit to the scientific computing community. In this paper, we present a novel approach to develop such a benchmark. The initial focus of this project is on data access performance of scientific applications. First a hardware independent characterization of code performance in terms of address streams is developed. The parameters chosen to characterize a single address stream are related to regularity, size, spatial, and temporal locality. These parameters are then used to implement a synthetic benchmark program that mimics the performance of a corresponding code. To test the validity of our approach we performed experiments using five test kernels on six different platforms. The performance of most of our test kernels can be approximated by a single synthetic address stream. However in some cases overlapping two address streams is necessary to achieve a good approximation.

Strohmaier, Erich; Shan, Hongzhang



Punctured Elias Codes for variablelength coding of the integers  

E-print Network

Punctured Elias Codes for variable­length coding of the integers Peter Fenwick Technical Report 137 of the Elias g code which is shown to be better than other codes for some distributions. #12; 1. Compact and are instantaneous or nearly so. Following Elias[3], we first introduce two preliminary representations which

Fenwick, Peter


Falcon Codes: Fast, Authenticated LT Codes Cornell Tech  

E-print Network

Falcon Codes: Fast, Authenticated LT Codes Ari Juels Cornell Tech New York City NY, USA Abstract--In this paper, we introduce Falcon codes, a class of authenticated error correcting codes) they allow for very efficient encoding and decoding times, even linear in the message length. We study Falcon



E-print Network


Qiu, Weigang


An integrated framework for adaptive subband image coding  

Microsoft Academic Search

Previous work on filter banks and related expansions has revealed an interesting insight: different filter bank trees can be regarded as different ways of constructing orthonormal bases for linear signal expansion. In particular, fast algorithms for finding best bases in an operational rate-distortion (R\\/D) sense have been successfully used in image coding. Independently of this work, other research has also

Vladimir Pavlovic; Pierre Moulin; Kannan Ramchandran



Space-time block codes for noncoherent CPFSK  

Microsoft Academic Search

In this paper the problem of unitary rate space-time block coding for multiple-input multiple-output communication systems employing continuous phase frequency shift keying is investigated. First, one-shot noncoherent detection is analysed in a maximum likelihood perspective; then, design criteria for optimal space-time codes are proposed and some error bounds for the devised coding schemes are derived. Numerical results evidence that the

Fabrizio Pancaldi; Giorgio M. Vitetta



Dynamic fuzzy control of genetic algorithm parameter coding.  


An algorithm for adaptively controlling genetic algorithm parameter (GAP) coding using fuzzy rules is presented. The fuzzy GAP coding algorithm is compared to the dynamic parameter encoding scheme proposed by Schraudolph and Belew. The performance of the algorithm on a hydraulic brake emulator parameter identification problem is investigated. Fuzzy GAP coding control is shown to dramatically increase the rate of convergence and accuracy of genetic algorithms. PMID:18252316

Streifel, R J; Marks, R J; Reed, R; Choi, J J; Healy, M



Deterministic and unambiguous dense coding  

SciTech Connect

Optimal dense coding using a partially-entangled pure state of Schmidt rank D and a noiseless quantum channel of dimension D is studied both in the deterministic case where at most L{sub d} messages can be transmitted with perfect fidelity, and in the unambiguous case where when the protocol succeeds (probability {tau}{sub x}) Bob knows for sure that Alice sent message x, and when it fails (probability 1-{tau}{sub x}) he knows it has failed. Alice is allowed any single-shot (one use) encoding procedure, and Bob any single-shot measurement. For D{<=}D a bound is obtained for L{sub d} in terms of the largest Schmidt coefficient of the entangled state, and is compared with published results by Mozes et al. [Phys. Rev. A71, 012311 (2005)]. For D>D it is shown that L{sub d} is strictly less than D{sup 2} unless D is an integer multiple of D, in which case uniform (maximal) entanglement is not needed to achieve the optimal protocol. The unambiguous case is studied for D{<=}D, assuming {tau}{sub x}>0 for a set of DD messages, and a bound is obtained for the average <1/{tau}>. A bound on the average <{tau}> requires an additional assumption of encoding by isometries (unitaries when D=D) that are orthogonal for different messages. Both bounds are saturated when {tau}{sub x} is a constant independent of x, by a protocol based on one-shot entanglement concentration. For D>D it is shown that (at least) D{sup 2} messages can be sent unambiguously. Whether unitary (isometric) encoding suffices for optimal protocols remains a major unanswered question, both for our work and for previous studies of dense coding using partially-entangled states, including noisy (mixed) states.

Wu Shengjun [Physics Department, Carnegie-Mellon University, Pittsburgh, Pennsylvania 15213 (United States); Hefei National Laboratory for Physical Science at the Microscale, University of Science and Technology of China (China); Cohen, Scott M. [Physics Department, Carnegie-Mellon University, Pittsburgh, Pennsylvania 15213 (United States); Physics Department, Duquesne University, Pittsburgh, Pennsylvania 15282 (United States); Sun Yuqing; Griffiths, Robert B. [Physics Department, Carnegie-Mellon University, Pittsburgh, Pennsylvania 15213 (United States)



High-fidelity coding with correlated neurons.  


Positive correlations in the activity of neurons are widely observed in the brain. Previous studies have shown these correlations to be detrimental to the fidelity of population codes, or at best marginally favorable compared to independent codes. Here, we show that positive correlations can enhance coding performance by astronomical factors. Specifically, the probability of discrimination error can be suppressed by many orders of magnitude. Likewise, the number of stimuli encoded--the capacity--can be enhanced more than tenfold. These effects do not necessitate unrealistic correlation values, and can occur for populations with a few tens of neurons. We further show that both effects benefit from heterogeneity commonly seen in population activity. Error suppression and capacity enhancement rest upon a pattern of correlation. Tuning of one or several effective parameters can yield a limit of perfect coding: the corresponding pattern of positive correlation leads to a 'lock-in' of response probabilities that eliminates variability in the subspace relevant for stimulus discrimination. We discuss the nature of this pattern and we suggest experimental tests to identify it. PMID:25412463

da Silveira, Rava Azeredo; Berry, Michael J



High-Fidelity Coding with Correlated Neurons  

PubMed Central

Positive correlations in the activity of neurons are widely observed in the brain. Previous studies have shown these correlations to be detrimental to the fidelity of population codes, or at best marginally favorable compared to independent codes. Here, we show that positive correlations can enhance coding performance by astronomical factors. Specifically, the probability of discrimination error can be suppressed by many orders of magnitude. Likewise, the number of stimuli encoded—the capacity—can be enhanced more than tenfold. These effects do not necessitate unrealistic correlation values, and can occur for populations with a few tens of neurons. We further show that both effects benefit from heterogeneity commonly seen in population activity. Error suppression and capacity enhancement rest upon a pattern of correlation. Tuning of one or several effective parameters can yield a limit of perfect coding: the corresponding pattern of positive correlation leads to a ‘lock-in’ of response probabilities that eliminates variability in the subspace relevant for stimulus discrimination. We discuss the nature of this pattern and we suggest experimental tests to identify it. PMID:25412463

da Silveira, Rava Azeredo; Berry, Michael J.



TACO: a finite element heat transfer code  

SciTech Connect

TACO is a two-dimensional implicit finite element code for heat transfer analysis. It can perform both linear and nonlinear analyses and can be used to solve either transient or steady state problems. Either plane or axisymmetric geometries can be analyzed. TACO has the capability to handle time or temperature dependent material properties and materials may be either isotropic or orthotropic. A variety of time and temperature dependent loadings and boundary conditions are available including temperature, flux, convection, and radiation boundary conditions and internal heat generation. Additionally, TACO has some specialized features such as internal surface conditions (e.g., contact resistance), bulk nodes, enclosure radiation with view factor calculations, and chemical reactive kinetics. A user subprogram feature allows for any type of functional representation of any independent variable. A bandwidth and profile minimization option is also available in the code. Graphical representation of data generated by TACO is provided by a companion post-processor named POSTACO. The theory on which TACO is based is outlined, the capabilities of the code are explained, the input data required to perform an analysis with TACO are described. Some simple examples are provided to illustrate the use of the code.

Mason, W.E. Jr.



Performance improvement of spectral amplitude coding-optical code division multiple access systems using NAND detection with enhanced double weight code  

NASA Astrophysics Data System (ADS)

The bit-error rate (BER) performance of the spectral amplitude coding-optical code division multiple access (SACOCDMA) system has been investigated by using NAND subtraction detection technique with enhanced double weight (EDW) code. The EDW code is the enhanced version of double weight (DW) code family where the code weight is any odd number and greater than one with ideal cross-correlation. In order to evaluate the performance of the system, we used mathematical analysis extensively along with the simulation experiment. The evaluation results obtained using the NAND subtraction detection technique was compared with those obtained using the complementary detection technique for the same number of active users. The comparison results revealed that the BER performance of the system using NAND subtraction detection technique has greatly been improved as compared to the complementary technique.

Ahmed, Nasim; Aljunid, Syed Alwee; Ahmad, R. Badlishah; Fadhil, Hilal A.; Rashid, Mohd Abdur



Independent Operators at Different Dimension  

E-print Network

To apply lattice QCD in the calculation of glueball spectrum it is needed firstly to know associated operators acting on vacuum. We show how to find all the independent representations and operators, of group $SO(3)^{PC}$ at different dimension, since the work is not trivial. Then, we decompose these representation into irreducible representation of $O^{PC}$ group, which are listed in the note. At last we argue that $f_J(2220)$ and $g_T$ states can not be tensor glueball simultaneously.

Yin Chen; Daqing Liu; Yubin Liu; Jimin Wu



Independent Component Analysis of Textures  

NASA Technical Reports Server (NTRS)

A common method for texture representation is to use the marginal probability densities over the outputs of a set of multi-orientation, multi-scale filters as a description of the texture. We propose a technique, based on Independent Components Analysis, for choosing the set of filters that yield the most informative marginals, meaning that the product over the marginals most closely approximates the joint probability density function of the filter outputs. The algorithm is implemented using a steerable filter space. Experiments involving both texture classification and synthesis show that compared to Principal Components Analysis, ICA provides superior performance for modeling of natural and synthetic textures.

Manduchi, Roberto; Portilla, Javier



Prioritized LT Codes  

NASA Technical Reports Server (NTRS)

The original Luby Transform (LT) coding scheme is extended to account for data transmissions where some information symbols in a message block are more important than others. Prioritized LT codes provide unequal error protection (UEP) of data on an erasure channel by modifying the original LT encoder. The prioritized algorithm improves high-priority data protection without penalizing low-priority data recovery. Moreover, low-latency decoding is also obtained for high-priority data due to fast encoding. Prioritized LT codes only require a slight change in the original encoding algorithm, and no changes at all at the decoder. Hence, with a small complexity increase in the LT encoder, an improved UEP and low-decoding latency performance for high-priority data can be achieved. LT encoding partitions a data stream into fixed-sized message blocks each with a constant number of information symbols. To generate a code symbol from the information symbols in a message, the Robust-Soliton probability distribution is first applied in order to determine the number of information symbols to be used to compute the code symbol. Then, the specific information symbols are chosen uniform randomly from the message block. Finally, the selected information symbols are XORed to form the code symbol. The Prioritized LT code construction includes an additional restriction that code symbols formed by a relatively small number of XORed information symbols select some of these information symbols from the pool of high-priority data. Once high-priority data are fully covered, encoding continues with the conventional LT approach where code symbols are generated by selecting information symbols from the entire message block including all different priorities. Therefore, if code symbols derived from high-priority data experience an unusual high number of erasures, Prioritized LT codes can still reliably recover both high- and low-priority data. This hybrid approach decides not only "how to encode" but also "what to encode" to achieve UEP. Another advantage of the priority encoding process is that the majority of high-priority data can be decoded sooner since only a small number of code symbols are required to reconstruct high-priority data. This approach increases the likelihood that high-priority data is decoded first over low-priority data. The Prioritized LT code scheme achieves an improvement in high-priority data decoding performance as well as overall information recovery without penalizing the decoding of low-priority data, assuming high-priority data is no more than half of a message block. The cost is in the additional complexity required in the encoder. If extra computation resource is available at the transmitter, image, voice, and video transmission quality in terrestrial and space communications can benefit from accurate use of redundancy in protecting data with varying priorities.

Woo, Simon S.; Cheng, Michael K.



Visualizing the evolution of code clones  

Microsoft Academic Search

The knowledge of code clone evolution throughout the history of a software system is essential in comprehending and managing its clones properly and cost-effectively. However, investigating and observing facts in a huge set of text-based data provided by a clone genealogy extractor could be challenging without the support of a visualization tool. In this position paper, we present an idea

Ripon K. Saha; Chanchal K. Roy; Kevin A. Schneider



Coded Modulations for Mobile Satellite Communication Channels  

Microsoft Academic Search

The mobile satellite (MSAT) channel is subject to multipath fading, shadowing, Doppler frequency shift, and adjacent channel interference (ACI). Therefore, transmitted signals face severe amplitude and phase distortions. This dissertation investigates various high performance and low decoding complexity coded modulation schemes for reliable voice and data transmissions over the shadowed mobile satellite channel and the Rayleigh fading channel. The dissertation

Dojun Rhee



Channel coding for satellite mobile channels  

NASA Astrophysics Data System (ADS)

The deployment of channel coding and interleaving to enhance the bit-error performance of a satellite mobile radio channel is addressed for speech and data transmissions. Different convolutional codes (CC) using Viterbi decoding with soft decision are examined with interblock interleaving. Reed-Solomon (RS) codes with Berlekamp-Massey hard decision decoding or soft decision trellis decoding combined with block interleaving are also investigated. A concatenated arrangement employing RS and CC coding as the outer and inner coders, respectively, is used for transmissions via minimum shift keying over Gaussian and Rayleigh fading channels. For an interblock interleaving period of 2880 bits, a concatenated arrangement of an RS(48,36), over the Galois field GF(256) and punctured PCC(3,1,7) yielding an overall coding rate of 1/2, provides a coding gain of 42dB for a BER of 10 to the -6th, and an uncorrectable error detection probability of 1 - 10 to the -9th.

Wong, K. H. H.; Hanzo, L.; Steele, R.



Distributed Joint Source-Channel Coding in Wireless Sensor Networks  

PubMed Central

Considering the fact that sensors are energy-limited and the wireless channel conditions in wireless sensor networks, there is an urgent need for a low-complexity coding method with high compression ratio and noise-resisted features. This paper reviews the progress made in distributed joint source-channel coding which can address this issue. The main existing deployments, from the theory to practice, of distributed joint source-channel coding over the independent channels, the multiple access channels and the broadcast channels are introduced, respectively. To this end, we also present a practical scheme for compressing multiple correlated sources over the independent channels. The simulation results demonstrate the desired efficiency. PMID:22408560

Zhu, Xuqi; Liu, Yu; Zhang, Lin



Distributed joint source-channel coding in wireless sensor networks.  


Considering the fact that sensors are energy-limited and the wireless channel conditions in wireless sensor networks, there is an urgent need for a low-complexity coding method with high compression ratio and noise-resisted features. This paper reviews the progress made in distributed joint source-channel coding which can address this issue. The main existing deployments, from the theory to practice, of distributed joint source-channel coding over the independent channels, the multiple access channels and the broadcast channels are introduced, respectively. To this end, we also present a practical scheme for compressing multiple correlated sources over the independent channels. The simulation results demonstrate the desired efficiency. PMID:22408560

Zhu, Xuqi; Liu, Yu; Zhang, Lin



Update-Efficiency and Local Repairability Limits for Capacity Approaching Codes  

E-print Network

Motivated by distributed storage applications, we investigate the degree to which capacity achieving codes can be efficiently updated when a single information symbol changes, and the degree to which such codes can be ...

Mazumdar, Arya


Application of the MELCOR code to design basis PWR large dry containment analysis  

Microsoft Academic Search

The MELCOR computer code has been developed by Sandia National Laboratories under USNRC sponsorship to provide capability for independently auditing analyses submitted by reactor manufactures and utilities. MELCOR is a fully integrated code (encompassing the reactor coolant system and the containment building) that models the progression of postulated accidents in light water reactor power plants. To assess the adequacy of

Jesse Phillips; Allen Notafrancesco; Jack Lee Tills



Perceptual audio coding using adaptive pre- and post-filters and lossless compression  

Microsoft Academic Search

This paper proposes a versatile perceptual audio coding method that achieves high compression ratios and is capable of low encoding\\/decoding delay. It accommodates a variety of source signals (including both music and speech) with different sampling rates. It is based on separating irrelevance and redundancy reductions into independent functional units. This contrasts traditional audio coding where both are integrated within

Gerald D. T. Schuller; Bin Yu; Dawei Huang; Bernd Edler



Utilizing channel coding information in CIVA-based blind sequence detectors  

Microsoft Academic Search

This paper presents a new approach for blind equalization of communication systems with channel coding. Based on the Channel Independent Viterbi Algorithm (CIVA), the new method performs blind sequence detections applying the information of channel coding. Specifically, the trellis searching can be simplified by considering the linear block channel encoder. The joint CIVA-based detector\\/decoder not only preserves the advantages of

Xiaohua Li



Coding for urologic office procedures.  


This article summarizes current best practices for documenting, coding, and billing common office-based urologic procedures. Topics covered include general principles, basic and advanced urologic coding, creation of medical records that support compliant coding practices, bundled codes and unbundling, global periods, modifiers for procedure codes, when to bill for evaluation and management services during the same visit, coding for supplies, and laboratory and radiology procedures pertinent to urology practice. Detailed information is included for the most common urology office procedures, and suggested resources and references are provided. This information is of value to physicians, office managers, and their coding staff. PMID:24182979

Dowling, Robert A; Painter, Mark



Accumulate Repeat Accumulate Coded Modulation  

NASA Technical Reports Server (NTRS)

In this paper we propose an innovative coded modulation scheme called 'Accumulate Repeat Accumulate Coded Modulation' (ARA coded modulation). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes that are combined with high level modulation. Thus at the decoder belief propagation can be used for iterative decoding of ARA coded modulation on a graph, provided a demapper transforms the received in-phase and quadrature samples to reliability of the bits.

Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung



Community care: the independent sector.  

PubMed Central

The independent sector, which consists of the voluntary and private sectors, is a vital element in supporting older people in the community. The voluntary sector, coordinated by the Council for Voluntary Service and the National Council for Voluntary Organisations, provides a variety of services, including practical help, reassurance and companionship, and advice, information, campaigning, and advocacy. The private sector owns all of the nursing homes and most of the residential homes and is gradually becoming more involved with the provision of services to help support older people in their own homes. With this increase in size and importance of the independent sector over recent years, there is now a real need for greater communication between the private, voluntary, and statutory agencies in any one region. In some areas, forums made up of representatives of these various sectors meet to discuss relevant issues and construct local policies, thus allowing a more coordinated approach to the delivery of services. Images p740-a p742-a PMID:8819449

Barodawala, S.



]Space Shuttle Independent Assessment Team  

NASA Technical Reports Server (NTRS)

The Shuttle program is one of the most complex engineering activities undertaken anywhere in the world at the present time. The Space Shuttle Independent Assessment Team (SIAT) was chartered in September 1999 by NASA to provide an independent review of the Space Shuttle sub-systems and maintenance practices. During the period from October through December 1999, the team led by Dr. McDonald and comprised of NASA, contractor, and DOD experts reviewed NASA practices, Space Shuffle anomalies, as well as civilian and military aerospace experience. In performing the review, much of a very positive nature was observed by the SIAT, not the least of which was the skill and dedication of the workforce. It is in the unfortunate nature of this type of review that the very positive elements are either not mentioned or dwelt upon. This very complex program has undergone a massive change in structure in the last few years with the transition to a slimmed down, contractor-run operation, the Shuttle Flight Operations Contract (SFOC). This has been accomplished with significant cost savings and without a major incident. This report has identified significant problems that must be addressed to maintain an effective program. These problems are described in each of the Issues, Findings or Observations summarized, and unless noted, appear to be systemic in nature and not confined to any one Shuttle sub-system or element. Specifics are given in the body of the report, along with recommendations to improve the present systems.



Dynamic Expression of Long Non-Coding RNAs (lncRNAs) in Adult Zebrafish  

PubMed Central

Long non-coding RNAs (lncRNA) represent an assorted class of transcripts having little or no protein coding capacity and have recently gained importance for their function as regulators of gene expression. Molecular studies on lncRNA have uncovered multifaceted interactions with protein coding genes. It has been suggested that lncRNAs are an additional layer of regulatory switches involved in gene regulation during development and disease. LncRNAs expressing in specific tissues or cell types during adult stages can have potential roles in form, function, maintenance and repair of tissues and organs. We used RNA sequencing followed by computational analysis to identify tissue restricted lncRNA transcript signatures from five different tissues of adult zebrafish. The present study reports 442 predicted lncRNA transcripts from adult zebrafish tissues out of which 419 were novel lncRNA transcripts. Of these, 77 lncRNAs show predominant tissue restricted expression across the five major tissues investigated. Adult zebrafish brain expressed the largest number of tissue restricted lncRNA transcripts followed by cardiovascular tissue. We also validated the tissue restricted expression of a subset of lncRNAs using independent methods. Our data constitute a useful genomic resource towards understanding the expression of lncRNAs in various tissues in adult zebrafish. Our study is thus a starting point and opens a way towards discovering new molecular interactions of gene expression within the specific adult tissues in the context of maintenance of organ form and function. PMID:24391796

KV, Shamsudheen; Lalwani, Mukesh Kumar; Jalali, Saakshi; Patowary, Ashok; Joshi, Adita; Scaria, Vinod; Sivasubbu, Sridhar



Quantum convolutional stabilizer codes  

E-print Network

commute or anticommute. 3. Every element of G is unitary, since X, Y and Z are all unitary. From equation 2.5, we get, hÃijEjÃji = hÃijMEjÃji = ¡hÃijEjÃji = 0 (2.6) Therefore the code satis¯es the following condition, hÃijEyaEbjÃji = 0 (2.7) whenever E... Hamming distance between all pairs of distinguish- able codewords of the corresponding truncated code Qi. qdi ´ minfqd(jÃ0ii;jÃ"i i) j hÃ0ijÃ"i i = 0g wherejÃ0ii;jÃ"i i 2 Qi (3.10) The minimum distance qdmin of an (n;k) quantum convolutional code...

Chinthamani, Neelima



A Comparative Study on Seismic Analysis of Bangladesh National Building Code (BNBC) with Other Building Codes  

NASA Astrophysics Data System (ADS)

Tectonic framework of Bangladesh and adjoining areas indicate that Bangladesh lies well within an active seismic zone. The after effect of earthquake is more severe in an underdeveloped and a densely populated country like ours than any other developed countries. Bangladesh National Building Code (BNBC) was first established in 1993 to provide guidelines for design and construction of new structure subject to earthquake ground motions in order to minimize the risk to life for all structures. A revision of BNBC 1993 is undergoing to make this up to date with other international building codes. This paper aims at the comparison of various provisions of seismic analysis as given in building codes of different countries. This comparison will give an idea regarding where our country stands when it comes to safety against earth quake. Primarily, various seismic parameters in BNBC 2010 (draft) have been studied and compared with that of BNBC 1993. Later, both 1993 and 2010 edition of BNBC codes have been compared graphically with building codes of other countries such as National Building Code of India 2005 (NBC-India 2005), American Society of Civil Engineering 7-05 (ASCE 7-05). The base shear/weight ratios have been plotted against the height of the building. The investigation in this paper reveals that BNBC 1993 has the least base shear among all the codes. Factored Base shear values of BNBC 2010 are found to have increased significantly than that of BNBC 1993 for low rise buildings (?20 m) around the country than its predecessor. Despite revision of the code, BNBC 2010 (draft) still suggests less base shear values when compared to the Indian and American code. Therefore, this increase in factor of safety against the earthquake imposed by the proposed BNBC 2010 code by suggesting higher values of base shear is appreciable.

Bari, Md. S.; Das, T.



Preliminary Results from the Application of Automated Adjoint Code Generation to CFL3D  

NASA Technical Reports Server (NTRS)

This report describes preliminary results obtained using an automated adjoint code generator for Fortran to augment a widely-used computational fluid dynamics flow solver to compute derivatives. These preliminary results with this augmented code suggest that, even in its infancy, the automated adjoint code generator can accurately and efficiently deliver derivatives for use in transonic Euler-based aerodynamic shape optimization problems with hundreds to thousands of independent design variables.

Carle, Alan; Fagan, Mike; Green, Lawrence L.



Code inspection instructional validation  

NASA Technical Reports Server (NTRS)

The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine if this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive), for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. Following are conclusions of this study. (1) The course is instructionally effective. (2) The simulation has a positive effect on student's confidence in his ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option which involves creation of new source code and data files, but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes this option will produce the most effective results.

Orr, Kay; Stancil, Shirley



Code-Mixing as a Bilingual Instructional Strategy  

ERIC Educational Resources Information Center

This study investigated code-mixing practices, specifically the use of L2 (English) in an L1 (Chinese) class in a U.S. bilingual program. Our findings indicate that the code-mixing practices made and prompted by the teacher served five pedagogical functions: (a) to enhance students' bilingualism and bilingual learning, (b) to review and…

Jiang, Yih-Lin Belinda; García, Georgia Earnest; Willis, Arlette Ingram



Improved performance of QCD code on ALiCE  

NASA Astrophysics Data System (ADS)

We present results for the performance of QCD code on ALiCE , the Alpha-Linux Cluster Engine at Wuppertal. We describe the techniques employed to optimise the code, including the metaprogramming of assembler kernels, the effects of data layout and an investigation into the overheads incurred by the communication.

Sroczynski, Z.



Performance Analysis of Optical Code Division Multiplex System  

NASA Astrophysics Data System (ADS)

This paper presents the Pseudo-Orthogonal Code generator for Optical Code Division Multiple Access (OCDMA) system which helps to reduce the need of bandwidth expansion and improve spectral efficiency. In this paper we investigate the performance of multi-user OCDMA system to achieve data rate more than 1 Tbit/s.

Kaur, Sandeep; Bhatia, Kamaljit Singh



Distributed storage codes reduce latency in vehicular networks  

Microsoft Academic Search

We investigate the benefits of distributed storage using erasure codes for file sharing in vehicular networks through realistic trace-based simulations. We find that coding offers substantial benefits over simple replication when the file sizes are large compared to the average download bandwidth available per encounter. Our simulations, based on a large real vehicle trace from Beijing combined with a realistic

Maheswaran Sathiamoorthy; Alexandros G. Dimakis; Bhaskar Krishnamachari; Fan Bai



Aeroacoustic Prediction Codes  

NASA Technical Reports Server (NTRS)

This report describes work performed on Contract NAS3-27720AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise and compliment work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes: (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects. directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition. the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascadeigust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.

Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)



Securing mobile code.  

SciTech Connect

If software is designed so that the software can issue functions that will move that software from one computing platform to another, then the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinions regarding how to secure mobile code. There are those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates by decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes that neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation to render an entire program or a data segment on which a program depends incomprehensible. The hope is to prevent or at least slow down reverse engineering efforts and to prevent goal-oriented attacks on the software and execution. The field of obfuscation is still in a state of development with the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in depth analysis of a technique called 'white-boxing'. We put forth some new attacks and improvements on this method as well as demonstrating its implementation for various algorithms. We also examine cryptographic techniques to achieve obfuscation including encrypted functions and offer a new application to digital signature algorithms. To better understand the lack of security proofs for obfuscation techniques, we examine in detail general theoretical models of obfuscation. We explain the need for formal models in order to obtain provable security and the progress made in this direction thus far. Finally we tackle the problem of verifying remote execution. We introduce some methods of verifying remote exponentiation computations and some insight into generic computation checking.

Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas; Campbell, Philip LaRoche; Beaver, Cheryl Lynn; Pierson, Lyndon George; Anderson, William Erik



The Code Project  

NSDL National Science Digital Library

The Code Project is an online repository of free tutorials, source code, and articles about a wide variety of programming languages. Sections devoted to C++, HTML, DirectX, and .NET are among the resources available on the site. Discussion forums and message boards are excellent places for developers to get quick answers to their questions from other members of the community (this requires a short registration). Featured articles and industry news keep the site up-to-date. There are some advertisements on the site, but they do not detract from the content.


Molecular analysis of five independent Japanese mutant genes responsible for hypoxanthine guanine phosphoribosyltransferase (HPRT) deficiency  

Microsoft Academic Search

Five independent mutations in the hypoxanthine guanine phosphoribosyltransferase (HPRT) gene were identified in a partially HPRT deficient patient with gout and in four Lesch-Nyhan patients. Using the polymerase chain reaction (PCR) technique coupled with direct sequencing, the nucleotide sequences of the entire HPRT coding region amplified from the cDNA and also of each exon amplified form the genomic DNA were

Yasukazu Yamada; Haruko Goto; Kaoru Suzumori; Ritsuko Adachi; Nobuaki Ogasawara



Independent Representation of Parts and the Relations between Them: Evidence from Integrative Agnosia  

ERIC Educational Resources Information Center

Whether objects are represented as a collection of parts whose relations are coded independently remains a topic of ongoing discussion among theorists in the domain of shape perception. S. M., an individual with integrative agnosia, and neurologically intact ("normal") individuals learned initially to identify 4 target objects constructed of 2…

Behrmann, Marlene; Peterson, Mary A.; Moscovitch, Morris; Suzuki, Satoru



Independent Lens Online Shorts Festival  

NSDL National Science Digital Library

Since its creation a few years ago, the Independent Lens series has worked with various filmmakers and producers to create thoughtful portraits. These portraits have included subjects such as the life of Billy Strayhorn, people living with dystonia, and the world of Ethiopian coffee growers. Recently, they also embarked on yet another ambitious project: an online shorts festival. Visitors to this site can partake of all ten of these films at their leisure. Included are a film that explores a Parisian secret from 1951, a meditation of growing old, and an artist who created a monument out of mud, old paint, and adobe. After viewing the films, visitors are also welcome to leave their comments in the "Talkback" section, submit a film or find out more about the members of the jury for this online film festival.


Is landscape connectivity a dependent or independent variable?  

Microsoft Academic Search

With growing interest in landscape connectivity, it is timely to ask what research has been done and what re mains to be done. I surveyed papers investigating landscape connectivity from 1985 to 2000. From these papers, I determined if connectivity had been treated as an independent or dependent variable, what connectivity metrics were used, and if the study took an

Brett J. Goodwin



The Field Dependence-Independence Construct: Some, One or None.  

ERIC Educational Resources Information Center

The relationship between cognitive restructuring and perception of the upright (tests of which may be used to measure field dependence-independence [FDI]) was investigated. Data analysis of 34 tests administered to high school seniors, including 12 measures of FDI, resulted in five dimensions, including two associated with FDI. (Author/AEF)

Linn, Marcia C.; Kyllonen, Patrick



The Field Dependence-Independence Construct: Some, One, or None.  

ERIC Educational Resources Information Center

The field dependency/independency construct (FDI) was measured using tests of perception of the upright such as the Rod and Frame Test (RFT) and tests of cognitive restructuring such as the Hidden Figures Test (HFT); relationships between cognitive restructing and perception of the upright were investigated. High school seniors received 34 tests…

Linn, Marcia C.; Kyllonen, Patrick


Independents, Actives, and Pledges: A Comparison of Academic Achievement.  

ERIC Educational Resources Information Center

This study investigated differences in academic achievement between undergraduate students involved with Greek organizations and undergraduate students independent of Greek organizations. Subjects (N=593) were undergraduate students at Murray State University in Murray, Kentucky in the fall semester of 1990, subdivided by sex, Greek status…

Porta, Andrew Douglas


Independent Scientific Review Panel for the Northwest Power & Conservation Council  

E-print Network

Independent Scientific Review Panel for the Northwest Power & Conservation Council 851 SW 6th in genetics and fish culture. (ISRP Chair) Katherine Myers, Ph.D., Principal Investigator of the High Seas, and Conservation Biology at Colorado State University. Jack Griffith, Ph. D., Consulting Fisheries Scientist


Distance Learning Enrollments in Independent Institutions. Feasibility Study.  

ERIC Educational Resources Information Center

This study investigated the feasibility of collecting enrollment data on distance learning programs sponsored by private institutions within and outside of Washington State. E-commerce developments have allowed in-state independent providers and out-of-state public institutions to serve residents of Washington State, and many nontraditional…

Washington State Higher Education Coordinating Board, Olympia.


Consequences of Charge Independence for Nuclear Reactions Involving Photons  

Microsoft Academic Search

Some effects of the charge independence of nuclear forces on the emission and absorption of photons by light nuclei are investigated. It is found that the selection rules governing the change of isotopic spin T in such transitions are of practical importance in nuclei with Tz=0, particularly the rule that E1 transitions without change of isotopic spin are forbidden. Two

Murray Gell-Mann; Valentine L. Telegdi



The Effect of Respiration Variations on Independent Component Analysis Results  

E-print Network

The Effect of Respiration Variations on Independent Component Analysis Results of Resting State of breathing are typically not removed. These slower respiration-induced signal changes occur at low, particularly in the default mode network. In this study, we investigate the effect of respiration variations

Baker, Chris I.


Field Independence/Field Dependence and Precocious Kindergarten Readers.  

ERIC Educational Resources Information Center

This study investigated the relation between field independence/field dependence (FI/FD) and reading success. One hundred kindergarten children from a predominantly white, middle-class community were administered a Portable Rod and Frame Test as a measure of cognitive style. The upper and lower 27% were identified and designated Field Dependent…

Cox, Diane K.


International Code Assessment and Applications Program: Summary of code assessment studies concerning RELAP5/MOD2, RELAP5/MOD3, and TRAC-B. International Agreement Report  

SciTech Connect

Members of the International Code Assessment Program (ICAP) have assessed the US Nuclear Regulatory Commission (USNRC) advanced thermal-hydraulic codes over the past few years in a concerted effort to identify deficiencies, to define user guidelines, and to determine the state of each code. The results of sixty-two code assessment reviews, conducted at INEL, are summarized. Code deficiencies are discussed and user recommended nodalizations investigated during the course of conducting the assessment studies and reviews are listed. All the work that is summarized was done using the RELAP5/MOD2, RELAP5/MOD3, and TRAC-B codes.

Schultz, R.R. [EG and G Idaho, Inc., Idaho Falls, ID (United States)



Decode de Code  

NSDL National Science Digital Library

In this activity, users must decode a scientific quote that has been encoded by the computer. The computer will generate an "alphabet" (either random or rotated) and then substitute every letter of the real quote with the computer generated alphabet's letter. To decode the code, you must look for patterns of letters and then substitute guesses for the real letters.


Code of Ethics.  

ERIC Educational Resources Information Center

The American Sociological Association's code of ethics for sociologists is presented. For sociological research and practice, 10 requirements for ethical behavior are identified, including: maintaining objectivity and integrity; fully reporting findings and research methods, without omission of significant data; reporting fully all sources of…

American Sociological Association, Washington, DC.


Multiple trellis coded modulation  

NASA Technical Reports Server (NTRS)

A technique for designing trellis codes to minimize bit error performance for a fading channel. The invention provides a criteria which may be used in the design of such codes which is significantly different from that used for average white Gaussian noise channels. The method of multiple trellis coded modulation of the present invention comprises the steps of: (a) coding b bits of input data into s intermediate outputs; (b) grouping said s intermediate outputs into k groups of s.sub.i intermediate outputs each where the summation of all s.sub.i,s is equal to s and k is equal to at least 2; (c) mapping each of said k groups of intermediate outputs into one of a plurality of symbols in accordance with a plurality of modulation schemes, one for each group such that the first group is mapped in accordance with a first modulation scheme and the second group is mapped in accordance with a second modulation scheme; and (d) outputting each of said symbols to provide k output symbols for each b bits of input data.

Simon, Marvin K. (Inventor); Divsalar, Dariush (Inventor)



Bladder Surgery Codes

Bladder C670–C679 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY 10 Local tumor destruction, NOS 11 Photodynamic therapy (PDT) 12


Minimal TCB Code Execution  

Microsoft Academic Search

We propose an architecture that allows code to execute in complete isolation from other software while trusting only a tiny software base that is orders of magnitude smaller than even minimalist virtual machine monitors. Our technique also enables more meaningful attestation than previous proposals, since only measurements of the security-sensitive portions of an application need to be included. We achieve

Jonathan M. Mccune; Bryan Parno; Adrian Perrig; Michael K. Reiter; Arvind Seshadri



Stomach Surgery Codes

Stomach C160–C169 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY 10 Local tumor destruction, NOS 11 Photodynamic therapy (PDT) 12 Electrocautery;


Parotid Surgery Codes

Parotid and Other Unspecified Glands Parotid Gland C079, Major Salivary Glands C080–C089 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY


Code Optimization Techniques  

SciTech Connect

Computers transfer data in a number of different ways. Whether through a serial port, a parallel port, over a modem, over an ethernet cable, or internally from a hard disk to memory, some data will be lost. To compensate for that loss, numerous error detection and correction algorithms have been developed. One of the most common error correction codes is the Reed-Solomon code, which is a special subset of BCH (Bose-Chaudhuri-Hocquenghem) linear cyclic block codes. In the AURA project, an unmanned aircraft sends the data it collects back to earth so it can be analyzed during flight and possible flight modifications made. To counter possible data corruption during transmission, the data is encoded using a multi-block Reed-Solomon implementation with a possibly shortened final block. In order to maximize the amount of data transmitted, it was necessary to reduce the computation time of a Reed-Solomon encoding to three percent of the processor's time. To achieve such a reduction, many code optimization techniques were employed. This paper outlines the steps taken to reduce the processing time of a Reed-Solomon encoding and the insight into modern optimization techniques gained from the experience.




Rectum Surgery Codes

Rectum C209 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Code removal/surgical ablation of single or multiple liver metastases under the data item Surgical Procedure/Other Site (NAACCR Item #1294)


Colon Surgery Codes

Colon C180–C189 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Code removal/surgical ablation of single or multiple liver metastases under the data item Surgical Procedure/Other Site (NAACCR Item


Esophagus Coding Guidelines

Coding Guidelines ESOPHAGUS C150-C155, C158-C159 Primary Site There are two systems that divide the esophagus into three subsites. The first system divides the esophagus into the upper third, middle third, and lower third. The second system describes


George Washington's Secret Code  

NSDL National Science Digital Library

In this online interactive, learners decipher codes used by George Washington to safeguard messages during the American Revolution. Learners use a key to decode an excerpt from "Rules of Civility & Decent Behaviour in Company and Conversation," which Washington copied as a writing exercise when he was a teenager and historians believed influenced the development of his character.

Service, National P.




EPA Science Inventory

The Code of Federal Regulations (CFR) is an annually revised codification of the general and permanent rules published in the Federal Register by the executive departments and agencies of the Federal Government. The CFR is divided into 50 titles which represent broad areas subje...


Colon Surgery Codes

C olon C180–C189 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Code removal/surgical ablation of single or multiple liver metastases under the data item Surgical Procedure/Other Site (NAACCR Item


Embedded foveation image coding  

Microsoft Academic Search

The human visual system {(HVS)} is highly space-variant in sampling, coding, processing, and understanding. The spatial resolution of the {HVS} is highest around the point of fixation (foveation point) and decreases rapidly with increasing eccentricity. By taking advantage of this fact, it is possible to remove considerable high-frequency information redundancy from the peripheral regions and still reconstruct a perceptually good

Zhou Wang; Alan Conrad Bovik



Block-coded communications  

NASA Technical Reports Server (NTRS)

Theory for block-coded telemetry systems is useful in testing performance of one- and two-way, phase-coherent telemetry systems when double-conversion, superheterodyne, phase-locked receiver, preceded by bandpass limiter is used to track carrier.

Lindsey, W. C.



Environmental Fluid Dynamics Code  

EPA Science Inventory

The Environmental Fluid Dynamics Code (EFDC)is a state-of-the-art hydrodynamic model that can be used to simulate aquatic systems in one, two, and three dimensions. It has evolved over the past two decades to become one of the most widely used and technically defensible hydrodyn...


Lung Surgery Codes

Lung C340–C349 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY 19 Local tumor destruction or excision, NOS Unknown whether a specimen was


Kidney Surgery Codes

Kidney, Renal Pelvis, and Ureter Kidney C649, Renal Pelvis C659, Ureter C669 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY 10 Local tumor


Motor coordination uses external spatial coordinates independent of developmental vision.  


The constraints that guide bimanual movement coordination are informative about the processing principles underlying movement planning in humans. For example, symmetry relative to the body midline benefits finger and hand movements independent of hand posture. This symmetry constraint has been interpreted to indicate that movement coordination is guided by a perceptual code. Although it has been assumed implicitly that the perceptual system at the heart of this constraint is vision, this relationship has not been tested. Here, congenitally blind and sighted participants made symmetrical and non-symmetrical (that is, parallel) bimanual tapping and finger oscillation movements. For both groups, symmetrical movements were executed more correctly than parallel movements, independent of anatomical constraints like finger homology and hand posture. For the blind, the reliance on external spatial factors in movement coordination stands in stark contrast to their use of an anatomical reference frame in perceptual processing. Thus, the externally coded symmetry constraint evident in bimanual coordination can develop in the absence of the visual system, suggesting that the visual system is not critical for the establishment of an external-spatial reference frame in movement coordination. PMID:24727423

Heed, Tobias; Röder, Brigitte



Stimulus information contaminates summation tests of independent neural representations of features  

NASA Technical Reports Server (NTRS)

Many models of visual processing assume that visual information is analyzed into separable and independent neural codes, or features. A common psychophysical test of independent features is known as a summation study, which measures performance in a detection, discrimination, or visual search task as the number of proposed features increases. Improvement in human performance with increasing number of available features is typically attributed to the summation, or combination, of information across independent neural coding of the features. In many instances, however, increasing the number of available features also increases the stimulus information in the task, as assessed by an optimal observer that does not include the independent neural codes. In a visual search task with spatial frequency and orientation as the component features, a particular set of stimuli were chosen so that all searches had equivalent stimulus information, regardless of the number of features. In this case, human performance did not improve with increasing number of features, implying that the improvement observed with additional features may be due to stimulus information and not the combination across independent features.

Shimozaki, Steven S.; Eckstein, Miguel P.; Abbey, Craig K.



29 CFR 794.116 - “Independently * * * controlled.”  

Code of Federal Regulations, 2010 CFR

... false âIndependently * * * controlled.â 794.116 Section 794.116 Labor Regulations Relating to Labor (Continued...âindependently Owned and Controlled Local Enterpriseâ § 794.116 “Independently * * * controlled.”...



Department of Biological Sciences UNDERGRADUATE INDEPENDENT RESEARCH  

E-print Network

Department of Biological Sciences UNDERGRADUATE INDEPENDENT RESEARCH Student Plan of Study Biol of Study Page | 1 #12;RESEARCH PROPOSAL (Attach separate paper if needed) PROPOSAL TITLE: PROPOSAL TO RESEARCH AREA: PREVIOUS RESEARCH EXPERIENCE: INTERNSHIPS: OTHER: Undergraduate Independent Research Plan

Harms, Kyle E.


Voltage-independent KCNQ4 currents induced by (+/-)BMS-204352.  


The compound BMS-204352 has been targeted for use against acute ischemic stroke, due to its activation of the large-conductance Ca2+-activated K-channel (BK). We have previously described that the racemate (+/-)BMS-204352 reversibly modulates KCNQ4 voltage dependency. Here we show that (+/-)BMS-204352 also induces a voltage-independent KCNQ4 current. The channels were stably expressed in human embryonic kidney cells (HEK293), and investigated by use of the whole-cell mode of the patch-clamp technique. (+/-)BMS-204352 was applied extracellularly (10 microM) in order to precipitate the robust appearance of the voltage-independent current. The voltage-independent KCNQ4 currents were recorded as instantaneous increases in currents upon hyperpolarizing or depolarizing voltage steps elicited from holding potentials of -90 or -110 mV. The voltage-independent current reversed at the equilibrium potential for potassium ( E(K)), hence was carried by a K+ conductance, and was blocked by the selective KCNQ channel blockers XE991 and linopirdine. Similar results were obtained with KCNQ4 channels transiently transfected into Chinese hamster ovary cells (CHO). When (+/-)BMS-204352 was applied to stably expressed BK channels, only the voltage dependency was modulated. Retigabine, the classic activator of KCNQ channels, did not induce voltage-independent currents. Our data indicate that KCNQ4 channels may conduct voltage-dependent and voltage-independent currents in the presence of (+/-)BMS-204352. PMID:12851819

Schrøder, Rikke Louise; Strøbaek, Dorte; Olesen, Søren-Peter; Christophersen, Palle




E-print Network

.D., Associate Professional Scientist at the Illinois Natural History Survey. Linda Hardesty, Ph.D., Associate and Adjunct Professor of Fisheries Biology, Humboldt State University, California. Katherine Myers, Ph.D., Principal Investigator of the High Seas Salmon Research Program at the School of Aquatic and Fishery


nMHDust: A 4-Fluid Partially Ionized Dusty Plasma Code  

NASA Astrophysics Data System (ADS)

nMHDust is a next generation 4-fluid partially ionized magnetized dusty plasma code, treating the inertial dynamics of dust, ion and neutral components. Coded in ANSI C, the numerical method is based on the MHDust 3-fluid fully ionized dusty plasma code. This code expands the features of the MHDust code to include ionization/recombination effects and the netCDF data format. Tests of this code include: ionization instabilities, wave mode propagation (electromagnetic and acoustic), shear-flow instabilities, and magnetic reconnection. Relevant parameters for the space environment are considered, allowing a comparison to be made with previous dusty plasma codes (MHDust and DENISIS). The utility of the code is expanded through the possibility of a small dust mass. This allows nMHDust to be used as a 2-ion plasma code. nMHDust completes the array of fluid dusty plasma codes available for numerical investigations into nonlinear phenomena in the field of astrophysical dusty plasmas.

Lazerson, Samuel



Obituary: Arthur Dodd Code (1923-2009)  

NASA Astrophysics Data System (ADS)

Former AAS president Arthur Dodd Code, age 85, passed away at Meriter Hospital in Madison, Wisconsin on 11 March 2009, from complications involving a long-standing pulmonary condition. Code was born in Brooklyn, New York on 13 August 1923, as the only child of former Canadian businessman Lorne Arthur Code and Jesse (Dodd) Code. An experienced ham radio operator, he entered the University of Chicago in 1940, but then enlisted in the U.S. Navy (1943-45) and was later stationed as an instructor at the Naval Research Laboratory, Washington, D.C. During the war, he gained extensive practical experience with the design and construction of technical equipment that served him well in years ahead. Concurrently, he took physics courses at George Washington University (some under the tutelage of George Gamow). In 1945, he was admitted to the graduate school of the University of Chicago, without having received his formal bachelor's degree. In 1950, he was awarded his Ph.D. for a theoretical study of radiative transfer in O- and B-type stars, directed by Subrahmanyan Chandrasekhar. hired onto the faculty of the Department of Astronomy at the University of Wisconsin-Madison (1951-56). He then accepted a tenured appointment at the California Institute of Technology and the Mount Wilson and Palomar Observatories (1956-58). But following the launch of Sputnik, Code returned to Wisconsin in 1958 as full professor of astronomy, director of the Washburn Observatory, and department chairman so that he could more readily pursue his interest in space astronomy. That same year, he was chosen a member of the Space Science Board of the National Academy of Sciences (created during the International Geophysical Year) and shortly became one of five principal investigators of the original NASA Space Science Working Group. In a cogent 1960 essay, Code argued that astrophysical investigations, when conducted from beyond the Earth's atmosphere, "cannot fail to have a tremendous impact on the future course of stellar astronomy," a prediction strongly borne out in the decades that followed. In 1959, Code founded the Space Astronomy Laboratory (SAL) within the UW Department of Astronomy. Early photometric and spectrographic equipment was test-flown aboard NASA's X-15 rocket plane and Aerobee sounding rockets. Along with other SAL personnel, including Theodore E. Houck, Robert C. Bless, and John F. McNall, Code (as principal investigator) was responsible for the design of the Wisconsin Experiment Package (WEP) as one of two suites of instruments to be flown aboard the Orbiting Astronomical Observatory (OAO), which represented a milestone in the advent of space astronomy. With its seven reflecting telescopes feeding five filter photometers and two scanning spectrometers, WEP permitted the first extended observations in the UV portion of the spectrum. After the complete failure of the OAO-1 spacecraft (launched in 1966), OAO-2 was successfully launched on 7 December 1968 and gathered data on over a thousand celestial objects during the next 50 months, including stars, nebulae, galaxies, planets, and comets. These results appeared in a series of more than 40 research papers, chiefly in the Ap.J., along with the 1972 monograph, The Scientific Results from the Orbiting Astronomical Observatory (OAO-2), edited by Code. Between the OAO launches, other SAL colleagues of Code developed the Wisconsin Automatic Photoelectric Telescope (or APT), the first computer-controlled (or "robotic") telescope. Driven by a PDP-8 mini-computer, it routinely collected atmospheric extinction data. Code was also chosen principal investigator for the Wisconsin Ultraviolet Photo-Polarimeter Experiment (or WUPPE). This used a UV-sensitive polarimeter designed by Kenneth Nordsieck that was flown twice aboard the space shuttles in 1990 and 1995. Among other findings, WUPPE observations demonstrated that interstellar dust does not appreciably change the direction of polarization of starlight, thereby supporting its possible composition as graphite. Code was the recipie

Marché, Jordan D., II



Face representation using independent component analysis  

Microsoft Academic Search

This paper addresses the problem of face recognition using independent component analysis (ICA). More specifically, we are going to address two issues on face representation using ICA. First, as the independent components (ICs) are independent but not orthogonal, images outside a training set cannot be projected into these basis functions directly. In this paper, we propose a least-squares solution method

Pong Chi Yuen; Jian-huang Lai



Independent Evaluation: Insights from Public Accounting  

ERIC Educational Resources Information Center

Background: Maintaining the independence of contract government program evaluation presents significant contracting challenges. The ideal outcome for an agency is often both the impression of an independent evaluation "and" a glowing report. In this, independent evaluation is like financial statement audits: firm management wants both a public…

Brown, Abigail B.; Klerman, Jacob Alex



SCDAP/RELAP5 independent peer review  

SciTech Connect

The SCDAP/RELAP5 code has been developed for best-estimate transient simulation of light-water-reactor coolant systems during severe accidents. The newest version of the code is SCDAP/RELAP5/MOD3. The US Nuclear Regulatory Commission (NRC) decided that there was a need for a broad technical review of the code by recognized experts to determine overall technical adequacy, even though the code is still under development. For this purpose, an eight-member SCDAP/RELAP5 Peer Review Committee was organized, and the outcome of the review should help the NRC prioritize future code-development activity. Because the code is designed to be mechanistic, the Committee used a higher standard for technical adequacy than was employed in the peer review of the parametric MELCOR code. The Committee completed its review of the SCDAP/RELAP5 code, and the findings are documented in this report. Based on these findings, recommendations in five areas are provided: (1) phenomenological models, (2) code-design objectives, (3) code-targeted applications, (4) other findings, and (5) additional recommendations.

Corradini, M.L. [Wisconsin Univ., Madison, WI (United States). Dept. of Nuclear Engineering; Dhir, V.K. [Dhir, (V.K.) Santa Monica, CA (United States); Haste, T.J. [AEA Technology, Winfrith (United Kingdom); Heames, T.J. [Science Applications, Inc., Albuquerque, NM (United States); Jenks, R.P. [Los Alamos National Lab., NM (United States); Kelly, J.E. [Sandia National Labs., Albuquerque, NM (United States); Khatib-Rahbar, M. [Energy Research, Inc., Rockville, MD (United States); Viskanta, R. [Purdue Univ., Lafayette, IN (United States). Heat Transfer Lab.



Preliminary Assessment of Turbomachinery Codes  

NASA Technical Reports Server (NTRS)

This report assesses different CFD codes developed and currently being used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior. This report will consider the following codes: APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code will be described separately in the following section with their current modeling capabilities, level of validation, pre/post processing, and future development and validation requirements. This report addresses only previously published and validations of the codes. However, the codes have been further developed to extend the capabilities of the codes.

Mazumder, Quamrul H.



Quantified PIRT and Uncertainty Quantification for Computer Code Validation  

NASA Astrophysics Data System (ADS)

This study is intended to investigate and propose a systematic method for uncertainty quantification for the computer code validation application. Uncertainty quantification has gained more and more attentions in recent years. U.S. Nuclear Regulatory Commission (NRC) requires the use of realistic best estimate (BE) computer code to follow the rigorous Code Scaling, Application and Uncertainty (CSAU) methodology. In CSAU, the Phenomena Identification and Ranking Table (PIRT) was developed to identify important code uncertainty contributors. To support and examine the traditional PIRT with quantified judgments, this study proposes a novel approach, the Quantified PIRT (QPIRT), to identify important code models and parameters for uncertainty quantification. Dimensionless analysis to code field equations to generate dimensionless groups (pi groups) using code simulation results serves as the foundation for QPIRT. Uncertainty quantification using DAKOTA code is proposed in this study based on the sampling approach. Nonparametric statistical theory identifies the fixed number of code run to assure the 95 percent probability and 95 percent confidence in the code uncertainty intervals.

Luo, Hu


Code Patterns for Automatically Validating Requirements-to-Code Traces  

E-print Network

Code Patterns for Automatically Validating Requirements-to-Code Traces Achraf Ghabi Johannes Kepler University 4040 Linz, Austria Alexander Egyed Johannes Kepler University 4040 Linz

Egyed, Alexander


Mean-based neural coding of voices.  


The social significance of recognizing the person who talks to us is obvious, but the neural mechanisms that mediate talker identification are unclear. Regions along the bilateral superior temporal sulcus (STS) and the inferior frontal cortex (IFC) of the human brain are selective for voices, and they are sensitive to rapid voice changes. Although it has been proposed that voice recognition is supported by prototype-centered voice representations, the involvement of these category-selective cortical regions in the neural coding of such "mean voices" has not previously been demonstrated. Using fMRI in combination with a voice identity learning paradigm, we show that voice-selective regions are involved in the mean-based coding of voice identities. Voice typicality is encoded on a supra-individual level in the right STS along a stimulus-dependent, identity-independent (i.e., voice-acoustic) dimension, and on an intra-individual level in the right IFC along a stimulus-independent, identity-dependent (i.e., voice identity) dimension. Voice recognition therefore entails at least two anatomically separable stages, each characterized by neural mechanisms that reference the central tendencies of voice categories. PMID:23664949

Andics, Attila; McQueen, James M; Petersson, Karl Magnus



Speeding Up Cosmological Boltzmann Codes  

E-print Network

We introduce a novel strategy for cosmological Boltzmann codes leading to an increase in speed by a factor of \\sim 30 for small scale Fourier modes. We (re-)investigate the tight coupling approximation and obtain analytic formulae reaching up to the octupoles of photon intensity and polarization. This leads to accurate results reaching optimal precision, while still being simple. Damping rapid oscillations of small scale modes at later times, we simplify the integration of cosmological perturbations. We obtain analytic expressions for the photon density contrast and velocity as well as an estimate of the quadrupole from after last scattering until today. These analytic formulae hold well during re-ionization and are in fact negligible for realistic cosmological scenarios. However, they do extend the validity of our approach to models with very large optical depth to the last scattering surface.

Michael Doran



Noiseless coding for the magnetometer  

NASA Technical Reports Server (NTRS)

Future unmanned space missions will continue to seek a full understanding of magnetic fields throughout the solar system. Severely constrained data rates during certain portions of these missions could limit the possible science return. This publication investigates the application of universal noiseless coding techniques to more efficiently represent magnetometer data without any loss in data integrity. Performance results indicated that compression factors of 2:1 to 6:1 can be expected. Feasibility for general deep space application was demonstrated by implementing a microprocessor breadboard coder/decoder using the Intel 8086 processor. The Comet Rendezvous Asteroid Flyby mission will incorporate these techniques in a buffer feedback, rate-controlled configuration. The characteristics of this system are discussed.

Rice, Robert F.; Lee, Jun-Ji



Two-dimensional aperture coding for magnetic sector mass spectrometry.  


In mass spectrometer design, there has been a historic belief that there exists a fundamental trade-off between instrument size, throughput, and resolution. When miniaturizing a traditional system, performance loss in either resolution or throughput would be expected. However, in optical spectroscopy, both one-dimensional (1D) and two-dimensional (2D) aperture coding have been used for many years to break a similar trade-off. To provide a viable path to miniaturization for harsh environment field applications, we are investigating similar concepts in sector mass spectrometry. Recently, we demonstrated the viability of 1D aperture coding and here we provide a first investigation of 2D coding. In coded optical spectroscopy, 2D coding is preferred because of increased measurement diversity for improved conditioning and robustness of the result. To investigate its viability in mass spectrometry, analytes of argon, acetone, and ethanol were detected using a custom 90-degree magnetic sector mass spectrometer incorporating 2D coded apertures. We developed a mathematical forward model and reconstruction algorithm to successfully reconstruct the mass spectra from the 2D spatially coded ion positions. This 2D coding enabled a 3.5× throughput increase with minimal decrease in resolution. Several challenges were overcome in the mass spectrometer design to enable this coding, including the need for large uniform ion flux, a wide gap magnetic sector that maintains field uniformity, and a high resolution 2D detection system for ion imaging. Furthermore, micro-fabricated 2D coded apertures incorporating support structures were developed to provide a viable design that allowed ion transmission through the open elements of the code. PMID:25510933

Russell, Zachary E; Chen, Evan X; Amsden, Jason J; Wolter, Scott D; Danell, Ryan M; Parker, Charles B; Stoner, Brian R; Gehm, Michael E; Brady, David J; Glass, Jeffrey T



Statistics Investigations  

NSDL National Science Digital Library

This webpage contains statistics investigations in the form of word problems. The investigations are located on the left hand side of the page on the navigation bar: the links are "Recommended Investigations" and "Additional Investigations". Within each investigation there are additional links to external resources that can be used to solve or illustrate the problem.



Coded Acquisition of High Frame Rate Video  

NASA Astrophysics Data System (ADS)

High frame video (HFV) is an important investigational tool in sciences, engineering and military. In ultra-high speed imaging, the obtainable temporal, spatial and spectral resolutions are limited by the sustainable throughput of in-camera mass memory, the lower bound of exposure time, and illumination conditions. In order to break these bottlenecks, we propose a new coded video acquisition framework that employs K > 2 conventional cameras, each of which makes random measurements of the 3D video signal in both temporal and spatial domains. For each of the K cameras, this multi-camera strategy greatly relaxes the stringent requirements in memory speed, shutter speed, and illumination strength. The recovery of HFV from these random measurements is posed and solved as a large scale l1 minimization problem by exploiting joint temporal and spatial sparsities of the 3D signal. Three coded video acquisition techniques of varied trade offs between performance and hardware complexity are developed: frame-wise coded acquisition, pixel-wise coded acquisition, and column-row-wise coded acquisition. The performances of these techniques are analyzed in relation to the sparsity of the underlying video signal. Simulations of these new HFV capture techniques are carried out and experimental results are reported.

Pournaghi, Reza; Wu, Xiaolin



Coding and Payment Changes for Medicare Drug Administration Codes

CODING AND PAYM ENT CHAN GES FOR MEDICARE DRUG ADMINIST RATION CODES (Payment amounts reflect national averages fo r office-bas e d (non-facility ) setting s ) *See notes below for more informa t ion 2 005 Medicare Co des 2005 Code De scription


New optimal asymmetric quantum codes from constacyclic codes  

NASA Astrophysics Data System (ADS)

In this paper, we construct two classes of asymmetric quantum codes by using constacyclic codes. The first class is the asymmetric quantum codes with parameters [[q2 + 1, q2 + 1 - 2(t + k + 1), (2k + 2)/(2t + 2)

Zhang, Guanghui; Chen, Bocong; Li, Liangchen



Code division multiplexing lightwave networks based upon optical code conversion  

Microsoft Academic Search

Lightwave networks realized through code division multiple access techniques are extensively studied to determine their ultimate capabilities. Here, these concepts are extended to network implementation by introducing an optical code division multiplexing (OCDM) multihop strategy using optical coding. It is shown that this approach is effective in scaling up existing wavelength division multiplexing (WDM) networks without a significant drain of

Ken-ichi Kitayama



Preprototype independent air revitalization subsystem  

NASA Technical Reports Server (NTRS)

The performance and maturity of a preprototype, three-person capacity, automatically controlled and monitored, self-contained independent air revitalization subsystem were evaluated. The subsystem maintains the cabin partial pressure of oxygen at 22 kPa (3.2 psia) and that of carbon dioxide at 400 Pa (3 mm Hg) over a wide range of cabin air relative humidity conditions. Consumption of water vapor by the water vapor electrolysis module also provides partial humidity control of the cabin environment. During operation, the average carbon dioxide removal efficiency at baseline conditions remained constant throughout the test at 84%. The average electrochemical depolarized concentrator cell voltage at the end of the parametric/endurance test was 0.41 V, representing a very slowly decreasing average cell voltage. The average water vapor electrolysis cell voltage increased only at a rate of 20 mu/h from the initial level of 1.67 V to the final level of 1.69 V at conclusion of the testing.

Schubert, F. H.; Hallick, T. M.; Woods, R. R.



Monte Carlo code comparison of dose delivery prediction for microbeam radiation therapy  

NASA Astrophysics Data System (ADS)

Preclinical Microbeam Radiation Therapy (MRT) research programs are carried out at the European Synchrotron Radiation Facility (ESRF) and at a few other synchrotron facilities. MRT needs an accurate evaluation of the doses delivered to biological tissues for carrying out pre-clinical studies. This point is crucial for determining the effect induced by changing any of the physical irradiation parameters. The doses of interest in MRT are normally calculated using Monte Carlo (MC) methods. A few MC packages have been used in the last decade for MRT dose evaluations in independent studies. The aim of this investigation is to provide a preliminary basis to perform a systematic comparison of the dose results obtained, under identical irradiation conditions and for the same scoring geometries with the following five MC codes: EGS4, PENELOPE, GEANT4, EGSnrc, and MCNPX. Dose profiles have been calculated in an in-depth region of cylindrical phantoms made of water or PMMA. Beams in both cylindrical and planar geometry have been considered. This comparison shows an overall agreement among the different codes although minor differences occur, which need further investigations.

DeFelici, M.; Siegbahn, E. A.; Spiga, J.; Hanson, A. L.; Felici, R.; Ferrero, C.; Tartari, A.; Gambaccini, M.; Keyriläinen, J.; Bräuer-Krisch, E.; Randaccio, P.; Bravin, A.



Confocal coded aperture imaging  


A method for imaging a target volume comprises the steps of: radiating a small bandwidth of energy toward the target volume; focusing the small bandwidth of energy into a beam; moving the target volume through a plurality of positions within the focused beam; collecting a beam of energy scattered from the target volume with a non-diffractive confocal coded aperture; generating a shadow image of said aperture from every point source of radiation in the target volume; and, reconstructing the shadow image into a 3-dimensional image of the every point source by mathematically correlating the shadow image with a digital or analog version of the coded aperture. The method can comprise the step of collecting the beam of energy scattered from the target volume with a Fresnel zone plate.

Tobin, Jr., Kenneth William (Harriman, TN); Thomas, Jr., Clarence E. (Knoxville, TN)



Watershed Investigations  

ERIC Educational Resources Information Center

Investigating local watersheds presents middle school students with authentic opportunities to engage in inquiry and address questions about their immediate environment. Investigation activities promote learning in an investigations interdisciplinary context as students explore relationships among chemical, biological, physical, geological, and…

Bodzin, Alec; Shive, Louise



Investigator Resources

A Handbook for Clinical Investigators Conducting Therapeutic Clinical Trials Supported by CTEP, DCTD, NCI. The 2014 version 1.2 of the Investigator’s Handbook is a bookmarked PDF file available for download.


Investigator Resources

Skip to Content Home | Investigator Resources | Protocol Development | Initiatives/Programs/Collaborations | Links to More Resources | Funding Opportunities | About CTEP Home | Sitemap | Contact CTEP Search this site Investigator Resources Investigator


Network coding for anonymous broadcast  

E-print Network

This thesis explores the use of network coding for anonymous broadcast. Network coding, the technique of transmitting or storing mixtures of messages rather than individual messages, can provide anonymity with its mixing ...

Sergeev, Ivan A



Software Cartography and Code Navigation  

E-print Network

Software Cartography and Code Navigation Inauguraldissertation der Philosophisch. Nierstrasz Institut f¨ur Informatik und angewandte Mathematik #12;#12;Software Cartography and Code Cartography, an approach to create spatial on- screen visualization of software systems based on non

Nierstrasz, Oscar


Arithmetic coding for data compression  

Microsoft Academic Search

The state of the art in data compression is arithmetic coding, not the better-known Huffman method. Arithmetic coding gives greater compression, is faster for adaptive models, and clearly separates the model from the channel encoding.

Ian H. witten; Radford M. Neal; John Gerald Cleary



Status of MARS Code  

SciTech Connect

Status and recent developments of the MARS 14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator and detector components in the energy range from a fraction of an electronvolt up to 100 TeV are described. these include physics models both in strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, histograming. MAD-MARS Beam Line Build and Graphical-User Interface.

N.V. Mokhov



Color-coded Continents  

NSDL National Science Digital Library

Paleogeographic reconstructions for time periods within 620 million years to present are featured on this site. These global paleogeographic maps are viewed by scrolling down the page and are arranged in order of increasing age beginning with the present. Landmasses are color-coded to illustrate the movement of plates through time. The site also discusses how the maps are constructed and what lines of evidence are most commonly used, and includes several links to additional information.

Scotese, C.; Survey, United S.


Reeds computer code  

NASA Technical Reports Server (NTRS)

The REEDS (rocket exhaust effluent diffusion single layer) computer code is used for the estimation of certain rocket exhaust effluent concentrations and dosages and their distributions near the Earth's surface following a rocket launch event. Output from REEDS is used in producing near real time air quality and environmental assessments of the effects of certain potentially harmful effluents, namely HCl, Al2O3, CO, and NO.

Bjork, C.



TYCHO: Stellar evolution code  

NASA Astrophysics Data System (ADS)

TYCHO is a general, one dimensional (spherically symmetric) stellar evolution code written in structured Fortran 77; it is designed for hydrostatic and hydrodynamic stages including mass loss, accretion, pulsations and explosions. Mixing and convection algorithms are based on 3D time-dependent simulations. It offers extensive on-line graphics using Tim Pearson's PGPLOT with X-windows and runs effectively on Linux and Mac OS X laptop and desktop computers.

Arnett, D.



VAC: Versatile Advection Code  

NASA Astrophysics Data System (ADS)

The Versatile Advection Code (VAC) is a freely available general hydrodynamic and magnetohydrodynamic simulation software that works in 1, 2 or 3 dimensions on Cartesian and logically Cartesian grids. VAC runs on any Unix/Linux system with a Fortran 90 (or 77) compiler and Perl interpreter. VAC can run on parallel machines using either the Message Passing Interface (MPI) library or a High Performance Fortran (HPF) compiler.

Tóth, Gábor; Keppens, Rony



Independent Orbiter Assessment (IOA): Analysis of the instrumentation subsystem  

NASA Technical Reports Server (NTRS)

The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The independent analysis results for the Instrumentation Subsystem are documented. The Instrumentation Subsystem (SS) consists of transducers, signal conditioning equipment, pulse code modulation (PCM) encoding equipment, tape recorders, frequency division multiplexers, and timing equipment. For this analysis, the SS is broken into two major groupings: Operational Instrumentation (OI) equipment and Modular Auxiliary Data System (MADS) equipment. The OI equipment is required to acquire, condition, scale, digitize, interleave/multiplex, format, and distribute operational Orbiter and payload data and voice for display, recording, telemetry, and checkout. It also must provide accurate timing for time critical functions for crew and payload specialist use. The MADS provides additional instrumentation to measure and record selected pressure, temperature, strain, vibration, and event data for post-flight playback and analysis. MADS data is used to assess vehicle responses to the flight environment and to permit correlation of such data from flight to flight. The IOA analysis utilized available SS hardware drawings and schematics for identifying hardware assemblies and components and their interfaces. Criticality for each item was assigned on the basis of the worst-case effect of the failure modes identified.

Howard, B. S.



The "independent components" of natural scenes are edge filters.  


It has previously been suggested that neurons with line and edge selectivities found in primary visual cortex of cats and monkeys form a sparse, distributed representation of natural scenes, and it has been reasoned that such responses should emerge from an unsupervised learning algorithm that attempts to find a factorial code of independent visual features. We show here that a new unsupervised learning algorithm based on information maximization, a nonlinear "infomax" network, when applied to an ensemble of natural scenes produces sets of visual filters that are localized and oriented. Some of these filters are Gabor-like and resemble those produced by the sparseness-maximization network. In addition, the outputs of these filters are as independent as possible, since this infomax network performs Independent Components Analysis or ICA, for sparse (super-gaussian) component distributions. We compare the resulting ICA filters and their associated basis functions, with other decorrelating filters produced by Principal Components Analysis (PCA) and zero-phase whitening filters (ZCA). The ICA filters have more sparsely distributed (kurtotic) outputs on natural scenes. They also resemble the receptive fields of simple cells in visual cortex, which suggests that these neurons form a natural, information-theoretic coordinate system for natural images. PMID:9425547

Bell, A J; Sejnowski, T J



Video coding: Death is not near  

Microsoft Academic Search

Efficient compression and network compatibility are two challenging tasks for video coding. This paper provides an overview of advanced techniques for the next generation video coding including high efficiency video coding, content-based video coding, distributed video coding, scalable video coding and multiple description video coding. Keywords—video compression, video transmission. I. INTRODUCTION With the popularity of digital video applications, efficiently representing

Meilin Yang; Ye He; Fengqing Zhu; Marc Bosch; Mary Comer; Edward J. Delp



MELCOR computer code manuals  

SciTech Connect

MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR`s phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L. [Sandia National Labs., Albuquerque, NM (United States); Hodge, S.A.; Hyman, C.R.; Sanders, R.L. [Oak Ridge National Lab., TN (United States)



Vision-based reading system for color-coded bar codes  

NASA Astrophysics Data System (ADS)

Barcode systems are used to mark commodities, articles and products with price and article numbers. The advantage of the barcode systems is the safe and rapid availability of the information about the product. The size of the barcode depends on the used barcode system and the resolution of the barcode scanner. Nevertheless, there is a strong correlation between the information content and the length of the barcode. To increase the information content, new 2D-barcode systems like CodaBlock or PDF-417 are introduced. In this paper we present a different way to increase the information content of a barcode and we would like to introduce the color coded barcode. The new color coded barcode is created by offset printing of the three colored barcodes, each barcode with different information. Therefore, three times more information content can be accommodated in the area of a black printed barcode. This kind of color coding is usable in case of the standard 1D- and 2D-barcodes. We developed two reading devices for the color coded barcodes. First, there is a vision based system, consisting of a standard color camera and a PC-based color frame grabber. Omnidirectional barcode decoding is possible with this reading device. Second, a bi-directional handscanner was developed. Both systems use a color separation process to separate the color image of the barcodes into three independent grayscale images. In the case of the handscanner the image consists of one line only. After the color separation the three grayscale barcodes can be decoded with standard image processing methods. In principle, the color coded barcode can be used everywhere instead of the standard barcode. Typical applications with the color coded barcodes are found in the medicine technique, stock running and identification of electronic modules.

Schubert, Erhard; Schroeder, Axel



Time Dependence in Plasma Codes  

E-print Network

Time-dependent plasma codes are a natural extension of static nonequilibrium plasma codes. Comparing relevant timescales will determine whether or not time-dependent treatment is necessary. In this article I outline the ingredients for a time-dependent plasma code in a homogeneous medium and discuss the computational method. In the second half of the article I describe recombination in the early Universe as a detailed example of a problem whose solution requires a time-dependent plasma code.

S. Seager



ALEGRA -- code validation: Experiments and simulations  

SciTech Connect

In this study, the authors are providing an experimental test bed for validating features of the ALEGRA code over a broad range of strain rates with overlapping diagnostics that encompass the multiple responses. A unique feature of the Arbitrary Lagrangian Eulerian Grid for Research Applications (ALEGRA) code is that it allows simultaneous computational treatment, within one code, of a wide range of strain-rates varying from hydrodynamic to structural conditions. This range encompasses strain rates characteristic of shock-wave propagation (10{sup 7}/s) and those characteristic of structural response (10{sup 2}/s). Most previous code validation experimental studies, however, have been restricted to simulating or investigating a single strain-rate regime. What is new and different in this investigation is that the authors have performed well-instrumented experiments which capture features relevant to both hydrodynamic and structural response in a single experiment. Aluminum was chosen for use in this study because it is a well characterized material--its EOS and constitutive material properties are well defined over a wide range of loading rates. The current experiments span strain rate regimes of over 10{sup 7}/s to less than 10{sup 2}/s in a single experiment. The input conditions are extremely well defined. Velocity interferometers are used to record the high strain-rate response, while low strain rate data were collected using strain gauges.

Chhabildas, L.C.; Konrad, C.H.; Mosher, D.A.; Reinhart, W.D; Duggins, B.D.; Rodeman, R.; Trucano, T.G.; Summers, R.M.; Peery, J.S.



IEEE TRANSACTIONS ON IMAGE PROCESSING, VOL. 16, NO. 11, NOVEMBER 2007 2743 Postprocessing of Low Bit-Rate Block DCT Coded  

E-print Network

Bit-Rate Block DCT Coded Images Based on a Fields of Experts Prior Deqing Sun, Student Member, IEEE (DCT) has been widely used in image and video coding standards, but at low bit rates, the coded images setting. The noise model assumes that the DCT coefficients and their quantization errors are independent

Cham, Wai-kuen


On Quaternary MacDonald Codes  

E-print Network

This paper studies two families of codes over Z 4 , MacDonald codes of type # and type #. The torsion code, weight distribution, and Gray image properties are studied. Some interesting optimal binary codes are also obtained. Two nonlinear families of binary codes are obtained via the Gray map. Keywords: Codes over rings, Gray image, simplex codes, torsion code, weight distribution, MacDonald codes, projective multiset. 1

Charles Colbourn; Manish K. Gupta


Ethical Codes in the Professions.  

ERIC Educational Resources Information Center

Whether the measurement profession should consider developing and adopting a code of professional conduct is explored after a brief review of existing references to standards of conduct and a review of other professional codes. Issues include the need for a code of ethics, its usefulness, and its enforcement. (SLD)

Schmeiser, Cynthia B.



Ptolemy Coding Style Christopher Brooks  

E-print Network

Ptolemy Coding Style Christopher Brooks Edward A. Lee Electrical Engineering and Computer Sciences;Ptolemy Coding Style Christopher Brooks1 , Edward A. Lee1 , 1 {cxh,eal}, 1 University to the academic community. This document describes the coding style used in Ptolemy II, a package with 550K lines


Universal space-time coding  

Microsoft Academic Search

A universal framework is developed for constructing full-rate and full-diversity coherent space-time codes for systems with arbitrary numbers of transmit and receive antennas. The pro- posed framework combines space-time layering concepts with al- gebraic component codes optimized for single-input-single-output (SISO) channels. Each component code is assigned to a \\

Hesham El Gamal; Mohamed Oussama Damen



Coding and Cryptography Chris Wuthrich  

E-print Network

£ ¢ ¡ Coding and Cryptography Chris Wuthrich £ ¢ ¡ #12;Contents Information Codes . . . . . . . . . . . . . . . . . . . . 37 II Cryptography 39 II.1 Modular Arithmetic Problem sheets 63 Bibliography 71 2 #12;Coding and Cryptography G13CCR cw '13 £ ¢ ¡ Essential

Wuthrich, Christian


(significance maps) (distributed source coding;  

E-print Network

(distributed video coding; DVC) )(XH )(YH XY (entropy) ),( YXH XY Shannon (entropy coding theorem) XY] (wavelet transform) SPIHT[13] Wyner-Ziv frame turbo code SPIHT [12] [11][12] side information Turbo Encoder Encoder Buffer Decoder Buffer Turbo Decoder CRC Decoder Reconstruct Wavelet coefficient

Yang, Shih-Hsuan


Hu man minimum redundancy coding  

E-print Network

14 Hu#11;man minimum redundancy coding It has become usual to store data and transmit messages a particular coded text. In a classic paper, published in 1952, David Hu#11;man described an algorithm to #12;nd the set of codes that would minimize the expected length 175 #12; 176 Hu#11;man minimum redundancy

Jones, Geraint


ICICLE: groupware for code inspection  

Microsoft Academic Search

ICICLE1 (“Intelligent Code Inspection Environment in a C Language Environment”) is a multifarious software system intended to augment the process of formal code inspection. It offers assistance in a number of activities, including knowledge-based analysis and annotations of source code, and computer supported cooperative discussion and finalization of inspectors' comments during inspection meetings. This paper reports the implementation of ICICLE

Laurence Brothers; V. Sembugamoorthy; M. Muller



Energy Codes and Standards: Facilities  

SciTech Connect

Energy codes and standards play a vital role in the marketplace by setting minimum requirements for energy-efficient design and construction. They outline uniform requirements for new buildings as well as additions and renovations. This article covers basic knowledge of codes and standards; development processes of each; adoption, implementation, and enforcement of energy codes and standards; and voluntary energy efficiency programs.

Bartlett, Rosemarie; Halverson, Mark A.; Shankle, Diana L.



Noiseless Coding Of Magnetometer Signals  

NASA Technical Reports Server (NTRS)

Report discusses application of noiseless data-compression coding to digitized readings of spaceborne magnetometers for transmission back to Earth. Objective of such coding to increase efficiency by decreasing rate of transmission without sacrificing integrity of data. Adaptive coding compresses data by factors ranging from 2 to 6.

Rice, Robert F.; Lee, Jun-Ji



Development and Application of the MXAN code for XANES Investigations  

E-print Network

. Analysis of the Detailed Configuration of Hydrated Lanthanoid(III) Ions in Aqueous Solution and Crystalline characterization of Hg2+ aqueous solutions. J CHEM PHYS, 128, 84502 - 84508 (2008). · P. D'Angelo, V. Migliorati, G mutation on human prion protein non-octarepeat copper-binding site BIOCHEMISTRY, 51, 6068-6079 (2012). · A

Guidoni, Leonardo


Robust Space Time Code for Channel Coded MIMO Systems  

NASA Astrophysics Data System (ADS)

Various space time code (STC) designs have been proposed to obtain full diversity at full rate in multiple-input multiple-output (MIMO) channels for uncoded systems. However, commercial wireless systems typically employ powerful channel codes such as turbo codes and low density parity check (LDPC) codes together with an STC. For these applications, an STC optimized for uncoded systems may not provide the best performance. In this paper, an STC with relatively good performance over a wide range of code rates is proposed. Simulation results show that the performance of the proposed robust STC is very close to the best performance of the SM and the Golden code in various code rates.

Byun, Ilmu; Hwang, Hae Gwang; Sang, Young Jin; Kim, Kwang Soon


Low Density Parity Check Codes: Bandwidth Efficient Channel Coding  

NASA Technical Reports Server (NTRS)

Low Density Parity Check (LDPC) Codes provide near-Shannon Capacity performance for NASA Missions. These codes have high coding rates R=0.82 and 0.875 with moderate code lengths, n=4096 and 8176. Their decoders have inherently parallel structures which allows for high-speed implementation. Two codes based on Euclidean Geometry (EG) were selected for flight ASIC implementation. These codes are cyclic and quasi-cyclic in nature and therefore have a simple encoder structure. This results in power and size benefits. These codes also have a large minimum distance as much as d,,, = 65 giving them powerful error correcting capabilities and error floors less than lo- BER. This paper will present development of the LDPC flight encoder and decoder, its applications and status.

Fong, Wai; Lin, Shu; Maki, Gary; Yeh, Pen-Shu



Structured error recovery for code-word-stabilized quantum codes  

SciTech Connect

Code-word-stabilized (CWS) codes are, in general, nonadditive quantum codes that can correct errors by an exhaustive search of different error patterns, similar to the way that we decode classical nonlinear codes. For an n-qubit quantum code correcting errors on up to t qubits, this brute-force approach consecutively tests different errors of weight t or less and employs a separate n-qubit measurement in each test. In this article, we suggest an error grouping technique that allows one to simultaneously test large groups of errors in a single measurement. This structured error recovery technique exponentially reduces the number of measurements by about 3{sup t} times. While it still leaves exponentially many measurements for a generic CWS code, the technique is equivalent to syndrome-based recovery for the special case of additive CWS codes.

Li Yunfan; Dumer, Ilya; Grassl, Markus; Pryadko, Leonid P. [Department of Electrical Engineering, University of California, Riverside, California 92521 (United States); Centre for Quantum Technologies, National University of Singapore, Singapore 117543 (Singapore); Department of Physics and Astronomy, University of California, Riverside, California 92521 (United States)



SCAMPI: A code package for cross-section processing  

SciTech Connect

The SCAMPI code package consists of a set of SCALE and AMPX modules that have been assembled to facilitate user needs for preparation of problem-specific, multigroup cross-section libraries. The function of each module contained in the SCANTI code package is discussed, along with illustrations of their use in practical analyses. Ideas are presented for future work that can enable one-step processing from a fine-group, problem-independent library to a broad-group, problem-specific library ready for a shielding analysis.

Parks, C.V.; Petrie, L.M.; Bowman, S.M.; Broadhead, B.L.; Greene, N.M.; White, J.E.



QR Codes as Finding Aides: Linking Electronic and Print Library Resources  

ERIC Educational Resources Information Center

As part of a focused, methodical, and evaluative approach to emerging technologies, QR codes are one of many new technologies being used by the UC Irvine Libraries. QR codes provide simple connections between print and virtual resources. In summer 2010, a small task force began to investigate how QR codes could be used to provide information and…

Kane, Danielle; Schneidewind, Jeff



DNA Methylation-Independent Reversion of Gemcitabine Resistance by Hydralazine in Cervical Cancer Cells  

PubMed Central

Background Down regulation of genes coding for nucleoside transporters and drug metabolism responsible for uptake and metabolic activation of the nucleoside gemcitabine is related with acquired tumor resistance against this agent. Hydralazine has been shown to reverse doxorubicin resistance in a model of breast cancer. Here we wanted to investigate whether epigenetic mechanisms are responsible for acquiring resistance to gemcitabine and if hydralazine could restore gemcitabine sensitivity in cervical cancer cells. Methodology/Principal Findings The cervical cancer cell line CaLo cell line was cultured in the presence of increasing concentrations of gemcitabine. Down-regulation of hENT1 & dCK genes was observed in the resistant cells (CaLoGR) which was not associated with promoter methylation. Treatment with hydralazine reversed gemcitabine resistance and led to hENT1 and dCK gene reactivation in a DNA promoter methylation-independent manner. No changes in HDAC total activity nor in H3 and H4 acetylation at these promoters were observed. ChIP analysis showed H3K9m2 at hENT1 and dCK gene promoters which correlated with hyper-expression of G9A histone methyltransferase at RNA and protein level in the resistant cells. Hydralazine inhibited G9A methyltransferase activity in vitro and depletion of the G9A gene by iRNA restored gemcitabine sensitivity. Conclusions/Significance Our results demonstrate that acquired gemcitabine resistance is associated with DNA promoter methylation-independent hENT1 and dCK gene down-regulation and hyper-expression of G9A methyltransferase. Hydralazine reverts gemcitabine resistance in cervical cancer cells via inhibition of G9A histone methyltransferase. PMID:22427797

Candelaria, Myrna; de la Cruz-Hernandez, Erick; Taja-Chayeb, Lucia; Perez-Cardenas, Enrique; Trejo-Becerril, Catalina; Gonzalez-Fierro, Aurora; Chavez-Blanco, Alma; Soto-Reyes, Ernesto; Dominguez, Guadalupe; Trujillo, Jaenai E.; Diaz-Chavez, Jose; Duenas-Gonzalez, Alfonso



Extension of XGC kinetic simulation codes to magnetic mirror configurations  

NASA Astrophysics Data System (ADS)

The XGC codes, developed to simulate the edge regions of tokamak plasmas, are modified to carry out kinetic simulations of axisymmetric magnetic mirror configurations. The XGC codes are particle in cell kinetic codes that include a virtual sheath condition where magnetic field lines run into end plates. The XGC1 code is a fully five dimensional kinetic code that is used to investigate turbulence, while the faster XGC0 code uses the axisymmetric average electrostatic potential in order to simulate charged particle drifts, losses and collisional effects. Kinetic electron computations, neutral beam injection, atomic physics and the effects of thermal neutrals are included in the XGC codes. Changes are being made to allow the XGC codes to accept mirror equilibria and to run without a toroidal magnetic field component. The XGC0 code will be used to compute particle dynamics, electrostatic potentials, and moments of the distribution functions including plasma flows in mirror configurations. [1] C.S. Chang, S. Ku, H. Weitzner, Phys. Plasmas 11 (2004) 2649

Bateman, G.; Pankin, A. Y.; Kritz, A. H.; Rafiq, T.; Park, G. Y.; Ku, S.; Chang, C. S.; Horton, W.; Pratt, J.



Correlation approach to identify coding regions in DNA sequences  

NASA Technical Reports Server (NTRS)

Recently, it was observed that noncoding regions of DNA sequences possess long-range power-law correlations, whereas coding regions typically display only short-range correlations. We develop an algorithm based on this finding that enables investigators to perform a statistical analysis on long DNA sequences to locate possible coding regions. The algorithm is particularly successful in predicting the location of lengthy coding regions. For example, for the complete genome of yeast chromosome III (315,344 nucleotides), at least 82% of the predictions correspond to putative coding regions; the algorithm correctly identified all coding regions larger than 3000 nucleotides, 92% of coding regions between 2000 and 3000 nucleotides long, and 79% of coding regions between 1000 and 2000 nucleotides. The predictive ability of this new algorithm supports the claim that there is a fundamental difference in the correlation property between coding and noncoding sequences. This algorithm, which is not species-dependent, can be implemented with other techniques for rapidly and accurately locating relatively long coding regions in genomic sequences.

Ossadnik, S. M.; Buldyrev, S. V.; Goldberger, A. L.; Havlin, S.; Mantegna, R. N.; Peng, C. K.; Simons, M.; Stanley, H. E.



The 5'-untranslated regions of picornavirus RNAs contain independent functional domains essential for RNA replication and translation.  

PubMed Central

The role of the 5'-untranslated region (5'UTR) in the replication of enteroviruses has been studied by using a series of poliovirus type 3 (PV3) replicons containing the chloramphenicol acetyltransferase reporter gene in which the 5'UTR was replaced by the 5'UTR of either coxsackievirus B4 or human rhinovirus 14 or composite 5'UTRs derived from sequences of PV3, human rhinovirus 14, coxsackievirus B4, or encephalomyocarditis virus. The results indicate that efficient replication of an enterovirus genome requires a compatible interaction between the 5'-terminal cloverleaf structure and the coding and/or 3'-noncoding regions of the genome. A crucial determinant of this interaction is the stem-loop formed by nucleotides 46 to 81 (stem-loop d). The independence of the cloverleaf structure formed by the 5'-terminal 88 nucleotides and the ribosome landing pad or internal ribosome entry site (IRES) was investigated by constructing a 5'UTR composed of the PV3 cloverleaf and the IRES from encephalomyocarditis virus. Chloramphenicol acetyltransferase gene-containing replicons and viruses containing this recombinant 5'UTR showed levels of replication similar to those of the corresponding genomes containing the complete PV3 5'UTR, indicating that the cloverleaf and the IRES may be regarded as functionally independent and nonoverlapping elements. Images PMID:8207812

Rohll, J B; Percy, N; Ley, R; Evans, D J; Almond, J W; Barclay, W S



[MODIS Investigation  

NASA Technical Reports Server (NTRS)

Our first activity is based on delivery of code to Bob Evans (University of Miami) for integration and eventual delivery to the MODIS Science Data Support Team. As we noted in our previous semi-annual report, coding required the development and analysis of an end-to-end model of fluorescence line height (FLH) errors and sensitivity. This model is described in a paper in press in Remote Sensing of the Environment. Once the code was delivered to Miami, we continue to use this error analysis to evaluate proposed changes in MODIS sensor specifications and performance. Simply evaluating such changes on a band by band basis may obscure the true impacts of changes in sensor performance that are manifested in the complete algorithm. This is especially true with FLH that is sensitive to band placement and width. The error model will be used by Howard Gordon (Miami) to evaluate the effects of absorbing aerosols on the FLH algorithm performance. Presently, FLH relies only on simple corrections for atmospheric effects (viewing geometry, Rayleigh scattering) without correcting for aerosols. Our analysis suggests that aerosols should have a small impact relative to changes in the quantum yield of fluorescence in phytoplankton. However, the effect of absorbing aerosol is a new process and will be evaluated by Gordon.

Abbott, Mark R.



Permutation-invariant quantum codes  

NASA Astrophysics Data System (ADS)

A quantum code is a subspace of a Hilbert space of a physical system chosen to be correctable against a given class of errors, where information can be encoded. Ideally, the quantum code lies within the ground space of the physical system. When the physical model is the Heisenberg ferromagnet in the absence of an external magnetic field, the corresponding ground space contains all permutation-invariant states. We use techniques from combinatorics and operator theory to construct families of permutation-invariant quantum codes. These codes have length proportional to t2; one family of codes perfectly corrects arbitrary weight t errors, while the other family of codes approximately correct t spontaneous decay errors. The analysis of our codes' performance with respect to spontaneous decay errors utilizes elementary matrix analysis, where we revisit and extend the quantum error correction criterion of Knill and Laflamme, and Leung, Chuang, Nielsen and Yamamoto.

Ouyang, Yingkai



Continuous-variable topological codes  

NASA Astrophysics Data System (ADS)

Topological code is a stabilizer quantum error correcting code whose generators are local but logical operators are topologically nontrivial and nonlocal. It offers interesting features such as the homological deformations of string operators and anyonic excitations on it. Topological codes are also closely related to the “topological order,” which has been an important concept in condensed-matter physics. In this paper, we consider continuous-variable versions of topological codes, including the toric code by Kitaev [A. Y. Kitaev, Ann. Phys.APNYA60003-491610.1016/S0003-4916(02)00018-0 303, 2 (2003)] with a single type of stabilizer on the checkerboard lattice, and the color code by Bombin and Martin-Delgado [H. Bombin and M. A. Martin-Delgado, Phys. Rev. Lett.PRLTAO0031-900710.1103/PhysRevLett.97.180501 97, 180501 (2006)]. We show that it is possible to consider continuous-variable analog of these topological codes.

Morimae, Tomoyuki



High performance computing aspects of a dimension independent semi-Lagrangian discontinuous Galerkin code  

E-print Network

The recently developed semi-Lagrangian discontinuous Galerkin approach is used to discretize hyperbolic partial differential equations (usually first order equations). Since these methods are conservative, local in space, and able to limit numerical diffusion, they are considered a promising alternative to more traditional semi-Lagrangian schemes (which are usually based on polynomial or spline interpolation). In this paper, we consider a parallel implementation of a semi-Lagrangian discontinuous Galerkin method for distributed memory systems (so-called clusters). Both strong and weak scaling studies are performed on the Vienna Scientific Cluster 2 (VSC-2). In the case of weak scaling, up to 8192 cores, we observe a parallel efficiency above 0.89 for both two and four dimensional problems. Strong scaling results show good scalability to at least 1024 cores (we consider problems that can be run on a single processor in reasonable time). In addition, we study the scaling of a two dimensional Vlasov--Poisson sol...

Einkemmer, Lukas



Performance Analysis of Wavelength Multiplexed Sac Ocdma Codes in Beat Noise Mitigation in Sac Ocdma Systems  

NASA Astrophysics Data System (ADS)

In this paper we investigate the use of wavelength multiplexed spectral amplitude coding (WM SAC) codes in beat noise mitigation in coherent source SAC OCDMA systems. A WM SAC code is a low weight SAC code, where the whole code structure is repeated diagonally (once or more) in the wavelength domain to achieve the same cardinality as a higher weight SAC code. Results show that for highly populated networks, the WM SAC codes provide better performance than SAC codes. However, for small number of active users the situation is reversed. Apart from their promising improvement in performance, these codes are more flexible and impose less complexity on the system design than their SAC counterparts.

Alhassan, A. M.; Badruddin, N.; Saad, N. M.; Aljunid, S. A.



7 CFR 1206.51 - Independent evaluation.  

Code of Federal Regulations, 2010 CFR

...AGRICULTURE MANGO PROMOTION, RESEARCH, AND INFORMATION Mango Promotion, Research, and Information Order Definitions Promotion, Research, and Information § 1206.51 Independent evaluation. The Board shall, not less often...



Advertising to Bilingual Consumers: The Impact of CodeSwitching on Persuasion  

Microsoft Academic Search

Building on a sociolinguistic framework, our research explores the impact of code-switching on the persuasiveness of marketing messages. Code-switching refers to mixing languages within a sentence, a common practice among bilingual consumers. We investigate how responses to different types of code-switched messages can provide insight into bilingual consumers' persuasion processes. A pilot study reveals a code-switching direction effect such that

David Luna



High-Tech or Low-Tech? Comparing Self-Monitoring Systems to Increase Task Independence for Students with Autism  

ERIC Educational Resources Information Center

Independence is the ultimate goal for students with disabilities, including secondary students with autism. One avenue targeted for increasing independence and decreasing prompt-dependency is through self-monitoring. In this study, investigators sought to determine whether a difference exists in levels of task independence when three students with…

Bouck, Emily C.; Savage, Melissa; Meyer, Nancy K.; Taber-Doughty, Teresa; Hunley, Megan



Multi-rate transmissions on spectral amplitude coding optical code division multiple access system using random diagonal codes  

Microsoft Academic Search

In this paper, we study the use of a new code called random diagonal (RD) code for spectral amplitude coding (SAC) optical code division multiple access (OCDMA) networks, using fiber Bragg-grating (FBG). FBG consists of a fiber segment whose index of reflection varies periodically along its length. RD code is constructed using a code level and data level, one of




Iterative algorithms for lossy source coding  

E-print Network

This thesis explores the problems of lossy source coding and information embedding. For lossy source coding, we analyze low density parity check (LDPC) codes and low density generator matrix (LDGM) codes for quantization ...

Chandar, Venkat (Venkat Bala)



Diagonal Eigenvalue Unity (DEU) code for spectral amplitude coding-optical code division multiple access  

NASA Astrophysics Data System (ADS)

Code with ideal in-phase cross correlation (CC) and practical code length to support high number of users are required in spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. SAC systems are getting more attractive in the field of OCDMA because of its ability to eliminate the influence of multiple access interference (MAI) and also suppress the effect of phase induced intensity noise (PIIN). In this paper, we have proposed new Diagonal Eigenvalue Unity (DEU) code families with ideal in-phase CC based on Jordan block matrix with simple algebraic ways. Four sets of DEU code families based on the code weight W and number of users N for the combination (even, even), (even, odd), (odd, odd) and (odd, even) are constructed. This combination gives DEU code more flexibility in selection of code weight and number of users. These features made this code a compelling candidate for future optical communication systems. Numerical results show that the proposed DEU system outperforms reported codes. In addition, simulation results taken from a commercial optical systems simulator, Virtual Photonic Instrument (VPI™) shown that, using point to multipoint transmission in passive optical network (PON), DEU has better performance and could support long span with high data rate.

Ahmed, Hassan Yousif; Nisar, K. S.



Investigator Resources

CTEP recognizes the importance of encouraging and supporting young investigators as they embark upon a clinical cancer research career. We have implemented several programs geared to assist young investigators navigate the CTEP program and the LOI development and submission process.


Region-based scalable coding for video communications  

NASA Astrophysics Data System (ADS)

This paper presents a region-based scalable coding technique that can be used in interactive transmission of images over networks. This method has a capability of near lossless coding for a specific region of interest (ROI), while the rest of the region is coded with a high quality lossy codec. The enhancement layers add refinement to the quality of the images that have been reconstructed using the basic layer of the intra frame. The proposed coding technique uses multiple quantizers with thresholds (QT) for layering and it creates a bit plane for each layer of both the intra and residual frames. The bit plane is then partitioned into sets of small areas to be coded independently. Run-length and entropy coding are applied to each of the sets to provide scalability for the entire image sets resulting in high picture quality in the end user-specified ROI. We tested this technique by applying it to various test image sequences and we consistently achieved a high level of performance.

Lee, Ji H.; Yoon, Sung H.; Alexander, Winser E.



CRITICA: coding region identification tool invoking comparative analysis  

NASA Technical Reports Server (NTRS)

Gene recognition is essential to understanding existing and future DNA sequence data. CRITICA (Coding Region Identification Tool Invoking Comparative Analysis) is a suite of programs for identifying likely protein-coding sequences in DNA by combining comparative analysis of DNA sequences with more common noncomparative methods. In the comparative component of the analysis, regions of DNA are aligned with related sequences from the DNA databases; if the translation of the aligned sequences has greater amino acid identity than expected for the observed percentage nucleotide identity, this is interpreted as evidence for coding. CRITICA also incorporates noncomparative information derived from the relative frequencies of hexanucleotides in coding frames versus other contexts (i.e., dicodon bias). The dicodon usage information is derived by iterative analysis of the data, such that CRITICA is not dependent on the existence or accuracy of coding sequence annotations in the databases. This independence makes the method particularly well suited for the analysis of novel genomes. CRITICA was tested by analyzing the available Salmonella typhimurium DNA sequences. Its predictions were compared with the DNA sequence annotations and with the predictions of GenMark. CRITICA proved to be more accurate than GenMark, and moreover, many of its predictions that would seem to be errors instead reflect problems in the sequence databases. The source code of CRITICA is freely available by anonymous FTP ( in/pub/critica) and on the World Wide Web (http:/(/)

Badger, J. H.; Olsen, G. J.; Woese, C. R. (Principal Investigator)



Automated coding of diagnoses--three methods compared.  

PubMed Central

In Germany, new legal requirements have raised the importance of the accurate encoding of admission and discharge diseases for in- and outpatients. In response to emerging needs for computer-supported tools we examined three methods for automated coding of German-language free-text diagnosis phrases. We compared a language-independent lexicon-free n-gram approach with one which uses a dictionary of medical morphemes and refines the query by a mapping to SNOMED codes. Both techniques produced a ranked output of possible diagnoses within a vector space framework for retrieval. The results did not reveal any significant difference: The correct diagnosis was found in approximately 40% for three-digit codes, and 30% for four-digit codes. The lexicon-based method was then modified by substituting the vector space ranking by a heuristic approach that capitalizes on the semantic structure of SNOMED, thus raising the number of correct diagnoses significan