NASA Astrophysics Data System (ADS)
Liu, Siwei; Li, Qi; Yu, Hong; Kong, Lingfeng
2017-02-01
Glycogen is important not only as an energy reserve for oysters but also for human consumption. High glycogen content can improve the stress survival of oysters. A key enzyme in glycogenesis is glycogen synthase, which is encoded by the glycogen synthase gene GYS. In this study, the relationship between single nucleotide polymorphisms (SNPs) in coding regions of Crassostrea gigas GYS (Cg-GYS) and individual glycogen content was investigated in 321 individuals from five full-sib families. A single-strand conformation polymorphism (SSCP) procedure was combined with sequencing to confirm individual SNP genotypes of Cg-GYS. Least-squares analysis of variance was performed to assess the relationship between variation in glycogen content of C. gigas and both single SNP genotypes and SNP haplotypes. Six SNPs in coding regions were found to be significantly associated with glycogen content (P < 0.01), from which four main haplotypes were constructed on the basis of linkage disequilibrium. Furthermore, the most effective haplotype, H2 (GAGGAT), showed a highly significant association with high glycogen content (P < 0.0001). These findings reveal the potential influence of Cg-GYS polymorphism on glycogen content and provide molecular information for the selective breeding of quality traits in C. gigas.
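A minimal sketch of the kind of single-marker association test described above (least-squares ANOVA of glycogen content against SNP genotype), assuming pandas and statsmodels are available; the data frame, column names, and values are hypothetical, not the study's 321-oyster data set.

```python
# Hypothetical table: one row per oyster, with family, SNP genotype and glycogen content.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.DataFrame({
    "family":   ["F1", "F1", "F2", "F2", "F3", "F3", "F4", "F4"],
    "genotype": ["GG", "GA", "GG", "AA", "GA", "AA", "GG", "GA"],
    "glycogen": [21.5, 18.2, 23.1, 15.4, 17.8, 14.9, 22.4, 19.0],
})

# Least-squares model with genotype and family as categorical factors,
# followed by an ANOVA table giving the F statistic and p value per factor.
model = ols("glycogen ~ C(genotype) + C(family)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```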
CH-TRU Waste Content Codes (CH-TRUCON)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Washington TRU Solutions LLC
2007-08-15
The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).
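The table organization described above (content codes mapped to shipping categories, with 60-, 20-, and 10-day shipping-period cases) can be pictured as a small lookup structure. In the sketch below the content codes and category labels are hypothetical placeholders; only the three shipping periods come from the text.

```python
# Shipping-period cases from Tables 2A-2C of the CH-TRUCON document.
SHIPPING_PERIOD_DAYS = {
    "general": 60,          # Table 2A, Appendix 3.4 of the CH-TRU Payload Appendices
    "close_proximity": 20,  # Table 2B, Appendix 3.5 (within ~1,000-mile radius)
    "controlled": 10,       # Table 2C, Appendix 3.6
}

# Hypothetical content-code-to-shipping-category assignments
# (a single content code may carry multiple shipping categories).
CONTENT_CODE_CATEGORIES = {
    "XX 111": ["I.1"],
    "XX 112": ["I.1", "II.2"],
}

def shipping_period(case: str) -> int:
    """Return the applicable shipping period in days for a shipment case."""
    return SHIPPING_PERIOD_DAYS[case]

print(shipping_period("close_proximity"))  # -> 20
```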
SMT-Aware Instantaneous Footprint Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roy, Probir; Liu, Xu; Song, Shuaiwen
Modern architectures employ simultaneous multithreading (SMT) to increase thread-level parallelism. SMT threads share many functional units and the whole memory hierarchy of a physical core. Without careful code design, SMT threads can easily contend with each other for these shared resources, causing severe performance degradation. Minimizing SMT thread contention for HPC applications running on dedicated platforms is very challenging, because such applications usually spawn threads within Single Program Multiple Data (SPMD) models. To address this important issue, we introduce a simple scheme for SMT-aware code optimization that aims to reduce memory contention across SMT threads.
Multi-Core Processor Memory Contention Benchmark Analysis Case Study
NASA Technical Reports Server (NTRS)
Simon, Tyler; McGalliard, James
2009-01-01
Multi-core processors dominate current mainframe, server, and high performance computing (HPC) systems. This paper provides synthetic kernel and natural benchmark results from an HPC system at the NASA Goddard Space Flight Center that illustrate the performance impacts of multi-core (dual- and quad-core) vs. single core processor systems. Analysis of processor design, application source code, and synthetic and natural test results all indicate that multi-core processors can suffer from significant memory subsystem contention compared to similar single-core processors.
Nihilism, relativism, and Engelhardt.
Wreen, M
1998-01-01
This paper is a critical analysis of Tristram Engelhardt's attempts to avoid unrestricted nihilism and relativism. The focus of attention is his recent book, The Foundations of Bioethics (Oxford University Press, 1996). No substantive or "content-full" bioethics (e.g., that of Roman Catholicism or the Samurai) has an intersubjectively verifiable and universally binding foundation, Engelhardt thinks, for unaided secular reason cannot show that any particular substantive morality (or moral code) is correct. He thus seems to be committed to either nihilism or relativism. The first is the view that there is not even one true or valid moral code, and the second is the view that there is a plurality of true or valid moral codes. However, Engelhardt rejects both nihilism and relativism, at least in unrestricted form. Strictly speaking, he himself is a universalist, someone who believes that there is a single true moral code. Two argumentative strategies are employed by him to fend off unconstrained nihilism and relativism. The first argues that although all attempts to establish a content-full morality on the basis of secular reason fail, secular reason can still establish a content-less, purely procedural morality. Although not content-full and incapable of providing positive direction in life, much less a meaning of life, such a morality does limit the range of relativism and nihilism. The second argues that there is a single true, content-full morality. Grace and revelation, however, are needed to make it available to us; secular reason alone is not up to the task. This second line of argument is not pursued in The Foundations at any length, but it does crop up at times, and if it is sound, nihilism and relativism can be much more thoroughly routed than the first line of argument has it. Engelhardt's position and argumentative strategies are exposed at length and accorded a detailed critical examination. In the end, it is concluded that neither strategy will do, and that Engelhardt is probably committed to some form of relativism.
Theory of Coding Informational Simulation.
1981-04-06
reach the value of several thousands; single-progression representation of this value is little attractive due to its unwieldiness. Here we approached a...the moment when the contents of the location counter must be changed to the larger or smaller side. The value and direction of the change are assigned by the...the register of transition is formed by the algebraic addition of the contents of the location counter and the value of a change in the code of the latter (step
NASA Astrophysics Data System (ADS)
Zhang, Baocheng; Teunissen, Peter J. G.; Yuan, Yunbin; Zhang, Hongxing; Li, Min
2018-04-01
Vertical total electron content (VTEC) parameters estimated using global navigation satellite system (GNSS) data are of great interest for ionosphere sensing. Satellite differential code biases (SDCBs) account for one source of error which, if left uncorrected, can degrade the performance of positioning, timing and other applications. The customary approach to estimating VTEC along with SDCBs from dual-frequency GNSS data, hereinafter referred to as the DF approach, consists of two sequential steps. The first step retrieves ionospheric observables through the carrier-to-code leveling technique. This observable, related to the slant total electron content (STEC) along the satellite-receiver line of sight, is biased also by the SDCBs and the receiver differential code biases (RDCBs). By means of a thin-layer ionospheric model, the second step isolates the VTEC, the SDCBs and the RDCBs from the ionospheric observables. In this work, we present a single-frequency (SF) approach enabling the joint estimation of VTEC and SDCBs using low-cost receivers; this approach is also based on two steps and differs from the DF approach only in the first step, where we turn to the precise point positioning technique to retrieve from the single-frequency GNSS data the ionospheric observables, interpreted as the combination of the STEC, the SDCBs and the biased receiver clocks at the pivot epoch. Our numerical analyses clarify how the SF approach performs when applied to GPS L1 data collected by a single receiver under both calm and disturbed ionospheric conditions. The daily time series of zenith VTEC estimates has an accuracy ranging from a few tenths of a TEC unit (TECU) to approximately 2 TECU. For 73-96% of GPS satellites in view, the daily estimates of SDCBs do not deviate, in absolute value, by more than 1 ns from their ground-truth values published by the Centre for Orbit Determination in Europe.
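A minimal sketch of the first (leveling) step of the dual-frequency approach described above: form the geometry-free code and phase combinations and level the precise but ambiguous phase observable to the code observable. The arrays are hypothetical per-arc time series in metres, the constants assume GPS L1/L2, and the result is still biased by the satellite and receiver DCBs, as the abstract notes.

```python
import numpy as np

F1, F2 = 1575.42e6, 1227.60e6                   # GPS L1/L2 carrier frequencies (Hz)
ALPHA = 40.3e16 * (1.0 / F2**2 - 1.0 / F1**2)   # metres of differential delay per TECU (~0.105)

def leveled_iono_observable(p1, p2, l1, l2):
    """Carrier-to-code leveled geometry-free observable for one continuous arc.

    Up to noise, the result equals ALPHA*STEC plus the satellite and receiver
    differential code biases (step one of the DF approach).
    """
    gf_code = p2 - p1          # geometry-free code: ALPHA*STEC + DCB terms + code noise
    gf_phase = l1 - l2         # geometry-free phase: ALPHA*STEC + arc-constant bias
    offset = np.mean(gf_phase - gf_code)   # arc-constant bias minus the DCB terms
    return gf_phase - offset   # smooth ionospheric observable, still DCB-biased

# stec_tecu = leveled_iono_observable(p1, p2, l1, l2) / ALPHA  # DCB-biased STEC in TECU
```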
Proceedings of the 14th International Conference on the Numerical Simulation of Plasmas
NASA Astrophysics Data System (ADS)
Partial Contents are as follows: Numerical Simulations of the Vlasov-Maxwell Equations by Coupled Particle-Finite Element Methods on Unstructured Meshes; Electromagnetic PIC Simulations Using Finite Elements on Unstructured Grids; Modelling Travelling Wave Output Structures with the Particle-in-Cell Code CONDOR; SST--A Single-Slice Particle Simulation Code; Graphical Display and Animation of Data Produced by Electromagnetic, Particle-in-Cell Codes; A Post-Processor for the PEST Code; Gray Scale Rendering of Beam Profile Data; A 2D Electromagnetic PIC Code for Distributed Memory Parallel Computers; 3-D Electromagnetic PIC Simulation on the NRL Connection Machine; Plasma PIC Simulations on MIMD Computers; Vlasov-Maxwell Algorithm for Electromagnetic Plasma Simulation on Distributed Architectures; MHD Boundary Layer Calculation Using the Vortex Method; and Eulerian Codes for Plasma Simulations.
Khrustalev, Vladislav Victorovich
2009-01-01
We showed that the GC content of nucleotide sequences coding for linear B-cell epitopes of herpes simplex virus type 1 (HSV1) glycoprotein B (gB) is higher than the GC content of sequences coding for epitope-free regions of this glycoprotein (G + C = 73% and 64%, respectively). Linear B-cell epitopes were predicted in HSV1 gB by the BepiPred algorithm (www.cbs.dtu.dk/services/BepiPred). Proline is an acrophilic amino acid residue (it is usually situated on the surface of protein globules and so tends to be included in linear B-cell epitopes). Indeed, the level of proline is much higher in predicted epitopes of gB than in epitope-free regions (17.8% versus 1.8%). This amino acid is coded by GC-rich codons (CCX) that can be produced by nucleotide substitutions caused by mutational GC-pressure. GC-pressure will also lead to the disappearance of the acrophobic residues phenylalanine, isoleucine, methionine and tyrosine, which are coded by GC-poor codons. Results of our "in silico directed mutagenesis" showed that single nonsynonymous substitutions in the AT-to-GC direction in two long epitope-free regions of gB will cause the formation of new linear epitopes, or the elongation of previously existing epitopes flanking these regions, in 25% of 539 possible cases. The calculations of GC content and amino acid content were performed with the CodonChanges algorithm (www.barkovsky.hotmail.ru).
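A small sketch of the composition statistics reported above (GC content of a coding sequence and the proline fraction of its codons); the example sequence is hypothetical, not an HSV1 gB epitope.

```python
PROLINE_CODONS = {"CCT", "CCC", "CCA", "CCG"}  # the CCX codons

def gc_content(seq: str) -> float:
    """Fraction of G and C bases in a nucleotide sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def proline_fraction(cds: str) -> float:
    """Fraction of codons encoding proline in an in-frame coding sequence."""
    codons = [cds[i:i + 3].upper() for i in range(0, len(cds) - len(cds) % 3, 3)]
    return sum(c in PROLINE_CODONS for c in codons) / len(codons)

cds = "ATGCCTCCAGGCCCGTTTCCA"  # hypothetical 7-codon sequence
print(f"GC = {gc_content(cds):.1%}, Pro = {proline_fraction(cds):.1%}")
```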
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhengqiu, C.; Penaflor, C.; Kuehl, J.V.
2006-06-01
The magnoliids represent the largest basal angiosperm clade with four orders, 19 families and 8,500 species. Although several recent angiosperm molecular phylogenies have supported the monophyly of magnoliids and suggested relationships among the orders, the limited number of genes examined resulted in only weak support, and these issues remain controversial. Furthermore, considerable incongruence has resulted in phylogenies supporting three different sets of relationships among magnoliids and the two large angiosperm clades, monocots and eudicots. This is one of the most important remaining issues concerning relationships among basal angiosperms. We sequenced the chloroplast genomes of three magnoliids, Drimys (Canellales), Liriodendron (Magnoliales), and Piper (Piperales), and used these data in combination with 32 other completed angiosperm chloroplast genomes to assess phylogenetic relationships among magnoliids. The Drimys and Piper chloroplast genomes are nearly identical in size at 160,606 and 160,624 bp, respectively. The genomes include a pair of inverted repeats of 26,649 bp (Drimys) and 27,039 bp (Piper), separated by a small single copy region of 18,621 bp (Drimys) and 18,878 bp (Piper) and a large single copy region of 88,685 bp (Drimys) and 87,666 bp (Piper). The gene order of both taxa is nearly identical to many other unrearranged angiosperm chloroplast genomes, including Calycanthus, the other published magnoliid genome. Comparisons of angiosperm chloroplast genomes indicate that GC content is not uniformly distributed across the genome. Overall GC content ranges from 34% to 39%, and coding regions have a substantially higher GC content than non-coding regions (both intergenic spacers and introns). Among protein-coding genes, GC content varies by codon position with 1st codon > 2nd codon > 3rd codon, and it varies by functional group with photosynthetic genes having the highest percentage and NADH genes the lowest. Across the genome, GC content is highest in the inverted repeat due to the presence of rRNA genes and lowest in the small single copy region where most NADH genes are located. Phylogenetic analyses using maximum parsimony and maximum likelihood methods were performed on DNA sequences of 61 protein-coding genes. Trees from both analyses provided strong support for the monophyly of magnoliids, and two strongly supported groups were identified, the Canellales/Piperales and the Laurales/Magnoliales. The phylogenies also provided moderate to strong support for the basal position of Amborella, and a sister relationship of magnoliids to a clade that includes monocots and eudicots. The complete sequences of three magnoliid chloroplast genomes provide new data from the largest basal angiosperm clade. Evolutionary comparisons of these new genome sequences, combined with other published angiosperm genomes, confirm that GC content is unevenly distributed across the genome by location, codon position, and functional group. Furthermore, phylogenetic analyses provide the strongest support so far for the hypothesis that the magnoliids are sister to a large clade that includes both monocots and eudicots.
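As a complement, a minimal sketch of the per-codon-position GC calculation discussed above (1st vs. 2nd vs. 3rd position of a protein-coding gene); the input CDS is hypothetical, not one of the magnoliid genes.

```python
def gc_by_codon_position(cds: str) -> list[float]:
    """Return the GC fraction at codon positions 1, 2 and 3 of an in-frame CDS."""
    cds = cds.upper()
    gc, counts = [0, 0, 0], [0, 0, 0]
    for i, base in enumerate(cds[: len(cds) - len(cds) % 3]):
        counts[i % 3] += 1
        gc[i % 3] += base in "GC"
    return [g / n for g, n in zip(gc, counts)]

print(gc_by_codon_position("ATGGCTGGAGCCCTTTAA"))  # -> [0.667, 0.5, 0.333] (rounded)
```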
Documenting the Conversation: A Systematic Review of Library Discovery Layers
ERIC Educational Resources Information Center
Bossaller, Jenny S.; Sandy, Heather Moulaison
2017-01-01
This article describes the results of a systematic review of peer-reviewed, published research articles about "discovery layers," user-friendly interfaces or systems that provide single-search box access to library content. Focusing on articles in LISTA published 2009-2013, a set of 80 articles was coded for community of users, journal…
A single-station empirical model for TEC over the Antarctic Peninsula using GPS-TEC data
NASA Astrophysics Data System (ADS)
Feng, Jiandi; Wang, Zhengtao; Jiang, Weiping; Zhao, Zhenzhen; Zhang, Bingbing
2017-02-01
Compared with regional or global total electron content (TEC) empirical models, single-station TEC empirical models may exhibit higher accuracy in describing TEC spatial and temporal variations for a single station. In this paper, a new single-station empirical TEC model, called SSM-month, for the O'Higgins Station in the Antarctic Peninsula is proposed using Global Positioning System (GPS)-TEC data from 01 January 2004 to 30 June 2015. The diurnal variation of TEC at the O'Higgins Station can differ from month to month, sometimes even taking opposite forms, because of ionospheric phenomena such as the Mid-latitude Summer Nighttime Anomaly (MSNA). To avoid the influence of these differing diurnal variations, the concept of monthly modeling is proposed in this study. The SSM-month model, which is established by month (comprising 12 submodels that correspond to the 12 months), can effectively describe the diurnal variation of TEC in different months. Each submodel of the SSM-month model exhibits good agreement with the GPS-TEC input data. Overall, the SSM-month model fits the input data with a bias of 0.03 TECU (total electron content unit, 1 TECU = 10^16 el m^-2) and a standard deviation of 2.78 TECU. This model, owing to the monthly modeling method, can effectively describe the MSNA phenomenon without any additional modeling correction. TEC data derived from Center for Orbit Determination in Europe global ionosphere maps (CODE GIMs), International Reference Ionosphere 2012 (IRI2012), and NeQuick are compared with the SSM-month model for 2001 and 2015-2016. Results show that the SSM-month model exhibits better consistency with CODE GIMs than IRI2012 and NeQuick do at the O'Higgins Station on the test days.
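A minimal sketch of the monthly-modeling idea described above: for one month of data, fit the diurnal variation of station TEC with a short Fourier series in local time by least squares. The series order and the synthetic data are hypothetical; this is not the SSM-month formulation itself.

```python
import numpy as np

def fit_diurnal_model(hours, tec, order=2):
    """Fit TEC(t) = a0 + sum_k [a_k*cos(2*pi*k*t/24) + b_k*sin(2*pi*k*t/24)] by least squares."""
    cols = [np.ones_like(hours)]
    for k in range(1, order + 1):
        cols.append(np.cos(2 * np.pi * k * hours / 24.0))
        cols.append(np.sin(2 * np.pi * k * hours / 24.0))
    design = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(design, tec, rcond=None)
    return coeffs, design @ coeffs          # coefficients and fitted TEC

# Hypothetical one-month sample: 30 days of hourly TEC values (TECU).
hours = np.tile(np.arange(24.0), 30)
tec = 10 + 5 * np.cos(2 * np.pi * (hours - 14) / 24) + np.random.normal(0, 1, hours.size)
coeffs, fitted = fit_diurnal_model(hours, tec)
print("bias = %.2f TECU, std = %.2f TECU" % (np.mean(tec - fitted), np.std(tec - fitted)))
```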
Luo, Jing; Hou, Bei-Wei; Niu, Zhi-Tao; Liu, Wei; Xue, Qing-Yun; Ding, Xiao-Yu
2014-01-01
The orchid family Orchidaceae is one of the largest angiosperm families, including many species of important economic value. While chloroplast genomes are very informative for systematics and species identification, there is very limited information available on chloroplast genomes in the Orchidaceae. Here, we report the complete chloroplast genomes of the medicinal plant Dendrobium officinale and the ornamental orchid Cypripedium macranthos, demonstrating their gene content and order and potential RNA editing sites. The chloroplast genomes of the above two species and five known photosynthetic orchids showed similarities in structure as well as gene order and content, but differences in the organization of the inverted repeat/small single-copy junction and ndh genes. The organization of the inverted repeat/small single-copy junctions in the chloroplast genomes of these orchids was classified into four types; we propose that inverted repeats flanking the small single-copy region underwent expansion or contraction among Orchidaceae. The AT-rich regions of the ycf1 gene in orchids could be linked to the recombination of inverted repeat/small single-copy junctions. Relative species in orchids displayed similar patterns of variation in ndh gene contents. Furthermore, fifteen highly divergent protein-coding genes were identified, which are useful for phylogenetic analyses in orchids. To test the efficiency of these genes serving as markers in phylogenetic analyses, coding regions of four genes (accD, ccsA, matK, and ycf1) were used as a case study to construct phylogenetic trees in the subfamily Epidendroideae. High support was obtained for placement of previously unlocated subtribes Collabiinae and Dendrobiinae in the subfamily Epidendroideae. Our findings expand understanding of the diversity of orchid chloroplast genomes and provide a reference for study of the molecular systematics of this family.
Group delay variations of GPS transmitting and receiving antennas
NASA Astrophysics Data System (ADS)
Wanninger, Lambert; Sumaya, Hael; Beer, Susanne
2017-09-01
GPS code pseudorange measurements exhibit group delay variations at the transmitting and the receiving antenna. We calibrated C1 and P2 delay variations with respect to dual-frequency carrier phase observations and obtained nadir-dependent corrections for 32 satellites of the GPS constellation in early 2015 as well as elevation-dependent corrections for 13 receiving antenna models. The combined delay variations reach up to 1.0 m (3.3 ns) in the ionosphere-free linear combination for specific pairs of satellite and receiving antennas. Applying these corrections to the code measurements improves code/carrier single-frequency precise point positioning, ambiguity fixing based on the Melbourne-Wübbena linear combination, and determination of ionospheric total electron content. It also affects fractional cycle biases and differential code biases.
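A hedged sketch of the ionosphere-free code combination in which the combined C1/P2 delay variations quoted above (up to about 1.0 m) appear. The correction arguments stand in for the calibrated nadir- and elevation-dependent patterns and are hypothetical; only the combination itself is the standard dual-frequency formula.

```python
F1, F2 = 1575.42e6, 1227.60e6   # GPS L1/L2 carrier frequencies (Hz)
GAMMA = (F1 / F2) ** 2          # (f1/f2)^2, approximately 1.6469

def ionosphere_free(c1: float, p2: float) -> float:
    """Ionosphere-free combination of C1 and P2 pseudoranges (metres)."""
    return (GAMMA * c1 - p2) / (GAMMA - 1.0)

def corrected_ionosphere_free(c1, p2, c1_delay_corr, p2_delay_corr):
    """Apply group-delay-variation corrections to each code, then combine."""
    return ionosphere_free(c1 - c1_delay_corr, p2 - p2_delay_corr)

# Hypothetical pseudoranges (metres) and corrections from nadir/elevation tables.
print(corrected_ionosphere_free(22_000_000.00, 22_000_005.00, 0.15, -0.10))
```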
Leonhardt, Bethany L; Kukla, Marina; Belanger, Elizabeth; Chaudoin-Patzoldt, Kelly A; Buck, Kelly D; Minor, Kyle S; Vohs, Jenifer L; Hamm, Jay A; Lysaker, Paul H
2018-03-01
Emerging integrative metacognitive therapies for schizophrenia seek to promote subjective aspects of recovery. Beyond symptom remission, they are concerned with shared meaning-making and intersubjective processes. It is unclear, however, how such therapies should understand and respond to psychotic content that threatens meaning-making in therapeutic contexts. Accordingly, we sought to understand what factors precede and potentially trigger psychotic content within psychotherapy and what aids its resolution and the return to meaning-making. Forty-eight transcripts from a single psychotherapy case were analyzed with thematic analysis. Passages of delusional or disorganized content were identified, and themes present prior to the emergence and resolution of such material were identified and coded. Themes that preceded the emergence of psychotic content varied across early, middle, and late phases of therapy. Material related to the patient's experience of inadequacy and potential vulnerability, as well as the therapist setting boundaries within the therapeutic relationship and offering challenges, appeared to trigger psychotic content, especially early in treatment. Psychotic content may emerge in session following identifiable antecedents that change over phases of therapy. Attending to psychotic content by assuming a non-hierarchical stance and not dismissing it may aid in maintaining intersubjectivity and support patients' movement toward recovery in integrative metacognitive therapies.
CACTI: free, open-source software for the sequential coding of behavioral interactions.
Glynn, Lisa H; Hallgren, Kevin A; Houck, Jon M; Moyers, Theresa B
2012-01-01
The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery.
The complete chloroplast genome sequence of Dendrobium officinale.
Yang, Pei; Zhou, Hong; Qian, Jun; Xu, Haibin; Shao, Qingsong; Li, Yonghua; Yao, Hui
2016-01-01
The complete chloroplast sequence of Dendrobium officinale, an endangered and economically important traditional Chinese medicine, was reported and characterized. The genome size is 152,018 bp, with 37.5% GC content. A pair of inverted repeats (IRs) of 26,284 bp are separated by a large single-copy region (LSC, 84,944 bp) and a small single-copy region (SSC, 14,506 bp). The complete cp DNA contains 83 protein-coding genes, 39 tRNA genes and 8 rRNA genes. Fourteen genes contained one or two introns.
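As a quick consistency check, the region lengths quoted above do sum to the reported genome size; a two-line sketch:

```python
# LSC + SSC + 2 * IR should equal the reported plastome length.
LSC, SSC, IR, TOTAL = 84_944, 14_506, 26_284, 152_018
assert LSC + SSC + 2 * IR == TOTAL
print(LSC + SSC + 2 * IR)  # -> 152018
```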
Noel, Jonathan K; Xuan, Ziming; Babor, Thomas F
2017-07-03
Beer marketing in the United States is controlled through self-regulation, whereby the beer industry has created a marketing code and enforces its use. We performed a thematic content analysis of beer ads broadcast during a U.S. college athletic event and determined which themes are associated with violations of a self-regulated alcohol marketing code. A total of 289 beer ads broadcast during the U.S. NCAA Men's and Women's 1999-2008 basketball tournaments were assessed for the presence of 23 thematic content areas. Associations between themes and violations of the U.S. Beer Institute's Marketing and Advertising Code were determined using generalized linear models. Humor (61.3%), taste (61.0%), masculinity (49.2%), and enjoyment (36.5%) were the most prevalent content areas. Nine content areas (i.e., conformity, ethnicity, sensation seeking, sociability, romance, special occasions, text responsibility messages, tradition, and individuality) were positively associated with code violations (p < 0.001-0.042). There were significantly more content areas positively associated with code violations than negatively associated with them (p < 0.001). Several thematic content areas were positively associated with code violations. These results can inform existing efforts to revise self-regulated alcohol marketing codes to ensure better protection of vulnerable populations. The use of several of these themes is concerning in relation to adolescent alcohol use and health disparities.
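An illustrative sketch of the kind of generalized linear model described above: a per-ad binary outcome (code violation) regressed on the presence of one theme, assuming pandas and statsmodels are available. The data frame, counts, and column names are hypothetical, not the NCAA broadcast sample.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import glm

# Hypothetical coding results: one row per ad.
ads = pd.DataFrame({
    "violation": [1, 0, 1, 1, 0, 0, 1, 0],  # 1 = violates the marketing code
    "humor":     [1, 1, 0, 1, 0, 1, 0, 0],  # 1 = theme present in the ad
})

# Logistic (binomial) GLM of violation on theme presence.
model = glm("violation ~ humor", data=ads, family=sm.families.Binomial()).fit()
print(model.summary())
```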
Smith, Katherine C; Cukier, Samantha; Jernigan, David H
2014-10-01
We analyzed beer, spirits, and alcopop magazine advertisements to determine adherence to federal and voluntary advertising standards. We assessed the efficacy of these standards in curtailing potentially damaging content and protecting public health. We obtained data from a content analysis of a census of 1795 unique advertising creatives for beer, spirits, and alcopops placed in nationally available magazines between 2008 and 2010. We coded creatives for manifest content and adherence to federal regulations and industry codes. Advertisements largely adhered to existing regulations and codes. We assessed only 23 ads as noncompliant with federal regulations and 38 with industry codes. Content consistent with the codes was, however, often culturally positive in terms of aspirational depictions. In addition, creatives included degrading and sexualized images, promoted risky behavior, and made health claims associated with low-calorie content. Existing codes and regulations are largely followed regarding content but do not adequately protect against content that promotes unhealthy and irresponsible consumption and degrades potentially vulnerable populations in its depictions. Our findings suggest further limitations and enhanced federal oversight may be necessary to protect public health.
Predicting Regulatory Compliance in Beer Advertising on Facebook.
Noel, Jonathan K; Babor, Thomas F
2017-11-01
The prevalence of alcohol advertising has been growing on social media platforms. The purpose of this study was to evaluate alcohol advertising on Facebook for regulatory compliance and thematic content. A total of 50 Budweiser and Bud Light ads posted on Facebook within 1 month of the 2015 NFL Super Bowl were evaluated for compliance with a self-regulated alcohol advertising code and for thematic content. An exploratory sensitivity/specificity analysis was conducted to determine if thematic content could predict code violations. The code violation rate was 82%, with violations prevalent in guidelines prohibiting the association of alcohol with success (Guideline 5) and health benefits (Guideline 3). Overall, 21 thematic content areas were identified. Displaying the product (62%) and adventure/sensation seeking (52%) were the most prevalent. There was perfect specificity (100%) for 10 content areas for detecting any code violation (animals, negative emotions, positive emotions, games/contests/promotions, female characters, minorities, party, sexuality, night-time, sunrise) and high specificity (>80%) for 10 content areas for detecting violations of guidelines intended to protect minors (animals, negative emotions, famous people, friendship, games/contests/promotions, minorities, responsibility messages, sexuality, sunrise, video games). The high prevalence of code violations indicates a failure of self-regulation to prevent potentially harmful content from appearing in alcohol advertising, including explicit code violations (e.g. sexuality). Routine violations indicate an unwillingness to restrict advertising content for public health purposes, and statutory restrictions may be necessary to sufficiently deter alcohol producers from repeatedly violating marketing codes. Violations of a self-regulated alcohol advertising code are prevalent in a sample of beer ads published on Facebook near the US National Football League's Super Bowl. Overall, 16 thematic content areas demonstrated high specificity for code violations. Alcohol advertising codes should be updated to expressly prohibit the use of such content.
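A minimal sketch of the sensitivity/specificity calculation behind the figures above, computed from a 2x2 table of theme presence versus code violation; the counts are hypothetical, not the Budweiser/Bud Light sample.

```python
def sensitivity_specificity(tp: int, fp: int, fn: int, tn: int) -> tuple[float, float]:
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: ads with/without a given theme vs. with/without a violation.
sens, spec = sensitivity_specificity(tp=12, fp=0, fn=29, tn=9)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")  # fp = 0 gives 100% specificity
```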
The complete chloroplast genome sequence of Hibiscus syriacus.
Kwon, Hae-Yun; Kim, Joon-Hyeok; Kim, Sea-Hyun; Park, Ji-Min; Lee, Hyoshin
2016-09-01
The complete chloroplast genome sequence of Hibiscus syriacus L. is presented in this study. The genome is 161 019 bp in length, with a typical circular structure containing a pair of inverted repeats of 25 745 bp separated by a large single-copy region of 89 698 bp and a small single-copy region of 19 831 bp. The overall GC content is 36.8%. One hundred and fourteen genes were annotated, including 81 protein-coding genes, 4 ribosomal RNA genes and 29 transfer RNA genes.
Tembrock, Luke R.; Zheng, Shaoyu; Wu, Zhiqiang
2018-01-01
Qat (Catha edulis, Celastraceae) is a woody evergreen species of great economic and cultural importance. It is cultivated for its stimulant alkaloids cathine and cathinone in East Africa and southwest Arabia. However, genome information, especially DNA sequence resources, for C. edulis is limited, hindering studies of interspecific and intraspecific relationships. Herein, the complete chloroplast (cp) genome of Catha edulis is reported. This genome is 157,960 bp in length with 37% GC content and is structurally arranged into two 26,577 bp inverted repeats and two single-copy regions. The small single-copy and large single-copy regions are 18,491 bp and 86,315 bp in size, respectively. The C. edulis cp genome contains 129 genes, including 37 transfer RNA (tRNA) genes, 8 ribosomal RNA (rRNA) genes, and 84 protein-coding genes. Of these, 112 are single-copy genes and 17 are duplicated in the two inverted regions (seven tRNAs, four rRNAs, and six protein-coding genes). The phylogenetic relationships resolved from the cp genomes of qat and 32 other species confirm the monophyly of Celastraceae. The cp genomes of C. edulis, Euonymus japonicus and seven Celastraceae species lack the rps16 intron, which indicates that an intron loss took place in an ancestor of this family. The cp genome of C. edulis provides a highly valuable genetic resource for further phylogenomic research, barcoding and cp transformation in Celastraceae. PMID:29425128
Informational basis of sensory adaptation: entropy and single-spike efficiency in rat barrel cortex.
Adibi, Mehdi; Clifford, Colin W G; Arabzadeh, Ehsan
2013-09-11
We showed recently that exposure to whisker vibrations enhances coding efficiency in rat barrel cortex despite increasing correlations in variability (Adibi et al., 2013). Here, to understand how adaptation achieves this improvement in sensory representation, we decomposed the stimulus information carried in neuronal population activity into its fundamental components in the framework of information theory. In the context of sensory coding, these components are the entropy of the responses across the entire stimulus set (response entropy) and the entropy of the responses conditional on the stimulus (conditional response entropy). We found that adaptation decreased response entropy and conditional response entropy at both the level of single neurons and the pooled activity of neuronal populations. However, the net effect of adaptation was to increase the mutual information because the drop in the conditional entropy outweighed the drop in the response entropy. The information transmitted by a single spike also increased under adaptation. As population size increased, the information content of individual spikes declined but the relative improvement attributable to adaptation was maintained.
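A minimal sketch of the information-theoretic decomposition described above: response entropy H(R), conditional response entropy H(R|S), and their difference, the mutual information I(S;R), computed from a hypothetical stimulus-response count table.

```python
import numpy as np

def entropies(counts):
    """counts[s, r]: number of trials with stimulus s and response r.

    Returns (response entropy, conditional response entropy, mutual information) in bits.
    """
    p_sr = counts / counts.sum()              # joint probability P(s, r)
    p_s = p_sr.sum(axis=1, keepdims=True)     # stimulus probability P(s)
    p_r = p_sr.sum(axis=0)                    # response probability P(r)

    h_r = -np.sum(p_r[p_r > 0] * np.log2(p_r[p_r > 0]))                        # H(R)
    p_r_given_s = np.divide(p_sr, p_s, out=np.zeros_like(p_sr), where=p_s > 0)
    h_r_given_s = -np.sum(p_sr[p_sr > 0] * np.log2(p_r_given_s[p_sr > 0]))     # H(R|S)
    return h_r, h_r_given_s, h_r - h_r_given_s                                 # I = H(R) - H(R|S)

counts = np.array([[30, 10, 2], [8, 25, 9], [2, 12, 28]], dtype=float)
print(entropies(counts))
```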
NASA Astrophysics Data System (ADS)
Mylnikova, Anna; Yasyukevich, Yury; Yasyukevich, Anna
2017-04-01
We have developed a technique for estimating vertical total electron content (TEC) and differential code biases (DCBs) using data from a single GPS/GLONASS station. The algorithm is based on TEC expansion into a Taylor series in space and time (TayAbsTEC). We validated the technique using Global Ionospheric Maps (GIM) computed by the Center for Orbit Determination in Europe (CODE) and the Jet Propulsion Laboratory (JPL). We compared the differences between absolute vertical TEC (VTEC) from GIM and VTEC evaluated by TayAbsTEC for 2009 (solar activity minimum, sunspot number about 0) and for 2014 (solar activity maximum, sunspot number 110). Since VTEC from CODE and VTEC from JPL differ, we compared TayAbsTEC VTEC with both of them and found that TayAbsTEC VTEC is closer to CODE VTEC than to JPL VTEC. The difference between TayAbsTEC VTEC and GIM VTEC is more pronounced at solar maximum (2014) than at solar minimum (2009) for both CODE and JPL. The distribution of VTEC differences is close to Gaussian, so we conclude that TayAbsTEC results are in agreement with GIM VTEC. We also compared DCBs evaluated by TayAbsTEC with DCBs from GIM computed by CODE. The TayAbsTEC DCBs are in good agreement with CODE DCBs for GPS satellites, but differ noticeably for GLONASS. We used the DCBs to correct slant TEC in order to determine which DCBs give better results: correction with CODE DCBs produces negative, nonphysical TEC values, whereas correction with TayAbsTEC DCBs does not produce such artifacts. The technique can compute VTEC and DCBs from local GPS/GLONASS network data alone, and the resulting VTEC is provided in the GIM framework, which is convenient for subsequent data analyses.
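A heavily simplified sketch of the idea behind single-station estimation of this kind follows: slant TEC observations are modeled as a low-order Taylor (polynomial) expansion of vertical TEC in time and latitude offset, scaled by an ionospheric mapping function, plus a per-satellite bias term absorbing the DCBs. The mapping function and design matrix below are generic textbook choices, not the actual TayAbsTEC formulation.

```python
import numpy as np

def mapping_function(elev_rad, h_ion=450e3, r_earth=6371e3):
    """Single-layer slant-to-vertical obliquity factor (generic choice)."""
    sin_z = np.cos(elev_rad) * r_earth / (r_earth + h_ion)
    return 1.0 / np.sqrt(1.0 - sin_z**2)

def fit_vtec_and_biases(stec, elev, dt, dlat, sat_idx, n_sat, order=2):
    """Least-squares fit of VTEC Taylor coefficients plus per-satellite biases.
    stec     : observed slant TEC (TECU), one value per observation
    elev     : satellite elevation angles (rad)
    dt, dlat : time and latitude offsets of the ionospheric pierce points
    sat_idx  : integer satellite index per observation (0..n_sat-1)"""
    mf = mapping_function(elev)
    # VTEC Taylor terms in (dt, dlat), each scaled by the mapping function.
    terms = [mf * dt**i * dlat**j
             for i in range(order + 1) for j in range(order + 1 - i)]
    # One bias column per satellite (combined receiver+satellite bias).
    bias_cols = [(sat_idx == s).astype(float) for s in range(n_sat)]
    A = np.column_stack(terms + bias_cols)
    coeffs, *_ = np.linalg.lstsq(A, stec, rcond=None)
    return coeffs[:len(terms)], coeffs[len(terms):]  # Taylor coeffs, biases
```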
The Model 9977 Radioactive Material Packaging Primer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abramczyk, G.
2015-10-09
The Model 9977 Packaging is a single containment drum style radioactive material (RAM) shipping container designed, tested and analyzed to meet the performance requirements of Title 10 of the Code of Federal Regulations, Part 71. A radioactive material shipping package, in combination with its contents, must perform three functions (note that the performance criteria specified in the Code of Federal Regulations have alternate limits for normal operations and post-accident conditions): Containment, the package must “contain” the radioactive material within it; Shielding, the packaging must limit its users and the public to radiation doses within specified limits; and Subcriticality, the package must maintain its radioactive material as subcritical.
Guimaraes, Ana M S; Toth, Balazs; Santos, Andrea P; do Nascimento, Naíla C; Kritchevsky, Janice E; Messick, Joanne B
2012-11-01
We report the complete genome sequence of "Candidatus Mycoplasma haemolamae," an endemic red-cell pathogen of camelids. The single, circular chromosome has 756,845 bp, a 39.3% G+C content, and 925 coding sequences (CDSs). A great proportion (49.1%) of these CDSs are organized into paralogous gene families, which can now be further explored with regard to antigenic variation.
Tiwari, Sandeep; Jamal, Syed Babar; Oliveira, Leticia Castro; Clermont, Dominique; Bizet, Chantal; Mariano, Diego; de Carvalho, Paulo Vinicius Sanches Daltro; Souza, Flavia; Pereira, Felipe Luiz; de Castro Soares, Siomar; Guimarães, Luis C; Dorella, Fernanda; Carvalho, Alex; Leal, Carlos; Barh, Debmalya; Figueiredo, Henrique; Hassan, Syed Shah; Azevedo, Vasco; Silva, Artur
2016-08-11
In this work, we describe a set of features of Corynebacterium auriscanis CIP 106629 and details of the draft genome sequence and annotation. The genome consists of a single circular chromosome of 2.5 Mbp containing 1,797 protein-coding genes, 5 rRNA genes, 50 tRNA genes, and 403 pseudogenes, with a G+C content of 58.50%. Copyright © 2016 Tiwari et al.
Can you build an iPhone app without writing a single line of code?
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Maskey, M.
2011-12-01
At the last ESIP summer meeting, a study was conducted to explore different commercial tools now available that allow one to create a mobile app without writing a single line of code. The proposed research comprised two components. First, systematically evaluate different tools for creating mobile apps along the dimensions of features and price. Second, create an iPhone app prototype for the ESIP community using some of these tools. The initial assessment classified the currently available mobile app creation tools into two categories. The tools in the first category require no programming, but the content for the mobile apps is fed to them either via a website RSS feed or entered manually; consequently, these tools support only limited user interactivity. These tools follow the business model of website hosting services, which offers end users a set of templates with limited customization features for creating content to publish to websites. The second category of tools requires programming, but the code can be written in popular languages such as JavaScript (compatible with most mobile platforms) rather than mobile-app-specific languages. For the second component of the study, two ESIP iPhone app prototypes were created. The first prototype required no programming and used the AppMakr tool. Objective-C was used to create the second iPhone prototype from scratch, and the source code for this prototype is available on the ESIP website. The study concluded that existing tools do make it easy to create a simple mobile app, especially if one already has a well-designed website. The associated costs are reasonable but not cheap. However, if the mobile app requires interactivity and specialized customization, then one needs to work with a mobile app developer.
Quality Scalability Aware Watermarking for Visual Content.
Bhowmik, Deepayan; Abhayaratne, Charith
2016-11-01
Scalable coding-based content adaptation poses serious challenges to traditional watermarking algorithms, which do not consider the scalable coding structure and hence cannot guarantee correct watermark extraction in the media consumption chain. In this paper, we propose a novel concept of scalable blind watermarking that ensures more robust watermark extraction at various compression ratios while not affecting the visual quality of the host media. The proposed algorithm generates a scalable and robust watermarked image code-stream that allows the user to constrain embedding distortion for target content adaptations. The watermarked image code-stream consists of hierarchically nested joint distortion-robustness coding atoms and is generated by a new wavelet-domain blind watermarking algorithm guided by a quantization-based binary tree. The code-stream can be truncated at any distortion-robustness atom to generate the watermarked image with the desired distortion-robustness requirements. A blind extractor is capable of extracting the watermark data from the watermarked images. The algorithm is further extended to incorporate the bit-plane discarding-based quantization model used in scalable coding-based content adaptation, e.g., JPEG2000, which improves the robustness against quality scalability of JPEG2000 compression. The simulation results verify the feasibility of the proposed concept, its applications, and its improved robustness against quality scalable content adaptation. The proposed algorithm also outperforms existing methods, showing a 35% improvement. In terms of robustness to quality scalable video content adaptation using Motion JPEG2000 and wavelet-based scalable video coding, the proposed method shows a major improvement for video watermarking.
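For readers unfamiliar with blind quantization-based embedding, the sketch below shows a generic quantization-index-modulation (QIM) embed/extract pair, in which each coefficient is quantized onto one of two interleaved lattices. It illustrates only the blind-extraction principle; it is not the tree-structured, distortion-robustness-atom algorithm of the paper, and the step size is an arbitrary placeholder.

```python
import numpy as np

def qim_embed(coeffs, bits, delta=8.0):
    """Embed one bit per coefficient by quantizing onto a lattice offset
    by 0 (bit 0) or delta/2 (bit 1)."""
    coeffs = np.asarray(coeffs, dtype=float)
    offset = np.asarray(bits) * (delta / 2.0)
    return np.round((coeffs - offset) / delta) * delta + offset

def qim_extract(coeffs, delta=8.0):
    """Blind extraction: choose the lattice (offset 0 or delta/2) nearer to
    each received coefficient."""
    coeffs = np.asarray(coeffs, dtype=float)
    d0 = np.abs(coeffs - np.round(coeffs / delta) * delta)
    d1 = np.abs(coeffs - (np.round((coeffs - delta / 2) / delta) * delta + delta / 2))
    return (d1 < d0).astype(int)

# Round-trip check with mild noise standing in for lossy re-encoding.
rng = np.random.default_rng(0)
c = rng.normal(0.0, 20.0, 64)            # stand-in for wavelet coefficients
b = rng.integers(0, 2, 64)
recovered = qim_extract(qim_embed(c, b) + rng.normal(0.0, 1.0, 64))
print("bit errors:", int(np.sum(recovered != b)))
```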
The complete chloroplast genome sequence of the medicinal plant Andrographis paniculata.
Ding, Ping; Shao, Yanhua; Li, Qian; Gao, Junli; Zhang, Runjing; Lai, Xiaoping; Wang, Deqin; Zhang, Huiye
2016-07-01
The complete chloroplast genome of Andrographis paniculata, an important medicinal plant with great economic value, has been studied in this article. The genome is 150,249 bp in length, with 38.3% GC content. A pair of inverted repeats (IRs, 25,300 bp) is separated by a large single-copy region (LSC, 82,459 bp) and a small single-copy region (SSC, 17,190 bp). The chloroplast genome contains 114 unique genes, including 80 protein-coding genes, 30 tRNA genes and 4 rRNA genes. Of these genes, 15 contain one intron and 3 contain two introns.
The complete chloroplast genome sequence of Dianthus superbus var. longicalycinus.
Gurusamy, Raman; Lee, Do-Hyung; Park, SeonJoo
2016-05-01
The complete chloroplast genome (cpDNA) sequence of Dianthus superbus var. longicalycinus, an economically important traditional Chinese medicinal plant, was reported and characterized. The cpDNA of Dianthus superbus var. longicalycinus is 149,539 bp, with 36.3% GC content. A pair of inverted repeats (IRs) of 24,803 bp is separated by a large single-copy region (LSC, 82,805 bp) and a small single-copy region (SSC, 17,128 bp). It encodes 85 protein-coding genes, 36 tRNA genes and 8 rRNA genes. Of the 129 individual genes, 13 contain one intron and three contain two introns.
The complete chloroplast genome sequence of Dendrobium nobile.
Yan, Wenjin; Niu, Zhitao; Zhu, Shuying; Ye, Meirong; Ding, Xiaoyu
2016-11-01
The complete chloroplast (cp) genome sequence of Dendrobium nobile, an endangered traditional Chinese medicinal plant with important economic value, is presented in this article. The total genome size is 150,793 bp, containing a large single-copy (LSC) region (84,939 bp) and a small single-copy (SSC) region (13,310 bp), which are separated by two inverted repeat (IR) regions (26,272 bp each). The overall GC content of the plastid genome is 38.8%. In total, 130 unique genes were annotated, consisting of 76 protein-coding genes, 30 tRNA genes and 4 rRNA genes. Fourteen genes contain one or two introns.
The complete chloroplast genome sequence of Curcuma flaviflora (Curcuma).
Zhang, Yan; Deng, Jiabin; Li, Yangyi; Gao, Gang; Ding, Chunbang; Zhang, Li; Zhou, Yonghong; Yang, Ruiwu
2016-09-01
The complete chloroplast (cp) genome of Curcuma flaviflora, a medicinal plant of Southeast Asia, was sequenced. The genome is 160 478 bp in length, with 36.3% GC content. A pair of inverted repeats (IRs) of 26 946 bp is separated by a large single-copy (LSC) region of 88 008 bp and a small single-copy (SSC) region of 18 578 bp. The cp genome contains 132 annotated genes, including 79 protein-coding genes, 30 tRNA genes, and four rRNA genes; 19 of these genes are duplicated in the inverted repeat regions.
Segmentation-driven compound document coding based on H.264/AVC-INTRA.
Zaghetto, Alexandre; de Queiroz, Ricardo L
2007-07-01
In this paper, we explore H.264/AVC operating in intraframe mode to compress a mixed image, i.e., one composed of text, graphics, and pictures. Even though mixed-content (compound) documents usually require multiple compressors, we apply a single compressor to both text and pictures. For that, distortion is treated differently in text and picture regions. Our approach is to use a segmentation-driven adaptation strategy to change the H.264/AVC quantization parameter on a macroblock-by-macroblock basis, i.e., we divert bits from pictorial regions to text in order to keep text edges sharp. We show results of the segmentation-driven quantizer adaptation method applied to compress documents. Our reconstructed images have better text sharpness compared with straight unadapted coding, at negligible visual losses in pictorial regions. Our results also highlight the fact that H.264/AVC-INTRA outperforms coders such as JPEG-2000 as a single coder for compound images.
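The rate-allocation idea described above can be sketched as a per-macroblock quantization-parameter (QP) map: blocks classified as text receive a lower QP (more bits) than pictorial blocks. The variance-based text detector, thresholds, and QP values below are illustrative placeholders, not the segmentation used in the paper.

```python
import numpy as np

def assign_qp(gray_image, base_qp=32, text_qp=24, block=16, var_thresh=900.0):
    """Build a per-macroblock QP map from a grayscale image (2-D array).
    High-variance (high-contrast) blocks are treated as text/graphics and
    given a lower QP so their edges stay sharp."""
    h, w = gray_image.shape
    qp_map = np.full((h // block, w // block), base_qp, dtype=int)
    for by in range(h // block):
        for bx in range(w // block):
            mb = gray_image[by*block:(by+1)*block,
                            bx*block:(bx+1)*block].astype(float)
            if mb.var() > var_thresh:        # crude text/graphics heuristic
                qp_map[by, bx] = text_qp     # divert bits toward text regions
    return qp_map

# Example with a synthetic image: noisy "text" stripe on a smooth background.
img = np.full((64, 64), 128, dtype=np.uint8)
img[16:32, :] = np.random.default_rng(1).integers(0, 2, (16, 64)) * 255
print(assign_qp(img))
```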
Wang, Shuo; Gao, Li-Zhi
2016-09-01
The complete chloroplast genome of green foxtail (Setaria viridis), a promising model system for C4 photosynthesis, is first reported in this study. The genome harbors a large single-copy (LSC) region of 81 016 bp and a small single-copy (SSC) region of 12 456 bp separated by a pair of inverted repeat (IRa and IRb) regions of 22 315 bp each. The GC content is 38.92%. Coding sequences account for 57.97% of the genome and comprise 111 unique genes (19 duplicated in the IR regions), of which 71 are protein-coding genes, four are rRNA genes, and 36 are tRNA genes. Phylogenetic analysis indicated that S. viridis clusters with the cultivated species S. italica in the tribe Paniceae of the family Poaceae. This newly determined chloroplast genome will provide a valuable genetic resource for future studies on C4 photosynthesis in grasses.
Content-based multiple bitstream image transmission over noisy channels.
Cao, Lei; Chen, Chang Wen
2002-01-01
In this paper, we propose a novel combined source and channel coding scheme for image transmission over noisy channels. The main feature of the proposed scheme is a systematic decomposition of image sources so that unequal error protection can be applied according to not only bit error sensitivity but also visual content importance. The wavelet transform is adopted to hierarchically decompose the image. The association between the wavelet coefficients and what they represent spatially in the original image is fully exploited, so that wavelet blocks are classified based on their corresponding image content. The classification produces wavelet blocks in each class with similar content and statistics, which enables high-performance source compression using the set partitioning in hierarchical trees (SPIHT) algorithm. To combat channel noise, an unequal error protection strategy with rate-compatible punctured convolutional/cyclic redundancy check (RCPC/CRC) codes is implemented based on each bit's contribution to both peak signal-to-noise ratio (PSNR) and visual quality. At the receiving end, a postprocessing method making use of the SPIHT decoding structure and the classification map is developed to restore the degradation due to residual errors after channel decoding. Experimental results show that the proposed scheme is indeed able to protect both the bits that are more sensitive to errors and the more important visual content under a noisy transmission environment. In particular, the reconstructed images show consistently better visual quality than those from single-bitstream-based schemes.
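A schematic of the unequal-error-protection step: bitstream segments are ranked by their estimated contribution to reconstructed quality and mapped to progressively weaker channel-code rates. The segment names, importance values, and available rates below are placeholders for illustration, not the RCPC/CRC design of the paper.

```python
def allocate_protection(segments, rates=(1/3, 1/2, 2/3, 4/5)):
    """Map bitstream segments to channel-code rates by importance.
    segments : list of (name, importance) pairs, where importance estimates
               the quality loss (e.g., PSNR drop) if the segment is corrupted
    rates    : available code rates, strongest (most redundancy) first"""
    ranked = sorted(segments, key=lambda s: s[1], reverse=True)
    plan = {}
    for i, (name, _) in enumerate(ranked):
        level = min(i * len(rates) // max(len(ranked), 1), len(rates) - 1)
        plan[name] = rates[level]
    return plan

# Hypothetical segments: headers and coarse coefficients matter most.
print(allocate_protection([("header", 40.0), ("class-map", 30.0),
                           ("coarse-coeffs", 20.0), ("refinement-bits", 5.0)]))
```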
Lewkowitz, Adam K; O'Donnell, Betsy E; Nakagawa, Sanae; Vargas, Juan E; Zlatnik, Marya G
2016-03-01
Text4baby is the only free text-message program available for pregnancy. Our objective was to determine whether content differed between Text4baby and popular pregnancy smartphone applications (apps). Researchers enrolled in Text4baby in 2012 and downloaded the four most popular free pregnancy smartphone apps in July 2013; content was re-extracted in February 2014. Messages were assigned thematic codes. Two researchers coded messages independently before reviewing all the codes jointly to ensure consistency. Logistic regression modeling determined statistical differences between Text4baby and the smartphone apps. In total, 1399 messages were delivered. Of these, 333 messages had content related to more than one theme and were coded as such, resulting in 1820 codes analyzed. Compared with the smartphone apps, Text4baby was significantly more likely to have content regarding Postpartum Planning, Seeking Care, Recruitment and Prevention, and significantly less likely to mention Normal Pregnancy Symptoms. No messaging program included content regarding postpartum contraception. To improve content without increasing the number of text messages, Text4baby could replace messages on recruitment with messages regarding normal pregnancy symptoms, fetal development and postpartum contraception.
Sorimachi, Kenji; Okayasu, Teiji
2015-01-01
The complete vertebrate mitochondrial genome contains 13 protein-coding genes. We used this genome to investigate the existence of natural selection in vertebrate evolution. From the complete mitochondrial genomes, we determined nucleotide contents and then separated these values into coding and non-coding regions. When the nucleotide contents of a coding or non-coding region were plotted against the nucleotide content of the complete mitochondrial genome, we obtained linear regression lines only between homonucleotides and their analogs. On every plot using the purine contents (G or A), G content in aquatic vertebrates was higher than in terrestrial vertebrates, while A content in aquatic vertebrates was lower than in terrestrial vertebrates. Based on these relationships, vertebrates were separated into two groups, terrestrial and aquatic. However, using the pyrimidine contents (C or T), no clear separation between these two groups was obtained. The hagfish (Eptatretus burgeri) was further separated from both terrestrial and aquatic vertebrates. Based on these results, nucleotide content relationships predicted from complete vertebrate mitochondrial genomes reveal the existence of natural selection underlying the evolutionary separation between terrestrial and aquatic vertebrate groups. In addition, we propose that the separation of the two groups might be linked to ammonia detoxification, based on high G and low A contents, which encode Glu-rich and Lys-poor proteins.
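The content-versus-content regressions described above reduce to computing per-genome base fractions and fitting a line. The sketch below shows that computation on synthetic values; the numbers are invented and carry no biological meaning.

```python
import numpy as np

def nucleotide_fractions(seq: str) -> dict:
    """Fractions of A, T, G, C in a nucleotide sequence."""
    seq = seq.upper()
    return {base: seq.count(base) / len(seq) for base in "ATGC"}

def fit_line(x, y):
    """Ordinary least-squares slope and intercept of y against x."""
    slope, intercept = np.polyfit(x, y, 1)
    return slope, intercept

# Synthetic example: whole-genome vs coding-region G fractions for four taxa.
whole_g  = np.array([0.13, 0.15, 0.17, 0.19])   # hypothetical values
coding_g = np.array([0.12, 0.14, 0.16, 0.18])
print(fit_line(whole_g, coding_g))
```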
Health and nutrition content claims on Australian fast-food websites.
Wellard, Lyndal; Koukoumas, Alexandra; Watson, Wendy L; Hughes, Clare
2017-03-01
To determine the extent to which Australian fast-food websites contain nutrition content and health claims, and whether these claims comply with the new provisions of the Australia New Zealand Food Standards Code ('the Code'). Systematic content analysis of all web pages to identify nutrition content and health claims. Nutrition information panels were used to determine whether products carrying claims met the Nutrient Profiling Scoring Criteria (NPSC) and qualifying criteria, and were compared against the Code to determine compliance. Australian websites of forty-four fast-food chains, including meal, bakery, ice cream, beverage and salad chains. Any products marketed on the websites using health or nutrition content claims. Of the forty-four fast-food websites, twenty (45 %) carried at least one claim. A total of 2094 claims were identified on 371 products, including 1515 nutrition content claims (72 %) and 579 health claims (28 %). Five products carrying health claims (5 %) and 157 products carrying nutrition content claims (43 %) did not meet the requirements of the Code that would allow them to carry such claims. New provisions in the Code came into effect in January 2016 after a 3-year transition. Food regulatory agencies should review fast-food websites to ensure compliance with the qualifying criteria for nutrition content and health claim regulations. This would prevent consumers from viewing unhealthy foods as healthier choices. Healthy choices could be facilitated by applying the NPSC to nutrition content claims. Fast-food chains should be educated on the requirements of the Code regarding claims.
Adolescents' self-presentation on a teen dating web site: a risk-content analysis.
Pujazon-Zazik, Melissa A; Manasse, Stephanie M; Orrell-Valente, Joan K
2012-05-01
To analyze adolescents' profiles on MyLol.net, a teen dating Web site, for risk content. We hypothesized that risk content would vary by age and gender. We selected and coded 752 publicly viewable profiles of adolescents aged 14-18 years for the following five risks: sex, alcohol, drugs, cigarettes, and violence. Of the total profiles, 27.7% contained risk-related content: 15.8% sexual behavior, 13.8% alcohol use, 1.6% drug use, 6.8% cigarette smoking, and 0.9% violent activity. Being female, "single" relationship status, and use of profanity (p < .05) were associated with risk content. Females' profiles were the most likely to include risky content, especially sexual content. Adolescent females who have internalized social norms that place a high value on female sexuality may reflect this in their online profiles. Online mention of interest or involvement in risky behavior may have negative consequences (e.g., cyberbullies and sexual predators). Stronger universal Internet policies and education are needed to help protect adolescents. Copyright © 2012 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
Toth, Balazs; Santos, Andrea P.; do Nascimento, Naíla C.; Kritchevsky, Janice E.
2012-01-01
We report the complete genome sequence of “Candidatus Mycoplasma haemolamae,” an endemic red-cell pathogen of camelids. The single, circular chromosome has 756,845 bp, a 39.3% G+C content, and 925 coding sequences (CDSs). A great proportion (49.1%) of these CDSs are organized into paralogous gene families, which can now be further explored with regard to antigenic variation. PMID:23105057
The complete chloroplast genome of Sinopodophyllum hexandrum Ying (Berberidaceae).
Meng, Lihua; Liu, Ruijuan; Chen, Jianbing; Ding, Chenxu
2017-05-01
The complete nucleotide sequence of the Sinopodophyllum hexandrum Ying chloroplast genome (cpDNA) was determined based on next-generation sequencing technologies in this study. The genome is 157 203 bp in length, containing a pair of inverted repeat (IRa and IRb) regions of 25 960 bp each, which are separated by a large single-copy (LSC) region of 87 065 bp and a small single-copy (SSC) region of 18 218 bp. The cpDNA contains 148 genes, including 96 protein-coding genes, 8 ribosomal RNA genes, and 44 tRNA genes. Of these genes, eight harbor a single intron and two (ycf3 and clpP) contain two introns. The AT content of the S. hexandrum cpDNA is 61.5%.
An audit of alcohol brand websites.
Gordon, Ross
2011-11-01
The study investigated the nature and content of alcohol brand websites in the UK. The research involved an audit of the websites of the 10 leading alcohol brands by sales in the UK across four categories: lager, spirits, Flavoured Alcoholic Beverages and cider/perry. Each site was visited twice over a 1-month period with site features and content recorded using a pro-forma. The content of websites was then reviewed against the regulatory codes governing broadcast advertising of alcohol. It was found that 27 of 40 leading alcohol brands had a dedicated website. Sites featured sophisticated content, including sports and music sections, games, downloads and competitions. Case studies of two brand websites demonstrate the range of content features on such sites. A review of the application of regulatory codes covering traditional advertising found some content may breach the codes. Study findings illustrate the sophisticated range of content accessible on alcohol brand websites. When applying regulatory codes covering traditional alcohol marketing channels it is apparent that some content on alcohol brand websites would breach the codes. This suggests the regulation of alcohol brand websites may be an issue requiring attention from policymakers. Further research in this area would help inform this process. © 2010 Australasian Professional Society on Alcohol and other Drugs.
Chan, Kin
2018-01-01
Mutations are permanent alterations to the coding content of DNA. They are starting material for the Darwinian evolution of species by natural selection, which has yielded an amazing diversity of life on Earth. Mutations can also be the fundamental basis of serious human maladies, most notably cancers. In this chapter, I describe a highly sensitive reporter system for the molecular genetic analysis of mutagenesis, featuring controlled generation of long stretches of single-stranded DNA in budding yeast cells. This system is ~100- to ~1000-fold more susceptible to mutation than conventional double-stranded DNA reporters, and is well suited for generating large mutational datasets to investigate the properties of mutagens.
Ciliates learn to diagnose and correct classical error syndromes in mating strategies
Clark, Kevin B.
2013-01-01
Preconjugal ciliates learn classical repetition error-correction codes to safeguard mating messages and replies from corruption by “rivals” and local ambient noise. Because individual cells behave as memory channels with Szilárd engine attributes, these coding schemes also might be used to limit, diagnose, and correct mating-signal errors due to noisy intracellular information processing. The present study, therefore, assessed whether heterotrich ciliates effect fault-tolerant signal planning and execution by modifying engine performance, and consequently entropy content of codes, during mock cell–cell communication. Socially meaningful serial vibrations emitted from an ambiguous artificial source initiated ciliate behavioral signaling performances known to advertise mating fitness with varying courtship strategies. Microbes, employing calcium-dependent Hebbian-like decision making, learned to diagnose then correct error syndromes by recursively matching Boltzmann entropies between signal planning and execution stages via “power” or “refrigeration” cycles. All eight serial contraction and reversal strategies incurred errors in entropy magnitude by the execution stage of processing. Absolute errors, however, subtended expected threshold values for single bit-flip errors in three-bit replies, indicating coding schemes protected information content throughout signal production. Ciliate preparedness for vibrations selectively and significantly affected the magnitude and valence of Szilárd engine performance during modal and non-modal strategy corrective cycles. But entropy fidelity for all replies mainly improved across learning trials as refinements in engine efficiency. Fidelity neared maximum levels for only modal signals coded in resilient three-bit repetition error-correction sequences. Together, these findings demonstrate microbes can elevate survival/reproductive success by learning to implement classical fault-tolerant information processing in social contexts. PMID:23966987
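The three-bit repetition code mentioned above is the simplest classical error-correction scheme: each message bit is sent three times and decoded by majority vote, which corrects any single bit-flip per block. The sketch below shows that textbook scheme only and has no connection to the ciliate experiments themselves.

```python
def encode_repetition(bits, n=3):
    """Repeat each message bit n times."""
    return [b for b in bits for _ in range(n)]

def decode_repetition(coded, n=3):
    """Majority-vote decoding; corrects any single bit-flip within a block."""
    return [1 if sum(coded[i:i + n]) > n // 2 else 0
            for i in range(0, len(coded), n)]

msg = [1, 0, 1]
tx = encode_repetition(msg)           # [1, 1, 1, 0, 0, 0, 1, 1, 1]
tx[4] ^= 1                            # flip one bit in the second block
assert decode_repetition(tx) == msg   # the single flip is corrected
```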
Technical Support Document for Version 3.4.0 of the COMcheck Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan
2007-09-14
COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards.
Xuan, Ziming; Damon, Donna; Noel, Jonathan
2013-01-01
Objectives. We evaluated advertising code violations using the US Beer Institute guidelines for responsible advertising. Methods. We applied the Delphi rating technique to all beer ads (n = 289) broadcast in national markets between 1999 and 2008 during the National Collegiate Athletic Association basketball tournament games. Fifteen public health professionals completed ratings using quantitative scales measuring the content of alcohol advertisements (e.g., perceived actor age, portrayal of excessive drinking) according to 1997 and 2006 versions of the Beer Institute Code. Results. Depending on the code version, exclusion criteria, and scoring method, expert raters found that between 35% and 74% of the ads had code violations. There were significant differences among producers in the frequency with which ads with violations were broadcast, but not in the proportions of unique ads with violations. Guidelines most likely to be violated included the association of beer drinking with social success and the use of content appealing to persons younger than 21 years. Conclusions. The alcohol industry’s current self-regulatory framework is ineffective at preventing content violations but could be improved by the use of new rating procedures designed to better detect content code violations. PMID:23947318
Babor, Thomas F; Xuan, Ziming; Damon, Donna; Noel, Jonathan
2013-10-01
We evaluated advertising code violations using the US Beer Institute guidelines for responsible advertising. We applied the Delphi rating technique to all beer ads (n = 289) broadcast in national markets between 1999 and 2008 during the National Collegiate Athletic Association basketball tournament games. Fifteen public health professionals completed ratings using quantitative scales measuring the content of alcohol advertisements (e.g., perceived actor age, portrayal of excessive drinking) according to 1997 and 2006 versions of the Beer Institute Code. Depending on the code version, exclusion criteria, and scoring method, expert raters found that between 35% and 74% of the ads had code violations. There were significant differences among producers in the frequency with which ads with violations were broadcast, but not in the proportions of unique ads with violations. Guidelines most likely to be violated included the association of beer drinking with social success and the use of content appealing to persons younger than 21 years. The alcohol industry's current self-regulatory framework is ineffective at preventing content violations but could be improved by the use of new rating procedures designed to better detect content code violations.
Compression performance of HEVC and its format range and screen content coding extensions
NASA Astrophysics Data System (ADS)
Li, Bin; Xu, Jizheng; Sullivan, Gary J.
2015-09-01
This paper presents a comparison-based test of the objective compression performance of the High Efficiency Video Coding (HEVC) standard, its format range extensions (RExt), and its draft screen content coding extensions (SCC). The current dominant standard, H.264/MPEG-4 AVC, is used as an anchor reference in the comparison. The conditions used for the comparison tests were designed to reflect relevant application scenarios and to enable a fair comparison to the maximum extent feasible - i.e., using comparable quantization settings, reference frame buffering, intra refresh periods, rate-distortion optimization decision processing, etc. It is noted that such PSNR-based objective comparisons generally provide more conservative estimates of HEVC benefit than are found in subjective studies. The experimental results show that, when compared with H.264/MPEG-4 AVC, HEVC version 1 provides a bit rate savings for equal PSNR of about 23% for all-intra coding, 34% for random access coding, and 38% for low-delay coding. This is consistent with prior studies and the general characterization that HEVC can provide a bit rate savings of about 50% for equal subjective quality for most applications. The HEVC format range extensions provide a similar bit rate savings of about 13-25% for all-intra coding, 28-33% for random access coding, and 32-38% for low-delay coding at different bit rate ranges. For lossy coding of screen content, the HEVC screen content coding extensions achieve a bit rate savings of about 66%, 63%, and 61% for all-intra coding, random access coding, and low-delay coding, respectively. For lossless coding, the corresponding bit rate savings are about 40%, 33%, and 32%, respectively.
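Bit-rate savings of the kind quoted above are conventionally reported as Bjøntegaard-delta (BD) rate differences between two rate-distortion curves. The sketch below follows the commonly used cubic-fit formulation of that metric; it is a generic utility with invented rate-PSNR points, not the script used to produce the paper's numbers.

```python
import numpy as np

def bd_rate(rate_anchor, psnr_anchor, rate_test, psnr_test):
    """Average bit-rate difference (%) of a test codec versus an anchor,
    using the standard Bjoentegaard cubic fit of log-rate against PSNR.
    Negative values mean the test codec saves bit rate at equal PSNR."""
    log_r_a, log_r_t = np.log(rate_anchor), np.log(rate_test)
    p_a = np.polyfit(psnr_anchor, log_r_a, 3)
    p_t = np.polyfit(psnr_test, log_r_t, 3)
    lo = max(min(psnr_anchor), min(psnr_test))   # overlapping PSNR range
    hi = min(max(psnr_anchor), max(psnr_test))
    int_a = np.polyval(np.polyint(p_a), hi) - np.polyval(np.polyint(p_a), lo)
    int_t = np.polyval(np.polyint(p_t), hi) - np.polyval(np.polyint(p_t), lo)
    avg_log_diff = (int_t - int_a) / (hi - lo)
    return (np.exp(avg_log_diff) - 1.0) * 100.0

# Hypothetical four-point RD curves (kbps, dB) for an anchor and a test codec.
print(bd_rate([1000, 2000, 4000, 8000], [34.0, 36.5, 39.0, 41.5],
              [ 700, 1400, 2800, 5600], [34.2, 36.8, 39.3, 41.8]))
```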
A Content Analysis of Testosterone Websites: Sex, Muscle, and Male Age-Related Thematic Differences
Ivanov, Nicholas; Vuong, Jimmy; Gray, Peter B.
2017-01-01
Male testosterone supplementation is a large and growing industry. How is testosterone marketed to male consumers online? The present exploratory study entailed a content coding analysis of the home pages of 49 websites focused on testosterone supplementation for men in the United States. Four hypotheses concerning anticipated age-related differences in content coding were also tested: more frequent longevity content toward older men, and more frequent social dominance/physical formidability, muscle, and sex content toward younger men. Codes were created based on inductive observations and drawing upon the medical, life history, and human behavioral endocrinology literatures. Approximately half (n = 24) of websites were oriented toward younger men (estimated audience of men 40 years of age or younger) and half (n = 25) toward older men (estimated audience over 40 years of age). Results indicated that the most frequent content codes concerned online sales (e.g., product and purchasing information). Apart from sales information, the most frequent codes concerned, in order, muscle, sex/sexual functioning, low T, energy, fat, strength, aging, and well-being, with all four hypotheses also supported. These findings are interpreted in the light of medical, evolutionary life history, and human behavioral endocrinology approaches. PMID:29025355
A Content Analysis of Testosterone Websites: Sex, Muscle, and Male Age-Related Thematic Differences.
Ivanov, Nicholas; Vuong, Jimmy; Gray, Peter B
2018-03-01
Male testosterone supplementation is a large and growing industry. How is testosterone marketed to male consumers online? The present exploratory study entailed a content coding analysis of the home pages of 49 websites focused on testosterone supplementation for men in the United States. Four hypotheses concerning anticipated age-related differences in content coding were also tested: more frequent longevity content toward older men, and more frequent social dominance/physical formidability, muscle, and sex content toward younger men. Codes were created based on inductive observations and drawing upon the medical, life history, and human behavioral endocrinology literatures. Approximately half (n = 24) of websites were oriented toward younger men (estimated audience of men 40 years of age or younger) and half (n = 25) toward older men (estimated audience over 40 years of age). Results indicated that the most frequent content codes concerned online sales (e.g., product and purchasing information). Apart from sales information, the most frequent codes concerned, in order, muscle, sex/sexual functioning, low T, energy, fat, strength, aging, and well-being, with all four hypotheses also supported. These findings are interpreted in the light of medical, evolutionary life history, and human behavioral endocrinology approaches.
Coding for Single-Line Transmission
NASA Technical Reports Server (NTRS)
Madison, L. G.
1983-01-01
A digital transmission code combines data and clock signals into a single waveform. MADCODE needs four standard integrated circuits in the generator and converter plus five small discrete components. MADCODE allows simple coding and decoding for transmission of digital signals over a single line.
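The abstract does not spell out the MADCODE waveform itself. As a generic illustration of folding data and clock onto a single line, the sketch below implements Manchester encoding, in which every bit cell contains a mid-cell transition that carries the clock; it is offered as an analogy, not as the MADCODE scheme.

```python
def manchester_encode(bits):
    """Encode bits so every bit cell has a mid-cell transition (the clock).
    Convention used here: 0 -> high, low ; 1 -> low, high."""
    waveform = []
    for b in bits:
        waveform += [0, 1] if b else [1, 0]
    return waveform

def manchester_decode(waveform):
    """Recover bits from an aligned Manchester stream (two samples per bit)."""
    return [1 if waveform[i] == 0 else 0 for i in range(0, len(waveform), 2)]

data = [1, 0, 1, 1, 0]
line = manchester_encode(data)
assert manchester_decode(line) == data
```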
Subjective quality evaluation of low-bit-rate video
NASA Astrophysics Data System (ADS)
Masry, Mark; Hemami, Sheila S.; Osberger, Wilfried M.; Rohaly, Ann M.
2001-06-01
A subjective quality evaluation was performed to quantify viewer responses to visual defects that appear in low-bit-rate video at full and reduced frame rates. The stimuli were eight sequences compressed by three motion-compensated encoders - Sorenson Video, H.263+ and a wavelet-based coder - operating at five bit/frame rate combinations. The stimulus sequences exhibited obvious coding artifacts whose nature differed across the three coders. The subjective evaluation was performed using the Single Stimulus Continuous Quality Evaluation method of ITU-R Rec. BT.500-8. Viewers watched concatenated coded test sequences and continuously registered the perceived quality using a slider device. Data from 19 viewers were collected. An analysis of their responses to the presence of various artifacts across the range of coding conditions and content is presented. The effects of blockiness and blurriness on perceived quality are examined. The effects of changes in frame rate on perceived quality are found to be related to the nature of the motion in the sequence.
Growth of zinc selenide single crystals by physical vapor transport in microgravity
NASA Technical Reports Server (NTRS)
Rosenberger, Franz
1993-01-01
The goals of this research were the optimization of growth parameters for large (20 mm diameter and length) zinc selenide single crystals with low structural defect density, and the development of a 3-D numerical model for the transport rates to be expected in physical vapor transport under a given set of thermal and geometrical boundary conditions, in order to provide guidance for an advantageous conduct of the growth experiments. In the crystal growth studies, it was decided to exclusively apply the Effusive Ampoule PVT technique (EAPVT) to the growth of ZnSe. In this technique, the accumulation of transport-limiting gaseous components at the growing crystal is suppressed by continuous effusion to vacuum of part of the vapor contents. This is achieved through calibrated leaks in one of the ground joints of the ampoule. Regarding the PVT transport rates, a 3-D spectral code was modified. After introduction of the proper boundary conditions and subroutines for the composition-dependent transport properties, the code reproduced the experimentally determined transport rates for the two cases with strongest convective flux contributions to within the experimental and numerical error.
Nguyen, Thong T; Suryamohan, Kushal; Kuriakose, Boney; Janakiraman, Vasantharajan; Reichelt, Mike; Chaudhuri, Subhra; Guillory, Joseph; Divakaran, Neethu; Rabins, P E; Goel, Ridhi; Deka, Bhabesh; Sarkar, Suman; Ekka, Preety; Tsai, Yu-Chih; Vargas, Derek; Santhosh, Sam; Mohan, Sangeetha; Chin, Chen-Shan; Korlach, Jonas; Thomas, George; Babu, Azariah; Seshagiri, Somasekar
2018-06-12
We sequenced the Hyposidra talaca NPV (HytaNPV) double-stranded circular DNA genome using PacBio single-molecule sequencing technology. We found that the HytaNPV genome is 139,089 bp long with a GC content of 39.6%. It encodes 141 open reading frames (ORFs), including the 37 baculovirus core genes, 25 genes conserved among lepidopteran baculoviruses, 72 genes known in baculoviruses, and 7 genes unique to the HytaNPV genome. It is a group II alphabaculovirus that codes for the F protein and lacks the gp64 gene found in group I alphabaculoviruses. Using RNA-seq, we confirmed the expression of the ORFs identified in the HytaNPV genome. Phylogenetic analysis showed HytaNPV to be closest to BusuNPV, SujuNPV and EcobNPV, which infect other tea pests, Buzura suppressaria, Sucra jujuba, and Ectropis obliqua, respectively. We identified repeat elements and a conserved non-coding baculovirus element in the genome. Analysis of the putative promoter sequences identified motifs consistent with the temporal expression of the genes observed in the RNA-seq data.
Ou, Jing; Liu, Jin-Bo; Yao, Fu-Jiao; Wang, Xin-Guo; Wei, Zhao-Ming
2016-01-01
Flour beetles of the genus Tribolium are all pests of stored products and cause severe economic losses every year. The American black flour beetle Tribolium audax is an important pest species of flour beetle and an important quarantine insect. Here we sequenced and characterized the complete mitochondrial genome of T. audax, which was intercepted by Huangpu Customs in maize from America. The complete circular mitochondrial genome (mitogenome) of T. audax is 15,924 bp in length, containing 37 typical coding genes and one non-coding AT-rich region. The mitogenome of T. audax exhibits a gene arrangement and content identical to the most common type in insects. All protein-coding genes (PCGs) start with a typical ATN initiation codon, except for cox1, which uses AAC as its start codon instead of ATN. Eleven genes use a standard complete termination codon (nine TAA, two TAG), whereas the nad4 and nad5 genes end with a single T. Except for trnS1 (AGN), all tRNA genes display the typical cloverleaf secondary structure found in other insects. The sizes of the large and small ribosomal RNA genes are 1288 and 780 bp, respectively. The AT content of the AT-rich region is 81.36%. The 5 bp conserved motif TACTA was found in the intergenic region between trnS2 (UCN) and nad1.
Use of Code-Switching in Multilingual Content Subject and Language Classrooms
ERIC Educational Resources Information Center
Gwee, Susan; Saravanan, Vanithamani
2018-01-01
Research literature has shown that teachers code-switched to a language which is not the medium of instruction to help students understand subject matter and establish interpersonal relations with them. However, little is known about the extent to which teachers code-switch in content subject classrooms compared to language classrooms. Using…
The whole chloroplast genome of wild rice (Oryza australiensis).
Wu, Zhiqiang; Ge, Song
2016-01-01
The whole chloroplast genome of wild rice (Oryza australiensis) is characterized in this study. The genome size is 135,224 bp, exhibiting a typical circular structure including a pair of 25,776 bp inverted repeats (IRa and IRb) separated by a large single-copy (LSC) region of 82,212 bp and a small single-copy (SSC) region of 12,470 bp. The overall GC content of the genome is 38.95%. A total of 110 unique genes were annotated, including 76 protein-coding genes, 4 ribosomal RNA genes, and 30 tRNA genes. Among these, 18 are duplicated in the inverted repeat regions, 13 genes contain one intron, and 2 genes (rps12 and ycf3) have two introns.
The complete mitochondrial genome sequence of Malus hupehensis var. pinyiensis.
Duan, Naibin; Sun, Honghe; Wang, Nan; Fei, Zhangjun; Chen, Xuesen
2016-07-01
The complete mitochondrial genome sequence of Malus hupehensis var. pinyiensis, a widely used apple rootstock, was determined using the Illumina high-throughput sequencing approach. The genome is 422,555 bp in length and has a GC content of 45.21%. It is separated by a pair of inverted repeats of 32,504 bp, to form a large single copy region of 213,055 bp and a small single copy region of 144,492 bp. The genome contains 38 protein-coding genes, four pseudogenes, 25 tRNA genes, and three rRNA genes. The genome is 25,608 bp longer than that of M. domestica, and several structural variations between these two mitogenomes were detected.
Technical Support Document for Version 3.9.0 of the COMcheck Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan
2011-09-01
COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards. Beginning with COMcheck version 3.8.0, support for 90.1-1989, 90.1-1999, and the 1998 IECC is no longer included, but those sections remain in this document for reference purposes.
Technical Support Document for Version 3.9.1 of the COMcheck Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan
2012-09-01
COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards. Beginning with COMcheck version 3.8.0, support for 90.1-1989, 90.1-1999, and the 1998 IECC is no longer included, and beginning with version 3.9.0, support for the 2000 and 2001 IECC is no longer included; those sections remain in this document for reference purposes.
1981-10-01
additional summary status or more detailed status. The six bits (DIO1-DIO5 and DIO8) may be used in any manner to report device-dependent... Message Structure and Code Assignment. The content of the STB message sent on DIO1-6 is free to change between STB message transfers as... shall utilize DIO1 through DIO8 to represent bits 2* through 2^q. It is preferred that for a single data byte the data byte be right justified
Hanford facility dangerous waste permit application, general information portion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hays, C.B.
1998-05-19
The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. Both the General Information and Unit-Specific portions of the Hanford Facility Dangerous Waste Permit Application address the content of the Part B permit application guidance prepared by the Washington State Department of Ecology (Ecology 1996) and the U.S. Environmental Protection Agency (40 Code of Federal Regulations 270), with additional information needed by the Hazardous and Solid Waste Amendments and revisions of Washington Administrative Code 173-303. Documentation contained in the General Information Portion is broader in nature and could be used by multiple treatment, storage, and/or disposal units (e.g., the glossary provided in this report).
Kenne, Deric; Wolfram, Taylor M; Abram, Jenica K; Fleming, Michael
2016-01-01
Background Given the high penetration of social media use, social media has been proposed as a method for the dissemination of information to health professionals and patients. This study explored the potential for social media dissemination of the Academy of Nutrition and Dietetics Evidence-Based Nutrition Practice Guideline (EBNPG) for Heart Failure (HF). Objectives The objectives were to (1) describe the existing social media content on HF, including message content, source, and target audience, and (2) describe the attitude of physicians and registered dietitian nutritionists (RDNs) who care for outpatient HF patients toward the use of social media as a method to obtain information for themselves and to share this information with patients. Methods The methods were divided into 2 parts. Part 1 involved conducting a content analysis of tweets related to HF, which were downloaded from Twitonomy and assigned codes for message content (19 codes), source (9 codes), and target audience (9 codes); code frequency was described. A comparison in the popularity of tweets (those marked as favorites or retweeted) based on applied codes was made using t tests. Part 2 involved conducting phone interviews with RDNs and physicians to describe health professionals’ attitude toward the use of social media to communicate general health information and information specifically related to the HF EBNPG. Interviews were transcribed and coded; exemplar quotes representing frequent themes are presented. Results The sample included 294 original tweets with the hashtag “#heartfailure.” The most frequent message content codes were “HF awareness” (166/294, 56.5%) and “patient support” (97/294, 33.0%). The most frequent source codes were “professional, government, patient advocacy organization, or charity” (112/277, 40.4%) and “patient or family” (105/277, 37.9%). The most frequent target audience codes were “unable to identify” (111/277, 40.1%) and “other” (55/277, 19.9%). Significant differences were found in the popularity of tweets with (mean 1, SD 1.3 favorites) or without (mean 0.7, SD 1.3 favorites), the content code being “HF research” (P=.049). Tweets with the source code “professional, government, patient advocacy organizations, or charities” were significantly more likely to be marked as a favorite and retweeted than those without this source code (mean 1.2, SD 1.4 vs mean 0.8, SD 1.2, P=.03) and (mean 1.5, SD 1.8 vs mean 0.9, SD 2.0, P=.03). Interview participants believed that social media was a useful way to gather professional information. They did not believe that social media was useful for communicating with patients due to privacy concerns and the fact that the information had to be kept general rather than be tailored for a specific patient and the belief that their patients did not use social media or technology. Conclusions Existing Twitter content related to HF comes from a combination of patients and evidence-based organizations; however, there is little nutrition content. That gap may present an opportunity for EBNPG dissemination. Health professionals use social media to gather information for themselves but are skeptical of its value when communicating with patients, particularly due to privacy concerns and misconceptions about the characteristics of social media users. PMID:27847349
Hand, Rosa K; Kenne, Deric; Wolfram, Taylor M; Abram, Jenica K; Fleming, Michael
2016-11-15
Given the high penetration of social media use, social media has been proposed as a method for the dissemination of information to health professionals and patients. This study explored the potential for social media dissemination of the Academy of Nutrition and Dietetics Evidence-Based Nutrition Practice Guideline (EBNPG) for Heart Failure (HF). The objectives were to (1) describe the existing social media content on HF, including message content, source, and target audience, and (2) describe the attitude of physicians and registered dietitian nutritionists (RDNs) who care for outpatient HF patients toward the use of social media as a method to obtain information for themselves and to share this information with patients. The methods were divided into 2 parts. Part 1 involved conducting a content analysis of tweets related to HF, which were downloaded from Twitonomy and assigned codes for message content (19 codes), source (9 codes), and target audience (9 codes); code frequency was described. A comparison in the popularity of tweets (those marked as favorites or retweeted) based on applied codes was made using t tests. Part 2 involved conducting phone interviews with RDNs and physicians to describe health professionals' attitude toward the use of social media to communicate general health information and information specifically related to the HF EBNPG. Interviews were transcribed and coded; exemplar quotes representing frequent themes are presented. The sample included 294 original tweets with the hashtag "#heartfailure." The most frequent message content codes were "HF awareness" (166/294, 56.5%) and "patient support" (97/294, 33.0%). The most frequent source codes were "professional, government, patient advocacy organization, or charity" (112/277, 40.4%) and "patient or family" (105/277, 37.9%). The most frequent target audience codes were "unable to identify" (111/277, 40.1%) and "other" (55/277, 19.9%). Significant differences were found in the popularity of tweets with (mean 1, SD 1.3 favorites) or without (mean 0.7, SD 1.3 favorites), the content code being "HF research" (P=.049). Tweets with the source code "professional, government, patient advocacy organizations, or charities" were significantly more likely to be marked as a favorite and retweeted than those without this source code (mean 1.2, SD 1.4 vs mean 0.8, SD 1.2, P=.03) and (mean 1.5, SD 1.8 vs mean 0.9, SD 2.0, P=.03). Interview participants believed that social media was a useful way to gather professional information. They did not believe that social media was useful for communicating with patients due to privacy concerns and the fact that the information had to be kept general rather than be tailored for a specific patient and the belief that their patients did not use social media or technology. Existing Twitter content related to HF comes from a combination of patients and evidence-based organizations; however, there is little nutrition content. That gap may present an opportunity for EBNPG dissemination. Health professionals use social media to gather information for themselves but are skeptical of its value when communicating with patients, particularly due to privacy concerns and misconceptions about the characteristics of social media users. ©Rosa K Hand, Deric Kenne, Taylor M Wolfram, Jenica K Abram, Michael Fleming. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 15.11.2016.
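The popularity comparison reported above (mean favorites or retweets for tweets with versus without a given code) is an independent-samples t test. The sketch below shows that computation on made-up favorite counts with scipy; the arrays are placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical favorite counts for tweets with and without a content code.
with_code    = np.array([0, 1, 2, 1, 3, 0, 1, 2])
without_code = np.array([0, 0, 1, 1, 0, 2, 0, 1])

t_stat, p_value = stats.ttest_ind(with_code, without_code, equal_var=False)
print(f"mean with={with_code.mean():.2f}, "
      f"mean without={without_code.mean():.2f}, p={p_value:.3f}")
```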
Verification testing of the compression performance of the HEVC screen content coding extensions
NASA Astrophysics Data System (ADS)
Sullivan, Gary J.; Baroncini, Vittorio A.; Yu, Haoping; Joshi, Rajan L.; Liu, Shan; Xiu, Xiaoyu; Xu, Jizheng
2017-09-01
This paper reports on verification testing of the coding performance of the screen content coding (SCC) extensions of the High Efficiency Video Coding (HEVC) standard (Rec. ITU-T H.265 | ISO/IEC 23008-2 MPEG-H Part 2). The coding performance of the HEVC screen content model (SCM) reference software is compared with that of the HEVC test model (HM) without the SCC extensions, as well as with the Advanced Video Coding (AVC) joint model (JM) reference software, for both lossy and mathematically lossless compression using All-Intra (AI), Random Access (RA), and Low-delay B (LB) encoding structures and using similar encoding techniques. Video test sequences in 1920×1080 RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:0 colour sampling formats with 8 bits per sample are tested in two categories: "text and graphics with motion" (TGM) and "mixed" content. For lossless coding, the encodings are evaluated in terms of relative bit-rate savings. For lossy compression, subjective testing was conducted at 4 quality levels for each coding case, and the test results are presented through mean opinion score (MOS) curves. The relative coding performance is also evaluated in terms of Bjøntegaard-delta (BD) bit-rate savings for equal PSNR quality. The perceptual tests and objective metric measurements show a very substantial benefit in coding efficiency for the SCC extensions, and provided consistent results with a high degree of confidence. For TGM video, the estimated bit-rate savings ranged from 60-90% relative to the JM and 40-80% relative to the HM, depending on the AI/RA/LB configuration category and colour sampling format.
The complete chloroplast genome of Sinopodophyllum hexandrum (Berberidaceae).
Li, Huie; Guo, Qiqiang
2016-07-01
The complete chloroplast (cp) genome of Sinopodophyllum hexandrum (Berberidaceae) was determined in this study. The circular genome is 157,940 bp in size, and comprises a pair of inverted repeat (IR) regions of 26,077 bp each, a large single-copy (LSC) region of 86,460 bp and a small single-copy (SSC) region of 19,326 bp. The GC content of the whole cp genome is 38.5%. A total of 133 genes were identified, including 88 protein-coding genes, 37 tRNA genes and eight rRNA genes. The whole cp genome consists of 114 unique genes, and 19 genes are duplicated in the IR regions. The phylogenetic analysis revealed that S. hexandrum is closely related to Nandina domestica within the family Berberidaceae.
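A minimal sketch of the two quantities reported in this and the following chloroplast abstracts: the quadripartite length check (LSC + SSC + 2 × IR) and the GC content of a sequence. The helper names and the toy sequence are hypothetical, not from the paper.

```python
# Minimal sketch: verify the quadripartite structure adds up to the reported
# genome length and compute GC content of a sequence string (toy data).
def quadripartite_length(lsc, ssc, ir):
    return lsc + ssc + 2 * ir

def gc_content(seq):
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

assert quadripartite_length(86460, 19326, 26077) == 157940  # values from the abstract
print(round(gc_content("ATGCGGCCAT"), 3))  # toy sequence, not S. hexandrum data
```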
The complete chloroplast genome of the Dendrobium strongylanthum (Orchidaceae: Epidendroideae).
Li, Jing; Chen, Chen; Wang, Zhe-Zhi
2016-07-01
Complete chloroplast genome sequences are very useful for studying the phylogeny and evolution of species. In this study, the complete chloroplast genome of Dendrobium strongylanthum was constructed from whole-genome Illumina sequencing data. The chloroplast genome is 153 058 bp in length with 37.6% GC content and consists of two inverted repeats (IRs) of 26 316 bp each. The IR regions are separated by a large single-copy region (LSC, 85 836 bp) and a small single-copy region (SSC, 14 590 bp). A total of 130 chloroplast genes were successfully annotated, including 84 protein-coding genes, 38 tRNA genes, and eight rRNA genes. Phylogenetic analyses showed that the chloroplast genome of Dendrobium strongylanthum is closely related to that of Dendrobium officinale.
Complete sequence and comparative analysis of the chloroplast genome of Plinia trunciflora
Eguiluz, Maria; Yuyama, Priscila Mary; Guzman, Frank; Rodrigues, Nureyev Ferreira; Margis, Rogerio
2017-01-01
Plinia trunciflora is a Brazilian native fruit tree from the Myrtaceae family, also known as jaboticaba. This species has great potential for fruit production. Due to the high content of essential oils in its leaves and of anthocyanins in its fruits, there is also increasing interest from the pharmaceutical industry. Nevertheless, there are few studies focusing on its molecular biology and genetic characterization. We herein report the complete chloroplast (cp) genome of P. trunciflora using high-throughput sequencing and compare it to other previously sequenced Myrtaceae genomes. The cp genome of P. trunciflora is 159,512 bp in size, comprising inverted repeats of 26,414 bp and single-copy regions of 88,097 bp (LSC) and 18,587 bp (SSC). The genome contains 111 single-copy genes (77 protein-coding, 30 tRNA and four rRNA genes). Phylogenetic analysis using 57 cp protein-coding genes demonstrated that P. trunciflora, Eugenia uniflora and Acca sellowiana form a cluster more closely related to Syzygium cumini than to Eucalyptus. The complete cp sequence reported here can be used in evolutionary and population genetics studies, helping to resolve the complex taxonomy of this species and to fill the gap in its genetic characterization. PMID:29111566
NASA Technical Reports Server (NTRS)
Chau, Jessica Furrer; Or, Dani; Sukop, Michael C.; Steinberg, S. L. (Principal Investigator)
2005-01-01
Liquid distributions in unsaturated porous media under different gravitational accelerations and corresponding macroscopic gaseous diffusion coefficients were investigated to enhance understanding of plant growth conditions in microgravity. We used a single-component, multiphase lattice Boltzmann code to simulate liquid configurations in two-dimensional porous media at varying water contents for different gravity conditions and measured gas diffusion through the media using a multicomponent lattice Boltzmann code. The relative diffusion coefficients (D rel) for simulations with and without gravity as functions of air-filled porosity were in good agreement with measured data and established models. We found significant differences in liquid configuration in porous media, leading to reductions in D rel of up to 25% under zero gravity. The study highlights potential applications of the lattice Boltzmann method for rapid and cost-effective evaluation of alternative plant growth media designs under variable gravity.
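The abstract compares simulated relative diffusion coefficients against "established models" without naming them; as a hedged illustration, the sketch below evaluates the Millington-Quirk (1961) form, one commonly used model, which is an assumption here rather than the authors' stated choice.

```python
# Hedged sketch: relative gas diffusion coefficient D/D0 as a function of
# air-filled porosity, using the Millington-Quirk model as an assumed example.
def d_rel_millington_quirk(air_filled_porosity, total_porosity):
    return air_filled_porosity ** (10.0 / 3.0) / total_porosity ** 2

for eps in (0.1, 0.2, 0.3, 0.4):
    print(eps, round(d_rel_millington_quirk(eps, total_porosity=0.45), 4))
```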
Siminoff, Laura A.; Step, Mary M.
2011-01-01
Many observational coding schemes have been offered to measure communication in health care settings. These schemes fall short of capturing multiple functions of communication among providers, patients, and other participants. After a brief review of observational communication coding, the authors present a comprehensive scheme for coding communication that is (a) grounded in communication theory, (b) accounts for instrumental and relational communication, and (c) captures important contextual features with tailored coding templates: the Siminoff Communication Content & Affect Program (SCCAP). To test SCCAP reliability and validity, the authors coded data from two communication studies. The SCCAP provided reliable measurement of communication variables including tailored content areas and observer ratings of speaker immediacy, affiliation, confirmation, and disconfirmation behaviors. PMID:21213170
Bohlin, Jon; Eldholm, Vegard; Pettersson, John H O; Brynildsrud, Ola; Snipen, Lars
2017-02-10
The core genome consists of genes shared by the vast majority of a species and is therefore assumed to have been subjected to substantially stronger purifying selection than the more mobile elements of the genome, also known as the accessory genome. Here we examine intragenic base composition differences in core genomes and corresponding accessory genomes in 36 species, represented by the genomes of 731 bacterial strains, to assess the impact of selective forces on base composition in microbes. We also explore, in turn, how these results compare with findings for whole genome intragenic regions. We found that GC content in coding regions is significantly higher in core genomes than accessory genomes and whole genomes. Likewise, GC content variation within coding regions was significantly lower in core genomes than in accessory genomes and whole genomes. Relative entropy in coding regions, measured as the difference between observed and expected trinucleotide frequencies estimated from mononucleotide frequencies, was significantly higher in the core genomes than in accessory and whole genomes. Relative entropy was positively associated with coding region GC content within the accessory genomes, but not within the corresponding coding regions of core or whole genomes. The higher intragenic GC content and relative entropy, as well as the lower GC content variation, observed in the core genomes is most likely associated with selective constraints. It is unclear whether the positive association between GC content and relative entropy in the more mobile accessory genomes constitutes signatures of selection or selective neutral processes.
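A hedged sketch of the relative-entropy measure described above: the Kullback-Leibler divergence between observed trinucleotide frequencies and those expected from mononucleotide frequencies alone. The pseudocount handling and exact windowing used by the authors are not specified in the abstract, so the details below are assumptions.

```python
# Hedged sketch: relative entropy of trinucleotide frequencies versus the
# expectation from mononucleotide frequencies (toy sequence, illustrative only).
from collections import Counter
from itertools import product
from math import log2

def relative_entropy(seq):
    seq = seq.upper()
    mono = Counter(seq)
    tri = Counter(seq[i:i + 3] for i in range(len(seq) - 2))
    n_mono, n_tri = sum(mono.values()), sum(tri.values())
    re = 0.0
    for t in ("".join(p) for p in product("ACGT", repeat=3)):
        obs = tri[t] / n_tri
        exp = (mono[t[0]] / n_mono) * (mono[t[1]] / n_mono) * (mono[t[2]] / n_mono)
        if obs > 0 and exp > 0:
            re += obs * log2(obs / exp)
    return re

print(relative_entropy("ATGGCGGCGGCGATTACGATCGGCGGC"))  # toy coding-like fragment
```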
Babor, Thomas F; Xuan, Ziming; Damon, Donna
2013-10-01
This study evaluated the use of a modified Delphi technique in combination with a previously developed alcohol advertising rating procedure to detect content violations in the U.S. Beer Institute Code. A related aim was to estimate the minimum number of raters needed to obtain reliable evaluations of code violations in television commercials. Six alcohol ads selected for their likelihood of having code violations were rated by community and expert participants (N = 286). Quantitative rating scales were used to measure the content of alcohol advertisements based on alcohol industry self-regulatory guidelines. The community group participants represented vulnerability characteristics that industry codes were designed to protect (e.g., age <21); experts represented various health-related professions, including public health, human development, alcohol research, and mental health. Alcohol ads were rated on 2 occasions separated by 1 month. After completing Time 1 ratings, participants were randomized to receive feedback from 1 group or the other. Findings indicate that (i) ratings at Time 2 had generally reduced variance, suggesting greater consensus after feedback, (ii) feedback from the expert group was more influential than that of the community group in developing group consensus, (iii) the expert group found significantly fewer violations than the community group, (iv) experts representing different professional backgrounds did not differ among themselves in the number of violations identified, and (v) a rating panel composed of at least 15 raters is sufficient to obtain reliable estimates of code violations. The Delphi technique facilitates consensus development around code violations in alcohol ad content and may enhance the ability of regulatory agencies to monitor the content of alcoholic beverage advertising when combined with psychometric-based rating procedures. Copyright © 2013 by the Research Society on Alcoholism.
ERIC Educational Resources Information Center
Hau, Goh Bak; Siraj, Saedah; Alias, Norlidah; Rauf, Rose Amnah Abd.; Zakaria, Abd. Razak; Darusalam, Ghazali
2013-01-01
This study provides a content analysis of selected articles on QR codes and their application in educational contexts, published in journals and in proceedings of international conferences and workshops from 2006 to 2011. These articles were cross-analysed by publication year, journal, and research topic. Further analysis was…
Ion channeling study of defects in compound crystals using Monte Carlo simulations
NASA Astrophysics Data System (ADS)
Turos, A.; Jozwik, P.; Nowicki, L.; Sathish, N.
2014-08-01
Ion channeling is a well-established technique for determination of structural properties of crystalline materials. Defect depth profiles have usually been determined based on the two-beam model developed by Bøgh (1968) [1]. As long as the main research interest was focused on single-element crystals, this was considered sufficiently accurate. A new challenge emerged with the growing technological importance of compound single crystals and epitaxial heterostructures. Overlap of partial spectra from different sublattices and the formation of complicated defect structures make the two-beam method hardly applicable. The solution is provided by Monte Carlo computer simulations. Our paper reviews principal aspects of this approach and the recent developments in the McChasy simulation code. The latter made it possible to distinguish between randomly displaced atoms (RDA) and extended defects (dislocations, loops, etc.). Hence, complex defect structures can be characterized by the relative content of these two components. The next refinement of the code consists of detailed parameterization of dislocations and dislocation loops. Defect profiles for a variety of compound crystals (GaN, ZnO, SrTiO3) have been measured and evaluated using the McChasy code. Damage accumulation curves for RDA and extended defects revealed non-monotonic defect buildup with some characteristic steps. Transition to each stage is governed by a different driving force. As shown by complementary high-resolution XRD measurements, lattice strain plays the crucial role here and can be correlated with the concentration of extended defects.
Walker, Joseph F; Zanis, Michael J; Emery, Nancy C
2014-04-01
Complete chloroplast genome studies can help resolve relationships among large, complex plant lineages such as Asteraceae. We present the first whole plastome from the Madieae tribe and compare its sequence variation to other chloroplast genomes in Asteraceae. We used high-throughput sequencing to obtain the Lasthenia burkei chloroplast genome. We compared sequence structure and rates of molecular evolution in the small single copy (SSC), large single copy (LSC), and inverted repeat (IR) regions to those for eight Asteraceae accessions and one Solanaceae accession. The chloroplast sequence of L. burkei is 150 746 bp and contains 81 unique protein-coding genes and four ribosomal RNA coding sequences. We identified three major inversions in the L. burkei chloroplast, all of which have been found in other Asteraceae lineages, and a previously unreported inversion in Lactuca sativa. Regions flanking inversions contained tRNA sequences, but did not have particularly high G + C content. Substitution rates varied among the SSC, LSC, and IR regions, and rates of evolution within each region varied among species. Some observed differences in rates of molecular evolution may be explained by the relative proportion of coding to noncoding sequence within regions. Rates of molecular evolution vary substantially within and among chloroplast genomes, and major inversion events may be promoted by the presence of tRNAs. Collectively, these results provide insight into different mechanisms that may promote intramolecular recombination and the inversion of large genomic regions in the plastome.
Code of Federal Regulations, 2010 CFR
2010-04-01
... CONSUMPTION INFANT FORMULA QUALITY CONTROL PROCEDURES Quality Control Procedures for Assuring Nutrient Content of Infant Formulas § 106.90 Coding. The manufacturer shall code all infant formulas in conformity...
Content-Based Multi-Channel Network Coding Algorithm in the Millimeter-Wave Sensor Network
Lin, Kai; Wang, Di; Hu, Long
2016-01-01
With the development of wireless technology, the widespread use of 5G is already an irreversible trend, and millimeter-wave sensor networks are becoming more and more common. However, due to the high degree of complexity and bandwidth bottlenecks, the millimeter-wave sensor network still faces numerous problems. In this paper, we propose a novel content-based multi-channel network coding algorithm, which combines data fusion, multi-channel transmission, and network coding to improve data transmission; the algorithm is referred to as content-based multi-channel network coding (CMNC). The CMNC algorithm provides a fusion-driven model based on the Dempster-Shafer (D-S) evidence theory to classify the sensor nodes into different classes according to the data content. By using the result of the classification, the CMNC algorithm also provides the channel assignment strategy and uses network coding to further improve the quality of data transmission in the millimeter-wave sensor network. Extensive simulations are carried out and compared with other methods. Our simulation results show that the proposed CMNC algorithm can effectively improve the quality of data transmission and has better performance than the compared methods. PMID:27376302
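A hedged sketch of Dempster's rule of combination, the core of the Dempster-Shafer evidence theory on which the CMNC fusion model is built. The frame of discernment and the mass values below are illustrative placeholders, not taken from the paper.

```python
# Hedged sketch: Dempster's rule of combination for two mass functions,
# each given as {frozenset of hypotheses: mass}.
from itertools import product

def combine(m1, m2):
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # mass assigned to contradictory hypotheses
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

A, B = frozenset({"classA"}), frozenset({"classB"})
AB = A | B
m1 = {A: 0.6, B: 0.1, AB: 0.3}   # evidence from one (hypothetical) sensor reading
m2 = {A: 0.5, B: 0.3, AB: 0.2}   # evidence from another
print(combine(m1, m2))
```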
Proposal for a new content model for the Austrian Procedure Catalogue.
Neururer, Sabrina B; Pfeiffer, Karl P
2013-01-01
The Austrian Procedure Catalogue is used for procedure coding in Austria. Its architecture and content have some major weaknesses. The aim of this study is to present a new potential content model for this classification system, consisting of the main characteristics of health interventions. The model is visualized using a UML class diagram. Based on this proposal, an implementation of an ontology for procedure coding is planned.
Toward Developing a Universal Code of Ethics for Adult Educators.
ERIC Educational Resources Information Center
Siegel, Irwin H.
2000-01-01
Presents conflicting viewpoints on a universal code of ethics for adult educators. Suggests objectives of a code (guidance for practice, policymaking direction, common reference point, shared values). Outlines content and methods for implementing a code. (SK)
Prediction of Acoustic Loads Generated by Propulsion Systems
NASA Technical Reports Server (NTRS)
Perez, Linamaria; Allgood, Daniel C.
2011-01-01
NASA Stennis Space Center is one of the nation's premier facilities for conducting large-scale rocket engine testing. As liquid rocket engines vary in size, so do the acoustic loads that they produce. When these acoustic loads reach very high levels, they may cause damage both to humans and to structures surrounding the test area. To prevent such damage, prediction tools are used to estimate the spectral content and levels of the acoustic loads generated by the rocket engine plumes and to model their propagation through the surrounding atmosphere. Prior to the current work, two different acoustic prediction tools were being implemented at Stennis Space Center, each with its own advantages and disadvantages depending on the application. Therefore, a new prediction tool was created, using the NASA SP-8072 handbook as a guide, that replicates the prediction methods of the previous codes while eliminating their individual drawbacks. Aside from replicating the previous modeling capability in a single framework, additional modeling functions were added, thereby expanding the current modeling capability. To verify that the new code could reproduce the same predictions as the previous codes, two verification test cases were defined. These verification test cases also served as validation cases, as the predicted results were compared with actual test data.
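A hedged sketch of the kind of first-order estimate an SP-8072-style tool starts from: overall acoustic power taken as a small efficiency fraction of the plume's mechanical power. The efficiency value, reference power, and engine numbers below are illustrative assumptions; the handbook's full method then distributes this power spectrally and spatially along the plume.

```python
# Hedged sketch: overall acoustic power level from plume mechanical power and
# an assumed acoustic efficiency (all parameter values are illustrative).
from math import log10

def overall_acoustic_power_level(mass_flow_kg_s, exhaust_velocity_m_s,
                                 acoustic_efficiency=0.005, ref_power_w=1e-12):
    mechanical_power = 0.5 * mass_flow_kg_s * exhaust_velocity_m_s ** 2
    acoustic_power = acoustic_efficiency * mechanical_power
    return 10.0 * log10(acoustic_power / ref_power_w)  # dB re ref_power_w

print(round(overall_acoustic_power_level(500.0, 2500.0), 1))  # made-up engine values
```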
Program ratings do not predict negative content in commercials on children's channels.
Dale, Lourdes P; Klein, Jordana; DiLoreto, James; Pidano, Anne E; Borto, Jolanta W; McDonald, Kathleen; Olson, Heather; Neace, William P
2011-01-01
The aim of this study was to determine the presence of negative content in commercials airing on 3 children's channels (Disney Channel, Nickelodeon, and Cartoon Network). The 1681 commercials were coded with a reliable coding system and content comparisons were made. Although the majority of the commercials were coded as neutral, negative content was present in 13.5% of commercials. This rate was significantly more than the predicted value of zero and more similar to the rates cited in previous research examining content during sporting events. The rate of negative content was less than, but not significantly different from, the rate of positive content. Thus, our findings did not support our hypothesis that there would be more commercials with positive content than with negative content. Logistic regression analysis indicated that channel, and not rating, was a better predictor of the presence of overall negative content and the presence of violent behaviors. Commercials airing on the Cartoon Network had significantly more negative content, and those airing on Disney Channel had significantly less negative content than the other channels. Within the individual channels, program ratings did not relate to the presence of negative content. Parents cannot assume the content of commercials will be consistent with the program rating or label. Pediatricians and psychologists should educate parents about the potential for negative content in commercials and advocate for a commercials rating system to ensure that there is greater parity between children's programs and the corresponding commercials.
The "Motherese" of Mr. Rogers: A Description of the Dialogue of Educational Television Programs.
ERIC Educational Resources Information Center
Rice, Mabel L.; Haight, Patti L.
Dialogue from 30-minute samples from "Sesame Street" and "Mr. Rogers' Neighborhood" was coded for grammar, content, and discourse. Grammatical analysis used the LINGQUEST computer-assisted language assessment program (Mordecai, Palen, and Palmer 1982). Content coding was based on categories developed by Rice (1984) and…
Prescription Drug Abuse Information in D.A.R.E.
ERIC Educational Resources Information Center
Morris, Melissa C.; Cline, Rebecca J. Welch; Weiler, Robert M.; Broadway, S. Camille
2006-01-01
This investigation was designed to examine prescription drug-related content and learning objectives in Drug Abuse Resistance Education (D.A.R.E.) for upper elementary and middle schools. Specific prescription-drug topics and context associated with content and objectives were coded. The coding system for topics included 126 topics organized…
Analyzing Prosocial Content on T.V.
ERIC Educational Resources Information Center
Davidson, Emily S.; Neale, John M.
To enhance knowledge of television content, a prosocial code was developed by watching a large number of potentially prosocial television programs and making notes on all the positive acts. The behaviors were classified into a workable number of categories. The prosocial code is largely verbal and contains seven categories which fall into two…
Code of Federal Regulations, 2010 CFR
2010-04-01
... 25 Indians 1 2010-04-01 2010-04-01 false May a tribe create and adopt a single heir rule without adopting a tribal probate code? 18.301 Section 18.301 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR PROBATE TRIBAL PROBATE CODES Approval of Single Heir Rule § 18.301 May a tribe create and adopt a...
Wong, Alex W K; Lau, Stephen C L; Fong, Mandy W M; Cella, David; Lai, Jin-Shei; Heinemann, Allen W
2018-04-03
To determine the extent to which the content of the Quality of Life in Neurological Disorders (Neuro-QoL) covers the International Classification of Functioning, Disability and Health (ICF) Core Sets for multiple sclerosis (MS), stroke, spinal cord injury (SCI), and traumatic brain injury (TBI) using summary linkage indicators. Content analysis by linking content of the Neuro-QoL to corresponding ICF codes of each Core Set for MS, stroke, SCI, and TBI. Three academic centers. None. None. Four summary linkage indicators proposed by MacDermid et al were estimated to compare the content coverage between Neuro-QoL and the ICF codes of Core Sets for MS, stroke, SCI, and TBI. Neuro-QoL represented 20% to 30% of Core Set codes for the different conditions, with more codes covered in the Core Sets for MS (29%), stroke (28%), and TBI (28%) than in those for SCI in the long-term (20%) and early postacute (19%) contexts. Neuro-QoL represented nearly half of the unique Activity and Participation codes (43%-49%) and less than one third of the unique Body Function codes (12%-32%). It represented fewer Environmental Factors codes (2%-6%) and no Body Structures codes. Absolute linkage indicators found that at least 60% of Neuro-QoL items were linked to Core Set codes (63%-95%), but many items covered the same codes as revealed by unique linkage indicators (7%-13%), suggesting high concept redundancy among items. The Neuro-QoL links more closely to ICF Core Sets for stroke, MS, and TBI than to those for SCI, and primarily covers activity and participation ICF domains. Other instruments are needed to address concepts not measured by the Neuro-QoL when a comprehensive health assessment is needed. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
Does the Genetic Code Have A Eukaryotic Origin?
Zhang, Zhang; Yu, Jun
2013-01-01
In the RNA world, RNA is assumed to be the dominant macromolecule performing most, if not all, core “house-keeping” functions. The ribo-cell hypothesis suggests that the genetic code and the translation machinery may both be born of the RNA world, and the introduction of DNA to ribo-cells may take over the informational role of RNA gradually, such as a mature set of genetic code and mechanism enabling stable inheritance of sequence and its variation. In this context, we modeled the genetic code in two content variables—GC and purine contents—of protein-coding sequences and measured the purine content sensitivities for each codon when the sensitivity (% usage) is plotted as a function of GC content variation. The analysis leads to a new pattern—the symmetric pattern—where the sensitivity of purine content variation shows diagonally symmetry in the codon table more significantly in the two GC content invariable quarters in addition to the two existing patterns where the table is divided into either four GC content sensitivity quarters or two amino acid diversity halves. The most insensitive codon sets are GUN (valine) and CAN (CAR for asparagine and CAY for aspartic acid) and the most biased amino acid is valine (always over-estimated) followed by alanine (always under-estimated). The unique position of valine and its codons suggests its key roles in the final recruitment of the complete codon set of the canonical table. The distinct choice may only be attributable to sequence signatures or signals of splice sites for spliceosomal introns shared by all extant eukaryotes. PMID:23402863
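A minimal sketch related to the two content variables discussed above: GC and purine (A+G) content of a protein-coding sequence, computed overall and by codon position. The helper and the toy sequence are hypothetical, not data or code from the paper.

```python
# Minimal sketch: GC and purine content of a coding sequence, overall and per
# codon position (toy sequence, illustrative only).
def composition(cds):
    cds = cds.upper()
    stats = {}
    for label, positions in (("overall", (0, 1, 2)), ("pos1", (0,)),
                             ("pos2", (1,)), ("pos3", (2,))):
        bases = [cds[i] for i in range(len(cds) - len(cds) % 3) if i % 3 in positions]
        n = len(bases)
        stats[label] = {"GC": sum(b in "GC" for b in bases) / n,
                        "purine": sum(b in "AG" for b in bases) / n}
    return stats

print(composition("ATGGTTGCCGAAGGTCTGTAA"))
```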
Berkowitz, Seth A; Eisenstat, Stephanie A; Barnard, Lily S; Wexler, Deborah J
2018-06-01
To explore the patient perspective on coordinated multidisciplinary diabetes team care among a socioeconomically diverse group of adults with type 2 diabetes. Qualitative research design using 8 focus groups (n=53). We randomly sampled primary care patients with type 2 diabetes and conducted focus groups at their primary care clinic. Discussion prompts queried current perceptions of team care. Each focus group was audio recorded, transcribed verbatim, and independently coded by three reviewers. Coding used an iterative process. Thematic saturation was achieved. Data were analyzed using content analysis. Most participants believed that coordinated multidisciplinary diabetes team care was a good approach, feeling that diabetes was too complicated for any one care team member to manage. Primary care physicians were seen as too busy to manage diabetes alone, and participants were content to be treated by other care team members, especially if there was a single point of contact and the care was coordinated. Participants suggested that an ideal multidisciplinary approach would additionally include support for exercise and managing socioeconomic challenges, components perceived to be missing from the existing approach to diabetes care. Coordinated, multidisciplinary diabetes team care is understood by and acceptable to patients with type 2 diabetes. Copyright © 2018 Primary Care Diabetes Europe. Published by Elsevier Ltd. All rights reserved.
[Preliminarily application of content analysis to qualitative nursing data].
Liang, Shu-Yuan; Chuang, Yeu-Hui; Wu, Shu-Fang
2012-10-01
Content analysis is a methodology for objectively and systematically studying the content of communication in various formats. Content analysis in nursing research and nursing education is called qualitative content analysis. Qualitative content analysis is frequently applied to nursing research, as it allows researchers to determine categories inductively and deductively. This article examines qualitative content analysis in nursing research from theoretical and practical perspectives. We first describe how content analysis concepts such as unit of analysis, meaning unit, code, category, and theme are used. Next, we describe the basic steps involved in using content analysis, including data preparation, data familiarization, analysis unit identification, creating tentative coding categories, category refinement, and establishing category integrity. Finally, this paper introduces the concept of content analysis rigor, including dependability, confirmability, credibility, and transferability. This article elucidates the content analysis method in order to help professionals conduct systematic research that generates data that are informative and useful in practical application.
The complete chloroplast genome of North American ginseng, Panax quinquefolius.
Han, Zeng-Jie; Li, Wei; Liu, Yuan; Gao, Li-Zhi
2016-09-01
We report the complete nucleotide sequence of the Panax quinquefolius chloroplast genome using next-generation sequencing technology. The genome size is 156 359 bp, including two inverted repeats (IRs) of 52 153 bp, separated by the large single-copy (LSC 86 184 bp) and small single-copy (SSC 18 081 bp) regions. This cp genome encodes 114 unigenes (80 protein-coding genes, four rRNA genes, and 30 tRNA genes), of which 18 are duplicated in the IR regions. The overall GC content of the genome is 38.08%. A phylogenomic analysis of the 10 complete chloroplast genomes from Araliaceae using Daucus carota from Apiaceae as the outgroup showed that P. quinquefolius is closely related to the other two members of the genus Panax, P. ginseng and P. notoginseng.
The complete chloroplast genome sequence of Chikusichloa aquatica (Poaceae: Oryzeae).
Zhang, Jie; Zhang, Dan; Shi, Chao; Gao, Ju; Gao, Li-Zhi
2016-07-01
The complete chloroplast sequence of Chikusichloa aquatica was determined in this study. The genome consists of 136 563 bp containing a pair of inverted repeats (IRs) of 20 837 bp, which are separated by a large single-copy region and a small single-copy region of 82 315 bp and 33 411 bp, respectively. The C. aquatica cp genome encodes 111 functional genes (71 protein-coding genes, four rRNA genes, and 36 tRNA genes): 92 are unique, while 19 are duplicated in the IR regions. The genic regions account for 58.9% of the whole cp genome, and the GC content of the plastome is 39.0%. A phylogenomic analysis showed that C. aquatica is closely related to Rhynchoryza subulata, which belongs to the tribe Oryzeae.
Information quality measurement of medical encoding support based on usability.
Puentes, John; Montagner, Julien; Lecornu, Laurent; Cauvin, Jean-Michel
2013-12-01
Medical encoding support systems for diagnoses and medical procedures are an emerging technology that is beginning to play a key role in billing, reimbursement, and health policy decisions. A significant problem in exploiting these systems is how to measure the appropriateness of any automatically generated list of codes, in terms of fitness for use, i.e. their quality. Until now, only information retrieval performance measurements have been applied to estimate the accuracy of codes lists as a quality indicator. Such measurements do not give the value of codes lists for practical medical encoding, and cannot be used to globally compare the quality of multiple codes lists. This paper defines and validates a new encoding information quality measure that addresses the problem of measuring the quality of medical codes lists. It is based on a usability study of how expert coders and physicians apply computer-assisted medical encoding. The proposed measure, named ADN, evaluates the Accuracy, Dispersion, and Noise of codes lists, and is adapted to the variable length and content of generated codes lists, coping with limitations of previous measures. According to the ADN measure, the information quality of a codes list is fully represented by a single point, within a suitably constrained feature space. Using a single scheme, our approach reliably measures and compares the information quality of hundreds of codes lists, showing their practical value for medical encoding. Its pertinence is demonstrated by simulation and application to real data corresponding to 502 inpatient stays in four clinic departments. Results are compared to the consensus of three expert coders who also coded this anonymized database of discharge summaries, and to five information retrieval measures. Information quality assessment applying the ADN measure showed the degree of encoding-support system variability from one clinic department to another, providing a global evaluation of quality measurement trends. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Babor, Thomas F; Xuan, Ziming; Proctor, Dwayne
2008-03-01
The purposes of this study were to develop reliable procedures to monitor the content of alcohol advertisements broadcast on television and in other media, and to detect violations of the content guidelines of the alcohol industry's self-regulation codes. A set of rating-scale items was developed to measure the content guidelines of the 1997 version of the U.S. Beer Institute Code. Six focus groups were conducted with 60 college students to evaluate the face validity of the items and the feasibility of the procedure. A test-retest reliability study was then conducted with 74 participants, who rated five alcohol advertisements on two occasions separated by 1 week. Average correlations across all advertisements using three reliability statistics (r, rho, and kappa) were almost all statistically significant and the kappas were good for most items, which indicated high test-retest agreement. We also found high interrater reliabilities (intraclass correlations) among raters for item-level and guideline-level violations, indicating that regardless of the specific item, raters were consistent in their general evaluations of the advertisements. Naïve (untrained) raters can provide consistent (reliable) ratings of the main content guidelines proposed in the U.S. Beer Institute Code. The rating procedure may have future applications for monitoring compliance with industry self-regulation codes and for conducting research on the ways in which alcohol advertisements are perceived by young adults and other vulnerable populations.
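A hedged sketch of the reliability statistics named above (r, rho, and kappa) for Time 1 versus Time 2 ratings of a single item; the rating data are invented, and SciPy and scikit-learn are assumed dependencies rather than tools named by the authors.

```python
# Hedged sketch: test-retest reliability statistics on invented rating data.
from scipy.stats import pearsonr, spearmanr
from sklearn.metrics import cohen_kappa_score

time1 = [1, 2, 2, 3, 4, 4, 5, 3, 2, 1]
time2 = [1, 2, 3, 3, 4, 5, 5, 3, 2, 2]

print("Pearson r:", pearsonr(time1, time2)[0])
print("Spearman rho:", spearmanr(time1, time2)[0])
print("Cohen kappa:", cohen_kappa_score(time1, time2))
```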
Assessment of Self-Regulatory Code Violations in Brazilian Television Beer Advertisements*
Vendrame, Alan; Pinsky, Ilana; Souza E Silva, Rebeca; Babor, Thomas
2010-01-01
Objective: Research suggests that alcoholic beverage advertisements may have an adverse effect on teenagers and young adults, owing to their vulnerability to suggestive message content. This study was designed to evaluate perceived violations of the content guidelines of the Brazilian alcohol marketing self-regulation code, based on ratings of the five most popular beer advertisements broadcast on television in the summer of 2005–2006 and during the 2006 FIFA (Fédération Internationale de Football Association) World Cup games. Method: Five beer advertisements were selected from a previous study showing that they were perceived to be highly appealing to a sample of Brazilian teenagers. These advertisements were evaluated by a sample of Brazilian high school students using a rating procedure designed to measure the content of alcohol advertisements covered in industry self-regulation codes. Results: All five advertisements were found to violate multiple guidelines of the Brazilian code of marketing self-regulation. The advertisement with the greatest number of violations was Antarctica's “Male Repellent,” which was perceived to violate 11 of the 16 guidelines in the code. Two advertisements had nine violations, and one had eight. The guidelines most likely to be violated by these advertisements were Guideline 1, which is aimed at protecting children and teenagers, and Guideline 2, which prohibits content encouraging excessive and irresponsible alcoholic beverage consumption. Conclusions: The five beer advertisements rated as most appealing to Brazilian teenagers were perceived by a sample of the same population to have violated numerous principles of the Brazilian self-regulation code governing the marketing of alcoholic beverages. Because of these numerous perceived code violations, it now seems important for regulatory authorities to submit industry marketing content to more systematic evaluation by young people and public health experts and for researchers to focus more on the ways in which alcohol advertising influences early onset of drinking and excessive alcohol consumption. PMID:20409439
Coding stimulus amplitude by correlated neural activity
NASA Astrophysics Data System (ADS)
Metzen, Michael G.; Ávila-Åkerberg, Oscar; Chacron, Maurice J.
2015-04-01
While correlated activity is observed ubiquitously in the brain, its role in neural coding has remained controversial. Recent experimental results have demonstrated that correlated but not single-neuron activity can encode the detailed time course of the instantaneous amplitude (i.e., envelope) of a stimulus. These have furthermore demonstrated that such coding required and was optimal for a nonzero level of neural variability. However, a theoretical understanding of these results is still lacking. Here we provide a comprehensive theoretical framework explaining these experimental findings. Specifically, we use linear response theory to derive an expression relating the correlation coefficient to the instantaneous stimulus amplitude, which takes into account key single-neuron properties such as firing rate and variability as quantified by the coefficient of variation. The theoretical prediction was in excellent agreement with numerical simulations of various integrate-and-fire type neuron models for various parameter values. Further, we demonstrate a form of stochastic resonance as optimal coding of stimulus variance by correlated activity occurs for a nonzero value of noise intensity. Thus, our results provide a theoretical explanation of the phenomenon by which correlated but not single-neuron activity can code for stimulus amplitude and how key single-neuron properties such as firing rate and variability influence such coding. Correlation coding by correlated but not single-neuron activity is thus predicted to be a ubiquitous feature of sensory processing for neurons responding to weak input.
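A hedged toy simulation of the phenomenon described above: two leaky integrate-and-fire neurons driven by a shared noisy input show a spike-count correlation that grows with the amplitude of the shared signal, while single-neuron counts change comparatively little. All parameters are illustrative, not those of the paper's models.

```python
# Hedged sketch: spike-count correlation of two LIF neurons vs shared-input amplitude.
import numpy as np

def count_correlation(amplitude, n_trials=500, steps=200, dt=0.001, tau=0.02,
                      mu=60.0, private_sd=1.0, seed=0):
    rng = np.random.default_rng(seed)
    v = np.zeros((2, n_trials))       # membrane potentials of the two neurons
    counts = np.zeros((2, n_trials))  # spike counts per trial
    for _ in range(steps):
        common = rng.normal(size=n_trials)        # shared (stimulus) fluctuation
        private = rng.normal(size=(2, n_trials))  # independent noise per neuron
        v += dt * (-v / tau + mu) + np.sqrt(dt) * (amplitude * common + private_sd * private)
        fired = v >= 1.0
        counts += fired
        v[fired] = 0.0                            # reset after a spike
    return counts.mean(), np.corrcoef(counts[0], counts[1])[0, 1]

for amp in (0.0, 1.0, 2.0):
    rate, rho = count_correlation(amp)
    print(f"shared amplitude {amp:.1f}: mean count {rate:.2f}, count correlation {rho:.2f}")
```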
Augmented burst-error correction for UNICON laser memory. [digital memory
NASA Technical Reports Server (NTRS)
Lim, R. S.
1974-01-01
A single-burst-error correction system is described for data stored in the UNICON laser memory. In the proposed system, a long Fire code with code length n greater than 16,768 bits is used as an outer code to augment an existing, shorter inner Fire code for burst-error correction. The inner Fire code is an (80,64) code shortened from the (630,614) code, and it is used to correct a single burst error on a per-word basis with burst length b less than or equal to 6. The outer code, with b less than or equal to 12, would be used to correct a single burst error on a per-page basis, where a page consists of 512 32-bit words. In the proposed system, the encoding and error detection processes are implemented by hardware. A minicomputer, currently used as a UNICON memory management processor, is used on a time-demanding basis for error correction. Based upon existing error statistics, this combination of an inner code and an outer code would enable the UNICON system to obtain a very low error rate in spite of flaws affecting the recorded data.
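A highly simplified, hedged illustration of the layered protection described above: each word may suffer at most one error burst, the inner code is modeled as clearing bursts of length at most 6, and the outer code as clearing one residual burst of length at most 12 per 512-word page. Real Fire-code decoding and burst statistics are considerably more involved; every parameter below is an illustrative assumption.

```python
# Hedged Monte Carlo sketch of the inner/outer burst protection concept.
import random

rng = random.Random(1)

def page_unrecoverable(p_burst=0.01, words_per_page=512, inner_b=6, outer_b=12,
                       mean_burst_len=4.0):
    residual = []  # burst lengths the inner per-word code could not correct
    for _ in range(words_per_page):
        if rng.random() < p_burst:
            length = 1 + int(rng.expovariate(1.0 / mean_burst_len))
            if length > inner_b:
                residual.append(length)
    if not residual:
        return False
    if len(residual) == 1 and residual[0] <= outer_b:
        return False  # the page-level outer code clears the single leftover burst
    return True

trials = 5000
print("estimated unrecoverable-page rate:",
      sum(page_unrecoverable() for _ in range(trials)) / trials)
```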
Wang, Jiajia; Li, Hu; Dai, Renhuai
2017-12-01
Here, we describe the first complete mitochondrial genome (mitogenome) sequence of the leafhopper Taharana fasciana (Coelidiinae). The mitogenome sequence contains 15,161 bp with an A + T content of 77.9%. It includes 13 protein-coding genes, two ribosomal RNA genes, 22 transfer RNA genes, and one non-coding (A + T-rich) region; in addition, a repeat region is also present (GenBank accession no. KY886913). These genes/regions are in the same order as in the inferred insect ancestral mitogenome. All protein-coding genes have ATN as the start codon, and TAA or single T as the stop codons, except the gene ND3, which ends with TAG. Furthermore, we predicted the secondary structures of the rRNAs in T. fasciana. Six domains (domain III is absent in arthropods) and 41 helices were predicted for 16S rRNA, and 12S rRNA comprised three structural domains and 24 helices. Phylogenetic tree analysis confirmed that T. fasciana and other members of the Cicadellidae are clustered into a clade, and it identified the relationships among the subfamilies Deltocephalinae, Coelidiinae, Idiocerinae, Cicadellinae, and Typhlocybinae.
Chroma sampling and modulation techniques in high dynamic range video coding
NASA Astrophysics Data System (ADS)
Dai, Wei; Krishnan, Madhu; Topiwala, Pankaj
2015-09-01
High Dynamic Range and Wide Color Gamut (HDR/WCG) Video Coding is an area of intense research interest in the engineering community, for potential near-term deployment in the marketplace. HDR greatly enhances the dynamic range of video content (up to 10,000 nits), as well as broadens the chroma representation (BT.2020). The resulting content offers new challenges in its coding and transmission. The Moving Picture Experts Group (MPEG) of the International Standards Organization (ISO) is currently exploring coding efficiency and/or the functionality enhancements of the recently developed HEVC video standard for HDR and WCG content. FastVDO has developed an advanced approach to coding HDR video, based on splitting the HDR signal into a smoothed luminance (SL) signal, and an associated base signal (B). Both signals are then chroma downsampled to YFbFr 4:2:0 signals, using advanced resampling filters, and coded using the Main10 High Efficiency Video Coding (HEVC) standard, which has been developed jointly by ISO/IEC MPEG and ITU-T WP3/16 (VCEG). Our proposal offers both efficient coding, and backwards compatibility with the existing HEVC Main10 Profile. That is, an existing Main10 decoder can produce a viewable standard dynamic range video, suitable for existing screens. Subjective tests show visible improvement over the anchors. Objective tests show a sizable gain of over 25% in PSNR (RGB domain) on average, for a key set of test clips selected by the ISO/MPEG committee.
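A minimal sketch of 4:4:4 to 4:2:0 chroma downsampling by 2x2 averaging, to make the sampling-format change concrete. The paper uses "advanced resampling filters"; simple box averaging is only an assumption here.

```python
# Minimal sketch: 4:4:4 -> 4:2:0 chroma downsampling by 2x2 box averaging.
import numpy as np

def to_420(chroma_plane_444):
    h, w = chroma_plane_444.shape
    c = chroma_plane_444[:h - h % 2, :w - w % 2].astype(np.float64)
    return ((c[0::2, 0::2] + c[0::2, 1::2] + c[1::2, 0::2] + c[1::2, 1::2]) / 4.0).round()

cb = np.arange(16, dtype=np.float64).reshape(4, 4)  # toy chroma plane
print(to_420(cb))  # each output sample is the mean of a 2x2 input block
```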
Coding For Compression Of Low-Entropy Data
NASA Technical Reports Server (NTRS)
Yeh, Pen-Shu
1994-01-01
Improved method of encoding digital data provides for efficient lossless compression of partially or even mostly redundant data from low-information-content source. Method of coding implemented in relatively simple, high-speed arithmetic and logic circuits. Also increases coding efficiency beyond that of established Huffman coding method in that average number of bits per code symbol can be less than 1, which is the lower bound for Huffman code.
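A minimal sketch of why fewer than 1 bit per symbol is achievable for a low-information-content source: the Shannon entropy of a heavily skewed binary source falls well below the 1-bit-per-symbol floor of a symbol-by-symbol Huffman code, a gap that run-length or arithmetic style schemes can approach.

```python
# Minimal sketch: entropy of a skewed binary source vs the 1 bit/symbol floor.
from math import log2

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

for p in (0.5, 0.1, 0.05, 0.01):
    print(f"P(1)={p:4.2f}  entropy={binary_entropy(p):.3f} bits/symbol")
```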
Fortin, Connor H; Schulze, Katharina V; Babbitt, Gregory A
2015-01-01
It is now widely accepted that DNA sequences defining DNA-protein interactions functionally depend upon local biophysical features of the DNA backbone that are important in defining sites of binding interaction in the genome (e.g. DNA shape, charge and intrinsic dynamics). However, these physical features of the DNA polymer are not directly apparent when analyzing and viewing Shannon information content calculated at single nucleobases in a traditional sequence logo plot. Thus, sequence logo plots are severely limited in that they convey no explicit information regarding the structural dynamics of the DNA backbone, a feature often critical to binding specificity. We present TRX-LOGOS, an R software package and Perl wrapper code that interfaces the JASPAR database for computational regulatory genomics. TRX-LOGOS extends the traditional sequence logo plot to include Shannon information content calculated with regard to the dinucleotide-based BI-BII conformation shifts in phosphate linkages on the DNA backbone, thereby adding a visual measure of intrinsic DNA flexibility that can be critical for many DNA-protein interactions. TRX-LOGOS is available as an R graphics module offered both at SourceForge and as a download supplement to this journal. To demonstrate the general utility of TRX logo plots, we first calculated the information content for 416 Saccharomyces cerevisiae transcription factor binding sites functionally confirmed in the Yeastract database and matched to previously published yeast genomic alignments. We discovered that flanking regions contain significantly higher information content at phosphate linkages than can be observed at nucleobases. We also examined broader transcription factor classifications defined by the JASPAR database, and discovered that many general signatures of transcription factor binding are locally more information rich at the level of DNA backbone dynamics than nucleobase sequence. We used TRX-LOGOS in combination with the MEGA 6.0 software for molecular evolutionary genetics analysis to visually compare human Forkhead box/FOX protein evolution with its binding-site evolution. We also compared the DNA binding signatures of the human TP53 tumor suppressor determined by two different laboratory methods (SELEX and ChIP-seq). Further analysis of the entire yeast genome, center aligned at the start codon, also revealed a distinct sequence-independent 3 bp periodic pattern in information content, present only in coding regions, and perhaps indicative of the non-random organization of the genetic code. TRX-LOGOS is useful in any situation in which important information content in DNA can be better visualized at the positions of phosphate linkages (i.e. dinucleotides) where the dynamic properties of the DNA backbone function to facilitate DNA-protein interaction.
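A hedged sketch of the per-position Shannon information content used in a conventional sequence logo (R_i = 2 - H_i for DNA; the small-sample correction is omitted). TRX-LOGOS extends the same quantity to dinucleotide-level BI-BII propensities, which is not reproduced here; the alignment below is a toy example.

```python
# Hedged sketch: per-position information content of a toy binding-site alignment.
from math import log2

def logo_information(aligned_sites):
    length = len(aligned_sites[0])
    info = []
    for i in range(length):
        column = [s[i].upper() for s in aligned_sites]
        h = 0.0
        for base in "ACGT":
            p = column.count(base) / len(column)
            if p > 0:
                h -= p * log2(p)
        info.append(2.0 - h)  # max 2 bits per DNA position
    return info

sites = ["TATAAT", "TATGAT", "TACAAT", "TATAAT"]  # toy alignment, not Yeastract data
print([round(x, 2) for x in logo_information(sites)])
```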
Treatment provider's knowledge of the Health and Disability Commissioner's Code of Consumer Rights.
Townshend, Philip L; Sellman, J Douglas
2002-06-01
The Health and Disability Commissioner's (HDC) Code of Health and Disability Consumers' Rights (the Code) defines in law the rights of consumers of health and disability services in New Zealand. In the first few years after its publication, health educators, service providers, and the HDC extensively promoted the Code. Providers of health and disability services would be expected to be knowledgeable about the areas covered by the Code if it is routinely used in the development and monitoring of treatment plans. In this study, knowledge of the Code was tested in a random sample of 217 clinical staff that included medical staff, psychologists and counsellors working in Alcohol and Drug Treatment (A&D) centres in New Zealand. Any response showing awareness of a right, regardless of wording, was taken as a positive response, as it was the areas covered by the rights, rather than their exact wording, that were considered the important knowledge for providers. The main finding of this research was that 23% of staff surveyed were aware of none of the ten rights in the Code and only 6% were aware of more than five of the ten rights. Relating these data to results from a wider sample of treatment providers raises the possibility that A&D treatment providers are slightly more aware of the content of the Code than a general sample of health and disability service providers; however, overall awareness of the content of the Code by health providers is very low. These results imply that consumer rights issues are not prominent in the minds of providers, perhaps indicating an ethical blind spot on their part. Ignorance of the content of the Code may indicate that the treatment community do not find it a useful working document or, alternatively, that clinicians are content to rely on their own good intentions to preserve the rights of their patients. Further research will be required to explain this lack of knowledge; however, the current situation is that consumers cannot rely on clinicians being aware of consumers' rights in health and disability services.
Remote state preparation through hyperentangled atomic states
NASA Astrophysics Data System (ADS)
Nawaz, Mehwish; ul-Islam, Rameez-; Ikram, Manzoor
2018-04-01
Hyperentangled states have enhanced channel capacity in quantum processing and have yielded evident increases in communication speed in quantum informatics, as a consequence of the exceptionally high information content coded onto each quantum entity. In the present article, we intend to demonstrate this fact by utilizing atomic states simultaneously entangled in both internal and external degrees of freedom, i.e. the de Broglie motion, for remote state preparation (RSP). The results clearly demonstrate that we can efficiently communicate two bits of information while manipulating only a single quantum subsystem. The states are prepared and manipulated using atomic Bragg diffraction as well as Ramsey interferometry, both of which are now considered standard, state-of-the-art tools based on cavity quantum electrodynamics. Since atomic Bragg diffraction operates in a large-interaction-time regime and produces spatially well-separated, decoherence-resistant outputs, the scheme presented here for RSP offers important perspectives on efficient detection as well as unambiguous information coding and readout. The article summarizes the experimental feasibility of the proposal, culminating with a brief discussion.
A real-time ionospheric model based on GNSS Precise Point Positioning
NASA Astrophysics Data System (ADS)
Tu, Rui; Zhang, Hongping; Ge, Maorong; Huang, Guanwen
2013-09-01
This paper proposes a method for real-time monitoring and modeling of the ionospheric Total Electron Content (TEC) by Precise Point Positioning (PPP). First, the ionospheric TEC and the receiver's Differential Code Biases (DCBs) are estimated from undifferenced raw observations in real time; the ionospheric TEC model is then established based on the Single Layer Model (SLM) assumption and the recovered ionospheric TEC. In this study, phase observations with high precision are directly used instead of phase-smoothed code observations. In addition, the DCB estimation is separated from the establishment of the ionospheric model, which limits the impact of the SLM assumption. The ionospheric model is established at every epoch for real-time application. The method is validated with three different GNSS networks on a local, regional, and global basis. The results show that the method is feasible and effective; the real-time ionospheric and DCB results are highly consistent with the IGS final products, with biases of 1-2 TECU and 0.4 ns, respectively.
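A hedged sketch of the standard geometry-free (P2 - P1) relation used to recover slant TEC from dual-frequency GPS code observations; satellite and receiver DCBs would still have to be removed, as the abstract notes, and the pseudorange values below are illustrative.

```python
# Hedged sketch: slant TEC (in TECU) from the GPS geometry-free code combination.
F1, F2 = 1575.42e6, 1227.60e6   # GPS L1/L2 carrier frequencies, Hz

def slant_tec_from_code(p1, p2):
    """Slant TEC in TECU (1 TECU = 1e16 el/m^2) from code pseudoranges in metres."""
    factor = (F1 ** 2 * F2 ** 2) / (40.3 * (F1 ** 2 - F2 ** 2))
    return factor * (p2 - p1) / 1e16

print(round(slant_tec_from_code(22_000_000.0, 22_000_003.5), 2))  # made-up pseudoranges
```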
Comparison of GPS receiver DCB estimation methods using a GPS network
NASA Astrophysics Data System (ADS)
Choi, Byung-Kyu; Park, Jong-Uk; Min Roh, Kyoung; Lee, Sang-Jeong
2013-07-01
Two approaches for receiver differential code biases (DCB) estimation using the GPS data obtained from the Korean GPS network (KGN) in South Korea are suggested: the relative and single (absolute) methods. The relative method uses a GPS network, while the single method determines DCBs from a single station only. Their performance was assessed by comparing the receiver DCB values obtained from the relative method with those estimated by the single method. The daily averaged receiver DCBs obtained from the two different approaches showed good agreement for 7 days. The root mean square (RMS) value of those differences is 0.83 nanoseconds (ns). The standard deviation of the receiver DCBs estimated by the relative method was smaller than that of the single method. From these results, it is clear that the relative method can obtain more stable receiver DCBs compared with the single method over a short-term period. Additionally, the comparison between the receiver DCBs obtained by the Korea Astronomy and Space Science Institute (KASI) and those of the IGS Global Ionosphere Maps (GIM) showed a good agreement at 0.3 ns. As the accuracy of DCB values significantly affects the accuracy of ionospheric total electron content (TEC), more studies are needed to ensure the reliability and stability of the estimated receiver DCBs.
Using a Mixed Methods Content Analysis to Analyze Mission Statements from Colleges of Engineering
ERIC Educational Resources Information Center
Creamer, Elizabeth G.; Ghoston, Michelle
2013-01-01
A mixed methods design was used to conduct a content analysis of the mission statements of colleges of engineering, to map inductively derived codes to the EC 2000 outcomes, and to test whether any of the codes were significantly associated with institutions with reasonably strong representation of women. Most institutions' (25 of 48) mission statement…
Siderits, Richard; Yates, Stacy; Rodriguez, Arelis; Lee, Tina; Rimmer, Cheryl; Roche, Mark
2011-01-01
Quick Response (QR) Codes are standard in supply management and seen with increasing frequency in advertisements. They are now present regularly in healthcare informatics and education. These 2-dimensional square bar codes, originally designed by the Toyota car company, are free of license and have a published international standard. The codes can be generated by free online software and the resulting images incorporated into presentations. The images can be scanned by "smart" phones and tablets using either the iOS or Android platforms, which link the device with the information represented by the QR code (uniform resource locator or URL, online video, text, v-calendar entries, short message service [SMS] and formatted text). Once linked to the device, the information can be viewed at any time after the original presentation, saved in the device or to a Web-based "cloud" repository, printed, or shared with others via email or Bluetooth file transfer. This paper describes how we use QR codes in our tumor board presentations, discusses the benefits, the different QR codes from Web links and how QR codes facilitate the distribution of educational content.
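The abstract notes that QR images can be generated with free software and embedded in presentations; as a minimal illustration, the sketch below uses the open-source Python "qrcode" package, which is an assumed dependency rather than software named by the authors, and a placeholder URL.

```python
# Minimal sketch: generate a QR image for a (hypothetical) presentation link.
import qrcode

img = qrcode.make("https://example.org/tumor-board/case-42")  # placeholder URL
img.save("tumor_board_case42_qr.png")
```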
Tan, Ene-Choo; Li, Haixia
2006-07-19
Most studies of single nucleotide variation focus on substitutions rather than insertions/deletions. In this study, we examined the distribution and characteristics of single nucleotide insertions/deletions (SNindels), using data available from dbSNP for all human chromosomes. There are almost 300,000 SNindels in the database, of which only 0.8% are validated. They occur at a frequency of 0.887 per 10 kb on average for the whole genome, or approximately 1 for every 11,274 bp. More than half occur in regions with mononucleotide repeats, the longest of which is 47 bases. Overall, the mononucleotide repeats involving C and G are much shorter than those involving A and T. About 12% are surrounded by palindromes. There is a general correlation between chromosome size and the total number of SNindels per chromosome. Inter-chromosomal variation in density ranges from 0.6 to 21.7 per kilobase. The overall spectrum shows a very high proportion of SNindels of the -/A and -/T types, at over 81%. The proportion of -/A and -/T SNindels on each chromosome is correlated with its AT content. Less than half of the SNindels are within or near known genes, and even fewer (<0.183%) are in coding regions; more than 1.4% of -/C and -/G SNindels are in coding regions, compared with 0.2% for the -/A and -/T types. SNindels of the -/A and -/T types make up 80% of those found within untranslated regions but less than 40% of those within coding regions. A separate analysis using the subset of 2324 validated SNindels showed a slightly lower AT bias of 74%; SNindels not within mononucleotide repeats showed an even lower AT bias, at 58%. The density of validated SNindels is 0.007/10 kb overall, and 90% are found within or near genes. Among all chromosomes, Y has the lowest numbers and densities for all SNindels, validated SNindels, and SNindels not within repeats.
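The two genome-wide density figures quoted above are equivalent ways of stating the same rate; a one-line check in Python, using only the numbers reported in the abstract:

density_per_10kb = 0.887                      # SNindels per 10 kb, as reported
bp_per_indel = 10_000 / density_per_10kb      # base pairs per SNindel
print(round(bp_per_indel))                    # ~11,274 bp, matching the abstract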
Salvatore, Sergio; Gennaro, Alessandro; Auletta, Andrea; Grassi, Rossano; Rocco, Diego
2012-12-01
The paper presents a method of content analysis framed within a semiotic and contextual model of the psychotherapy process as a situated dynamics of sensemaking: the Dynamic Mapping of the Structures of Content in Clinical Settings (DMSC). DMSC is a system of content analysis focused on a generalized level of meaning, concerning basic aspects of the patient's narrative (e.g., whether the narrative concerns herself or someone other than herself). The paper presents the results of applying DMSC to an intensive single-case analysis (Katja). The method was applied by judges to the transcripts of sessions and is aimed at identifying patterns of combinations (termed Patterns of Content) of the categories characterizing the patient's narratives (pattern analysis approach), as well as at mapping the transitions among these patterns (sequential analysis approach). These results provide evidence of its construct validity. In accordance with the theoretical model grounding the method, we found that: (a) DMSC provides a meaningful representation of the patient's narratives in terms of Patterns of Content; and (b) the probabilities of transition among the Patterns of Content proved to be significantly associated with the clinical quality of the sessions. The DMSC should be considered a first attempt, paving the way for further investigations aimed at developing a deeper understanding of the role played by the dynamics of sensemaking in the psychotherapy process. ©2011 The British Psychological Society.
Pitchiaya, Sethuramasundaram; Krishnan, Vishalakshi; Custer, Thomas C.; Walter, Nils G.
2013-01-01
Non-coding RNAs (ncRNAs) recently were discovered to outnumber their protein-coding counterparts, yet their diverse functions are still poorly understood. Here we report on a method for the intracellular Single-molecule High Resolution Localization and Counting (iSHiRLoC) of microRNAs (miRNAs), a conserved, ubiquitous class of regulatory ncRNAs that controls the expression of over 60% of all mammalian protein coding genes post-transcriptionally, by a mechanism shrouded by seemingly contradictory observations. We present protocols to execute single particle tracking (SPT) and single-molecule counting of functional microinjected, fluorophore-labeled miRNAs and thereby extract diffusion coefficients and molecular stoichiometries of micro-ribonucleoprotein (miRNP) complexes from living and fixed cells, respectively. This probing of miRNAs at the single molecule level sheds new light on the intracellular assembly/disassembly of miRNPs, thus beginning to unravel the dynamic nature of this important gene regulatory pathway and facilitating the development of a parsimonious model for their obscured mechanism of action. PMID:23820309
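The protocols above extract diffusion coefficients from single-particle tracks; a common route is through the mean squared displacement (MSD). The Python sketch below assumes free two-dimensional Brownian motion, so that MSD(τ) = 4Dτ, and uses a simulated track with placeholder parameters; it illustrates the general analysis, not the authors' specific pipeline.

import numpy as np

def msd(track):
    """Mean squared displacement of a single 2D track (N x 2 array) at each time lag."""
    n = len(track)
    lags = np.arange(1, n)
    return lags, np.array([np.mean(np.sum((track[lag:] - track[:-lag]) ** 2, axis=1))
                           for lag in lags])

def diffusion_coefficient(track, dt, n_fit=4):
    """Estimate D from the first n_fit MSD points, assuming free 2D diffusion: MSD(t) = 4*D*t."""
    lags, msd_vals = msd(track)
    t = lags[:n_fit] * dt
    slope = np.polyfit(t, msd_vals[:n_fit], 1)[0]
    return slope / 4.0

# Toy example: simulate a Brownian track with D = 0.1 um^2/s sampled every 50 ms
rng = np.random.default_rng(0)
dt, D_true = 0.05, 0.1
steps = rng.normal(scale=np.sqrt(2 * D_true * dt), size=(200, 2))
track = np.cumsum(steps, axis=0)
print(round(diffusion_coefficient(track, dt), 3))   # should come out close to 0.1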
Embed dynamic content in your poster.
Hutchins, B Ian
2013-01-29
A new technology has emerged that will facilitate the presentation of dynamic or otherwise inaccessible data on posters at scientific meetings. Video, audio, or other digital files hosted on mobile-friendly sites can be linked to through a quick response (QR) code, a two-dimensional barcode that can be scanned by smartphones, which then display the content. This approach is more affordable than acquiring tablet computers for playing dynamic content and can reach many users at large conferences. This resource details how to host videos, generate QR codes, and view the associated files on mobile devices.
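Generating a QR code that points at hosted content is a one-liner with common tooling. The sketch below uses the third-party Python package qrcode (installed with, for example, pip install qrcode[pil]) and a placeholder URL; it is one convenient option, not the specific workflow described in the resource.

# Generate a QR code image pointing at hosted poster content.
import qrcode

url = "https://example.org/poster-video"   # placeholder URL for the hosted video
img = qrcode.make(url)                     # returns a PIL image of the QR code
img.save("poster_qr.png")                  # embed this image file in the poster layout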
Yang, Qi; Al Amin, Abdullah; Chen, Xi; Ma, Yiran; Chen, Simin; Shieh, William
2010-08-02
High-order modulation formats and advanced error correcting codes (ECC) are two promising techniques for improving the performance of ultrahigh-speed optical transport networks. In this paper, we present record receiver sensitivity for 107 Gb/s CO-OFDM transmission via constellation expansion to 16-QAM and rate-1/2 LDPC coding. We also show the single-channel transmission of a 428-Gb/s CO-OFDM signal over 960-km standard-single-mode-fiber (SSMF) without Raman amplification.
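Constellation expansion to 16-QAM, mentioned above, amounts to mapping groups of four bits onto a 16-point complex constellation. Below is a minimal Gray-coded 16-QAM mapper in Python/NumPy for illustration; the normalization and bit-to-level assignment are conventional choices, not necessarily those used in the reported transmitter.

import numpy as np

# Gray-coded 16-QAM: each bit pair selects a level in {-3, -1, +1, +3} per axis,
# with adjacent levels differing by one bit (00 -> -3, 01 -> -1, 11 -> +1, 10 -> +3).
GRAY_LEVELS = {(0, 0): -3, (0, 1): -1, (1, 1): 1, (1, 0): 3}

def qam16_modulate(bits):
    """Map a bit sequence (length divisible by 4) to unit-energy 16-QAM symbols."""
    bits = np.asarray(bits).reshape(-1, 4)
    i = np.array([GRAY_LEVELS[(b0, b1)] for b0, b1, _, _ in bits])
    q = np.array([GRAY_LEVELS[(b2, b3)] for _, _, b2, b3 in bits])
    return (i + 1j * q) / np.sqrt(10.0)     # normalise average symbol energy to 1

symbols = qam16_modulate([1, 0, 0, 1, 0, 0, 1, 1])
print(symbols)   # two complex 16-QAM symbols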
Patient complaints in healthcare systems: a systematic review and coding taxonomy
Reader, Tom W; Gillespie, Alex; Roberts, Jane
2014-01-01
Background Patient complaints have been identified as a valuable resource for monitoring and improving patient safety. This article critically reviews the literature on patient complaints, and synthesises the research findings to develop a coding taxonomy for analysing patient complaints. Methods The PubMed, Science Direct and Medline databases were systematically investigated to identify patient complaint research studies. Publications were included if they reported primary quantitative data on the content of patient-initiated complaints. Data were extracted and synthesised on (1) basic study characteristics; (2) methodological details; and (3) the issues patients complained about. Results 59 studies, reporting 88 069 patient complaints, were included. Patient complaint coding methodologies varied considerably (eg, in attributing single or multiple causes to complaints). In total, 113 551 issues were found to underlie the patient complaints. These were analysed using 205 different analytical codes which when combined represented 29 subcategories of complaint issue. The most common issues complained about were ‘treatment’ (15.6%) and ‘communication’ (13.7%). To develop a patient complaint coding taxonomy, the subcategories were thematically grouped into seven categories, and then three conceptually distinct domains. The first domain related to complaints on the safety and quality of clinical care (representing 33.7% of complaint issues), the second to the management of healthcare organisations (35.1%) and the third to problems in healthcare staff–patient relationships (29.1%). Conclusions Rigorous analyses of patient complaints will help to identify problems in patient safety. To achieve this, it is necessary to standardise how patient complaints are analysed and interpreted. Through synthesising data from 59 patient complaint studies, we propose a coding taxonomy for supporting future research and practice in the analysis of patient complaint data. PMID:24876289
Protein functional features are reflected in the patterns of mRNA translation speed.
López, Daniel; Pazos, Florencio
2015-07-09
The degeneracy of the genetic code makes it possible for the same amino acid string to be coded by different messenger RNA (mRNA) sequences. These "synonymous mRNAs" may differ largely in a number of aspects related to their overall translational efficiency, such as secondary structure content and availability of the encoded transfer RNAs (tRNAs). Consequently, they may render different yields of the translated polypeptides. These mRNA features related to translation efficiency also play a role locally, resulting in a non-uniform translation speed along the mRNA, which has previously been related to some protein structural features and has also been used to explain some dramatic effects of "silent" single-nucleotide polymorphisms (SNPs). In this work we perform the first large-scale analysis of the relationship between three experimental proxies of local mRNA translation efficiency and the local features of the corresponding encoded proteins. We found that a number of protein functional and structural features are reflected in the patterns of ribosome occupancy, secondary structure, and tRNA availability along the mRNA. One or more of these proxies of translation speed have distinctive patterns around the mRNA regions coding for certain protein local features. In some cases the three patterns follow a similar trend. We also show specific examples where these patterns of translation speed point to the protein's important structural and functional features. This supports the idea that the genome not only codes the protein functional features as sequences of amino acids, but also as subtle patterns of mRNA properties which, probably through local effects on translation speed, have some consequence on the final polypeptide. These results open the possibility of predicting a protein's functional regions based on a single genomic sequence, and have implications for heterologous protein expression and fine-tuning of protein function.
Christophel, Thomas B; Allefeld, Carsten; Endisch, Christian; Haynes, John-Dylan
2018-06-01
Traditional views of visual working memory postulate that memorized contents are stored in dorsolateral prefrontal cortex using an adaptive and flexible code. In contrast, recent studies proposed that contents are maintained by posterior brain areas using codes akin to perceptual representations. An important question is whether this reflects a difference in the level of abstraction between posterior and prefrontal representations. Here, we investigated whether neural representations of visual working memory contents are view-independent, as indicated by rotation-invariance. Using functional magnetic resonance imaging and multivariate pattern analyses, we show that when subjects memorize complex shapes, both posterior and frontal brain regions maintain the memorized contents using a rotation-invariant code. Importantly, we found the representations in frontal cortex to be localized to the frontal eye fields rather than dorsolateral prefrontal cortices. Thus, our results give evidence for the view-independent storage of complex shapes in distributed representations across posterior and frontal brain regions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Earl, Christopher; Might, Matthew; Bagusetty, Abhishek
This study presents Nebo, a declarative domain-specific language embedded in C++ for discretizing partial differential equations for transport phenomena on multiple architectures. Application programmers use Nebo to write code that appears sequential but can be run in parallel, without editing the code. Currently Nebo supports single-thread execution, multi-thread execution, and many-core (GPU-based) execution. With single-thread execution, Nebo performs on par with code written by domain experts. With multi-thread execution, Nebo can linearly scale (with roughly 90% efficiency) up to 12 cores, compared to its single-thread execution. Moreover, Nebo’s many-core execution can be over 140x faster than its single-thread execution.
Summary statistics in the attentional blink.
McNair, Nicolas A; Goodbourn, Patrick T; Shone, Lauren T; Harris, Irina M
2017-01-01
We used the attentional blink (AB) paradigm to investigate the processing stage at which extraction of summary statistics from visual stimuli ("ensemble coding") occurs. Experiment 1 examined whether ensemble coding requires attentional engagement with the items in the ensemble. Participants performed two sequential tasks on each trial: gender discrimination of a single face (T1) and estimating the average emotional expression of an ensemble of four faces (or of a single face, as a control condition) as T2. Ensemble coding was affected by the AB when the tasks were separated by a short temporal lag. In Experiment 2, the order of the tasks was reversed to test whether ensemble coding requires more working-memory resources, and therefore induces a larger AB, than estimating the expression of a single face. Each condition produced a similar magnitude AB in the subsequent gender-discrimination T2 task. Experiment 3 additionally investigated whether the previous results were due to participants adopting a subsampling strategy during the ensemble-coding task. Contrary to this explanation, we found different patterns of performance in the ensemble-coding condition and a condition in which participants were instructed to focus on only a single face within an ensemble. Taken together, these findings suggest that ensemble coding emerges automatically as a result of the deployment of attentional resources across the ensemble of stimuli, prior to information being consolidated in working memory.
Directed educational training improves coding and billing skills for residents.
Benke, James R; Lin, Sandra Y; Ishman, Stacey L
2013-03-01
To determine whether coding and billing acumen improves after a single directed educational training session. Case-control series. Fourteen otolaryngology practitioners, including trainees, each completed two clinical scenarios before and after a directed educational session covering basic skills and common mistakes in otolaryngology billing and coding. Ten practitioners had never coded before, while four regularly billed and coded in a clinical setting. Individuals with no previous billing experience had a mean score of 54% (median 55%) before the educational session, which was significantly lower than that of the experienced billers, who averaged 82% (median 83%, p=0.002). After the educational billing and coding session, the inexperienced billers' mean score improved to 62% (median 67%), which was still statistically lower than that of the experienced billers, who averaged 76% (median 75%, p=0.039). The inexperienced billers demonstrated a significant improvement in their total score after the intervention (P=0.019); however, the change observed in experienced billers before and after the educational intervention was not significant (P=0.469). Billing and coding skill improved after a single directed education session. Residents, who are not responsible for regular billing and coding, were found to have the greatest improvement in skill. However, providers who regularly bill and code had no significant improvement after this session. These data suggest that a single 90-minute billing and coding education session is effective in preparing those with limited experience to competently bill and code. Copyright © 2012. Published by Elsevier Ireland Ltd.
Fast decoding techniques for extended single-and-double-error-correcting Reed Solomon codes
NASA Technical Reports Server (NTRS)
Costello, D. J., Jr.; Deng, H.; Lin, S.
1984-01-01
A problem in designing semiconductor memories is to provide some measure of error control without requiring excessive coding overhead or decoding time. For example, some 256K-bit dynamic random access memories are organized as 32K x 8 bit-bytes. Byte-oriented codes such as Reed Solomon (RS) codes provide efficient low-overhead error control for such memories. However, the standard iterative algorithm for decoding RS codes is too slow for these applications. Some special high-speed decoding techniques are presented for extended single- and double-error-correcting RS codes. These techniques are designed to find the error locations and the error values directly from the syndrome, without having to form the error locator polynomial and solve for its roots.
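To make the idea of decoding directly from the syndrome concrete, the Python sketch below corrects a single symbol error in a narrow-sense RS codeword over GF(16) whose generator has roots alpha and alpha^2: the error location and value follow from the two syndromes alone, with no locator polynomial. This is a generic illustration of the principle, not the authors' extended-code algorithm, and it assumes at most one symbol error in a length-15 codeword.

# GF(16) arithmetic tables, primitive polynomial x^4 + x + 1
EXP, LOG = [0] * 30, [0] * 16
x = 1
for i in range(15):
    EXP[i], LOG[x] = x, i
    x <<= 1
    if x & 0x10:
        x ^= 0x13
for i in range(15, 30):
    EXP[i] = EXP[i - 15]

def gf_mul(a, b):
    return 0 if a == 0 or b == 0 else EXP[LOG[a] + LOG[b]]

def gf_div(a, b):
    return EXP[(LOG[a] - LOG[b]) % 15]

def correct_single_error(r):
    """Correct at most one symbol error in a length-15 RS codeword over GF(16).
    For a single error e at position j: S1 = e*alpha^j and S2 = e*alpha^(2j),
    so alpha^j = S2/S1 and e = S1^2/S2 -- no error-locator polynomial needed."""
    S1 = S2 = 0
    for i, sym in enumerate(r):
        S1 ^= gf_mul(sym, EXP[i])
        S2 ^= gf_mul(sym, EXP[(2 * i) % 15])
    if S1 == 0 and S2 == 0:
        return list(r)                       # no error detected
    if S1 == 0 or S2 == 0:
        raise ValueError("more than one error; beyond this sketch")
    j = LOG[gf_div(S2, S1)]                  # error location
    e = gf_div(gf_mul(S1, S1), S2)           # error magnitude
    out = list(r)
    out[j] ^= e
    return out

codeword = [0] * 15                 # the all-zero word is a valid codeword
received = list(codeword)
received[6] ^= 9                    # inject a single symbol error
print(correct_single_error(received) == codeword)   # True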
Development of a Coding Instrument to Assess the Quality and Content of Anti-Tobacco Video Games.
Alber, Julia M; Watson, Anna M; Barnett, Tracey E; Mercado, Rebeccah; Bernhardt, Jay M
2015-07-01
Previous research has shown the use of electronic video games as an effective method for increasing content knowledge about the risks of drugs and alcohol use for adolescents. Although best practice suggests that theory, health communication strategies, and game appeal are important characteristics for developing games, no instruments are currently available to examine the quality and content of tobacco prevention and cessation electronic games. This study presents the systematic development of a coding instrument to measure the quality, use of theory, and health communication strategies of tobacco cessation and prevention electronic games. Using previous research and expert review, a content analysis coding instrument measuring 67 characteristics was developed with three overarching categories: type and quality of games, theory and approach, and type and format of messages. Two trained coders applied the instrument to 88 games on four platforms (personal computer, Nintendo DS, iPhone, and Android phone) to field test the instrument. Cohen's kappa for each item ranged from 0.66 to 1.00, with an average kappa value of 0.97. Future research can adapt this coding instrument to games addressing other health issues. In addition, the instrument questions can serve as a useful guide for evidence-based game development.
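Inter-rater agreement figures like the kappa values reported above come from comparing the two coders item by item. Below is a minimal two-rater Cohen's kappa in Python; the codings are made-up values for illustration only.

from collections import Counter

def cohens_kappa(rater1, rater2):
    """Two-rater Cohen's kappa for nominal codes (e.g., per-item game characteristics)."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    p_observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    p_expected = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical codings of one yes/no item across ten games by two coders
coder_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
coder_b = [1, 1, 0, 1, 1, 1, 1, 0, 1, 1]
print(round(cohens_kappa(coder_a, coder_b), 2))   # 0.74 for these toy data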
What does music express? Basic emotions and beyond.
Juslin, Patrik N
2013-01-01
Numerous studies have investigated whether music can reliably convey emotions to listeners, and-if so-what musical parameters might carry this information. Far less attention has been devoted to the actual contents of the communicative process. The goal of this article is thus to consider what types of emotional content are possible to convey in music. I will argue that the content is mainly constrained by the type of coding involved, and that distinct types of content are related to different types of coding. Based on these premises, I suggest a conceptualization in terms of "multiple layers" of musical expression of emotions. The "core" layer is constituted by iconically-coded basic emotions. I attempt to clarify the meaning of this concept, dispel the myths that surround it, and provide examples of how it can be heuristic in explaining findings in this domain. However, I also propose that this "core" layer may be extended, qualified, and even modified by additional layers of expression that involve intrinsic and associative coding. These layers enable listeners to perceive more complex emotions-though the expressions are less cross-culturally invariant and more dependent on the social context and/or the individual listener. This multiple-layer conceptualization of expression in music can help to explain both similarities and differences between vocal and musical expression of emotions.
Development of a Coding Instrument to Assess the Quality and Content of Anti-Tobacco Video Games
Alber, Julia M.; Watson, Anna M.; Barnett, Tracey E.; Mercado, Rebeccah
2015-01-01
Abstract Previous research has shown the use of electronic video games as an effective method for increasing content knowledge about the risks of drugs and alcohol use for adolescents. Although best practice suggests that theory, health communication strategies, and game appeal are important characteristics for developing games, no instruments are currently available to examine the quality and content of tobacco prevention and cessation electronic games. This study presents the systematic development of a coding instrument to measure the quality, use of theory, and health communication strategies of tobacco cessation and prevention electronic games. Using previous research and expert review, a content analysis coding instrument measuring 67 characteristics was developed with three overarching categories: type and quality of games, theory and approach, and type and format of messages. Two trained coders applied the instrument to 88 games on four platforms (personal computer, Nintendo DS, iPhone, and Android phone) to field test the instrument. Cohen's kappa for each item ranged from 0.66 to 1.00, with an average kappa value of 0.97. Future research can adapt this coding instrument to games addressing other health issues. In addition, the instrument questions can serve as a useful guide for evidence-based game development. PMID:26167842
Redwan, R M; Saidin, A; Kumar, S V
2015-08-12
Pineapple (Ananas comosus var. comosus) is known as the king of fruits for its crown and is the third most important tropical fruit after banana and citrus. The plant, which is indigenous to South America, is the most important species in the Bromeliaceae family and is largely traded for fresh fruit consumption. Here, we report the complete chloroplast sequence of the MD-2 pineapple, sequenced using PacBio sequencing technology. In this study, the high error rate of the PacBio long reads of A. comosus total genomic DNA was mitigated by leveraging the highly accurate but short Illumina reads for error correction via the latest error correction module from Novocraft. The error-corrected long PacBio reads were assembled with a single tool to produce a contig representing the pineapple chloroplast genome. The 159,636 bp genome features the conserved quadripartite structure of chloroplasts, containing a large single copy region (LSC) of 87,482 bp, a small single copy region (SSC) of 18,622 bp, and two inverted repeat regions (IRA and IRB) of 26,766 bp each. Overall, the genome contains 117 unique coding regions, 30 of which are repeated in the IR regions, with gene content, structure, and arrangement similar to those of its sister taxon, Typha latifolia. A total of 35 repeat structures were detected in both the coding and non-coding regions, with the majority being tandem repeats. In addition, 205 SSRs were detected in the genome, with six protein-coding genes containing more than two SSRs. Comparison with chloroplast genomes from the subclass Commelinidae revealed conserved protein-coding genes, albeit located in highly divergent regions. Analysis of selection pressure on protein-coding genes using Ka/Ks ratios showed significant positive selection on the rps7 gene of the pineapple chloroplast (P less than 0.05). Phylogenetic analysis confirmed the recent taxonomic relationships among the members of the commelinids, supporting a monophyletic grouping of Arecales with Dasypogonaceae and of Zingiberales with the Poales, which includes A. comosus. The complete sequence of the pineapple chloroplast provides insights into the divergence of genic chloroplast sequences among members of the subclass Commelinidae. The complete pineapple chloroplast will serve as a reference for in-depth taxonomic studies in the Bromeliaceae family when more species in the family are sequenced in the future. The genetic sequence information will also make feasible other molecular applications of the pineapple chloroplast for plant genetic improvement.
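The quadripartite lengths reported above are internally consistent, and GC content of the kind quoted for plastomes is a direct base count over the sequence. A small Python check; the lengths are the ones stated in the abstract, while the toy sequence at the end is a placeholder, not pineapple data.

def gc_content(seq):
    """Fraction of G and C bases in a DNA sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

# Consistency check on the reported quadripartite structure:
# LSC + SSC + 2 * IR should equal the total genome length.
lsc, ssc, ir, total = 87_482, 18_622, 26_766, 159_636
assert lsc + ssc + 2 * ir == total

print(gc_content("ATGCGGCCAT"))   # toy sequence, not pineapple data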
Single-channel voice-response-system program documentation volume I : system description
DOT National Transportation Integrated Search
1977-01-01
This report documents the design and implementation of a Voice Response System (VRS) using Adaptive Differential Pulse Code Modulation (ADPCM) voice coding. Implemented on a Digital Equipment Corporation PDP-11/20, this VRS system supports a single ...
Natural variations in OsγTMT contribute to diversity of the α-tocopherol content in rice.
Wang, Xiao-Qiang; Yoon, Min-Young; He, Qiang; Kim, Tae-Sung; Tong, Wei; Choi, Bu-Woong; Lee, Young-Sang; Park, Yong-Jin
2015-12-01
Tocopherols and tocotrienols, collectively known as tocochromanols, are lipid-soluble molecules that belong to the group of vitamin E compounds. Among them, α-tocopherol (αΤ) is an antioxidant with diverse functions and benefits for humans and animals. Thus, understanding the genetic basis of this trait would be valuable for improving the nutritional quality of rice by breeding. Genome-wide association study (GWAS) has emerged as a powerful strategy for identifying genes or quantitative trait loci (QTL) underlying complex traits in plants. To discover the genes or QTLs underlying the naturally occurring variation in αΤ content in rice, we performed GWAS using 1.44 million high-quality single-nucleotide polymorphisms acquired from re-sequencing of 137 accessions from a diverse rice core collection. Thirteen candidate genes were found across two years of phenotypic data, among which gamma-tocopherol methyltransferase (OsγTMT) was identified as the major factor responsible for the αΤ content differences among rice accessions. Nucleotide variations in the coding region of OsγTMT were significantly associated with the αΤ content variations, while, according to our RNA expression analyses, nucleotide polymorphisms in the promoter region of OsγTMT could also partly account for the variation in αΤ content. This study provides useful information on the genetic factors underlying αΤ content variation in rice, which will significantly contribute to research on αΤ biosynthesis mechanisms and the improvement of αΤ content in rice.
Van, K; Onoda, S; Kim, M Y; Kim, K D; Lee, S-H
2008-03-01
The Waxy (Wx) gene product controls the formation of the straight-chain polymer amylose in the starch pathway. Dominance/recessiveness of the Wx allele is associated with amylose content, leading to non-waxy/waxy phenotypes. For a total of 113 foxtail millet accessions, agronomic traits and molecular differences in the Wx gene were surveyed to evaluate genetic diversity. Molecular types were associated with phenotypes determined by four specific primer sets (non-waxy, Type I; low amylose, Type VI; waxy, Type IV or V). Additionally, the insertion of a transposable element in waxy was confirmed by ex1/TSI2R, TSI2F/ex2, ex2int2/TSI7R and TSI7F/ex4r. Seventeen single nucleotide polymorphisms (SNPs) were observed in non-coding regions, while three SNPs in coding regions were non-synonymous. Interestingly, the phenotype of No. 88 was still non-waxy, although a seven-nucleotide (AATTGGT) insertion at 2,993 bp resulted in a protein 78 amino acids shorter. The rapid decline of r^2 in the sequenced region (exon 1-intron 1-exon 2) suggested a low level of linkage disequilibrium and limited haplotype structure. Ks values and the estimation of evolutionary events indicate an early divergence of S. italica among cereal crops. This study suggested that the Wx gene was one of the targets of selection during domestication.
Xie, Qing; Shen, Kang-Ning; Hao, Xiuying; Nam, Phan Nhut; Ngoc Hieu, Bui Thi; Chen, Ching-Hung; Zhu, Changqing; Lin, Yen-Chang; Hsiao, Chung-Der
2017-03-01
We decoded the complete chloroplast DNA (cpDNA) sequence of the Tianshan Snow Lotus (Saussurea involucrata), a famous traditional Chinese medicinal plant of the family Asteraceae, by using next-generation sequencing technology. The genome consists of 152 490 bp and contains a pair of inverted repeats (IRs) of 25 202 bp, separated by a large single-copy region and a small single-copy region of 83 446 bp and 18 639 bp, respectively. The genic regions account for 57.7% of the whole cpDNA, and the GC content of the cpDNA is 37.7%. The S. involucrata cpDNA encodes 114 unigenes (82 protein-coding genes, 4 rRNA genes, and 28 tRNA genes). There are eight protein-coding genes (atpF, ndhA, ndhB, rpl2, rpoC1, rps16, clpP, and ycf3) and five tRNA genes (trnA-UGC, trnI-GAU, trnK-UUU, trnL-UAA, and trnV-UAC) containing introns. A phylogenetic analysis of the 11 complete cpDNAs from Asteraceae showed that S. involucrata is closely related to Centaurea diffusa (Diffuse Knapweed). The complete cpDNA of S. involucrata provides essential and important DNA molecular data for further phylogenetic and evolutionary analysis of the Asteraceae.
ERIC Educational Resources Information Center
Du, Jie; Wimmer, Hayden; Rada, Roy
2018-01-01
This study investigates the delivery of the "Hour of Code" tutorials to college students. The college students who participated in this study were surveyed about their opinion of the Hour of Code. First, the students' comments were discussed. Next, a content analysis of the offered tutorials highlights their reliance on visual…
Theta Oscillations Rapidly Convey Odor-Specific Content in Human Piriform Cortex.
Jiang, Heidi; Schuele, Stephan; Rosenow, Joshua; Zelano, Christina; Parvizi, Josef; Tao, James X; Wu, Shasha; Gottfried, Jay A
2017-04-05
Olfactory oscillations are pervasive throughout vertebrate and invertebrate nervous systems. Such observations have long implied that rhythmic activity patterns play a fundamental role in odor coding. Using intracranial EEG recordings from rare patients with medically resistant epilepsy, we find that theta oscillations are a distinct electrophysiological signature of olfactory processing in the human brain. Across seven patients, odor stimulation enhanced theta power in human piriform cortex, with robust effects at the level of single trials. Importantly, classification analysis revealed that piriform oscillatory activity conveys olfactory-specific information that can be decoded within 110-518 ms of a sniff, and maximally within the theta frequency band. This temporal window was also associated with increased theta-specific phase coupling between piriform cortex and hippocampus. Together these findings suggest that human piriform cortex has access to olfactory content in the time-frequency domain and can utilize these signals to rapidly differentiate odor stimuli. Copyright © 2017 Elsevier Inc. All rights reserved.
Applying a rateless code in content delivery networks
NASA Astrophysics Data System (ADS)
Suherman; Zarlis, Muhammad; Parulian Sitorus, Sahat; Al-Akaidi, Marwan
2017-09-01
A content delivery network (CDN) allows internet providers to place their services and map their coverage onto networks without necessarily owning them. CDNs are part of the current internet infrastructure, supporting multi-server applications, especially social media. Various works have been proposed to improve CDN performance. Since accesses to social media servers tend to be short but frequent, adding redundancy to the transmitted packets, so that lost packets do not degrade information integrity, may improve service performance. This paper examines the implementation of a rateless code in the CDN infrastructure. The NS-2 evaluations show that the rateless code is able to reduce packet loss by up to 50%.
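Rateless (fountain) codes of the kind examined here generate as many coded packets as needed, each an XOR of a random subset of source packets, and a receiver can decode once it has collected slightly more packets than the original number. The Python sketch below is a toy LT-style encoder and peeling decoder with a uniform degree distribution; it illustrates the principle only and is not the specific code or NS-2 setup used in the paper.

import random

def lt_encode(source, n_coded, rng):
    """Produce n_coded LT-style coded packets from a list of integer source packets.
    Each coded packet is the XOR of a random subset of source packets; a uniform
    degree distribution is used here for brevity (a real LT code would use a
    robust soliton distribution)."""
    k = len(source)
    coded = []
    for _ in range(n_coded):
        d = rng.randint(1, k)
        idx = frozenset(rng.sample(range(k), d))
        value = 0
        for i in idx:
            value ^= source[i]
        coded.append((idx, value))
    return coded

def lt_decode(coded, k):
    """Peeling decoder: repeatedly resolve degree-1 packets and substitute back."""
    coded = [(set(idx), val) for idx, val in coded]
    recovered = {}
    progress = True
    while progress and len(recovered) < k:
        progress = False
        for idx, val in coded:
            unresolved = idx - recovered.keys()
            if len(unresolved) == 1:
                j = next(iter(unresolved))
                v = val
                for i in idx & recovered.keys():
                    v ^= recovered[i]
                recovered[j] = v
                progress = True
    return [recovered.get(i) for i in range(k)]

rng = random.Random(1)
source = [17, 42, 7, 99]                       # toy "packets"
coded = lt_encode(source, n_coded=8, rng=rng)  # slight redundancy over k = 4
print(lt_decode(coded, k=len(source)))         # recovers the source once enough useful
                                               # packets arrive; None for unrecovered entries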
Jahanbin, Arezoo; Farzanegan, Fahimeh; Atai, Mohammad; Jamehdar, Saeed Amel; Golfakhrabadi, Parvaneh; Shafaee, Hooman
2017-02-01
The aim of this 'split-mouth design' trial was to evaluate the effect of a nano amorphous calcium phosphate (NACP) containing composite on enamel mineral content and the Streptococcus mutans population in fixed orthodontic patients. Randomized, prospective, single-center controlled trial. Twenty-four patients between the ages of 13 and 18 years participated in this study. The control and test sides were randomly selected by a coin toss (1:1 ratio). On the control side, orthodontic brackets were bonded to the buccal surfaces of the upper premolars and laterals using an orthodontic composite (Transbond XT); on the study side, the NACP-containing composite was used. Outcome measures were the mineral content around the brackets and the S. mutans count; the latter was calculated from the plaque around the brackets by real-time PCR at 3 months and 6 months after the initiation of treatment. All stages of the study were blinded using a coding system. Paired t-tests and repeated measurements were used for data analysis. In the third and sixth months, the bacterial population was significantly lower on the study side than on the control side (P = 0.01 and 0.000). The mineral content of the study side was significantly higher than that of the controls 6 months after bracket bonding (P = 0.004). There were no significant differences between the premolars and lateral teeth for all measurements. This research was performed in a single center by one experienced clinician. NACP-containing composites have the potential to inhibit mineral content loss and S. mutans colonization around orthodontic brackets during fixed orthodontic treatment. This trial was not registered. The protocol was not published before trial commencement. © The Author 2016. Published by Oxford University Press on behalf of the European Orthodontic Society. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Detecting well-being via computerized content analysis of brief diary entries.
Tov, William; Ng, Kok Leong; Lin, Han; Qiu, Lin
2013-12-01
Two studies evaluated the correspondence between self-reported well-being and codings of emotion and life content by the Linguistic Inquiry and Word Count (LIWC; Pennebaker, Booth, & Francis, 2011). Open-ended diary responses were collected from 206 participants daily for 3 weeks (Study 1) and from 139 participants twice a week for 8 weeks (Study 2). LIWC negative emotion consistently correlated with self-reported negative emotion. LIWC positive emotion correlated with self-reported positive emotion in Study 1 but not in Study 2. No correlations were observed with global life satisfaction. Using a co-occurrence coding method to combine LIWC emotion codings with life-content codings, we estimated the frequency of positive and negative events in 6 life domains (family, friends, academics, health, leisure, and money). Domain-specific event frequencies predicted self-reported satisfaction in all domains in Study 1 but not consistently in Study 2. We suggest that the correspondence between LIWC codings and self-reported well-being is affected by the number of writing samples collected per day as well as the target period (e.g., past day vs. past week) assessed by the self-report measure. Extensions and possible implications for the analyses of similar types of open-ended data (e.g., social media messages) are discussed. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
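Dictionary-based coding of the kind LIWC performs, and the co-occurrence approach described above, can be sketched in a few lines. The word lists below are toy stand-ins (LIWC's dictionaries are proprietary and far larger), and the co-occurrence rule simply flags an entry that contains both an emotion word and a domain word.

# Toy word lists standing in for dictionary categories.
POSITIVE = {"happy", "glad", "enjoyed", "great"}
NEGATIVE = {"sad", "angry", "worried", "awful"}
ACADEMICS = {"exam", "class", "homework", "grade"}

def code_entry(text):
    """Count category hits in one diary entry and flag a positive/negative
    academic event when emotion and domain words co-occur in the same entry."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    counts = {
        "positive": len(words & POSITIVE),
        "negative": len(words & NEGATIVE),
        "academics": len(words & ACADEMICS),
    }
    counts["positive_academic_event"] = bool(counts["positive"] and counts["academics"])
    counts["negative_academic_event"] = bool(counts["negative"] and counts["academics"])
    return counts

print(code_entry("I was so happy with my exam grade today!"))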
Abstract feature codes: The building blocks of the implicit learning system.
Eberhardt, Katharina; Esser, Sarah; Haider, Hilde
2017-07-01
According to the Theory of Event Coding (TEC; Hommel, Müsseler, Aschersleben, & Prinz, 2001), action and perception are represented in a shared format in the cognitive system by means of feature codes. In implicit sequence learning research, it is still common to make a conceptual difference between independent motor and perceptual sequences. This supposedly independent learning takes place in encapsulated modules (Keele, Ivry, Mayr, Hazeltine, & Heuer 2003) that process information along single dimensions. These dimensions have remained underspecified so far. It is especially not clear whether stimulus and response characteristics are processed in separate modules. Here, we suggest that feature dimensions as they are described in the TEC should be viewed as the basic content of modules of implicit learning. This means that the modules process all stimulus and response information related to certain feature dimensions of the perceptual environment. In 3 experiments, we investigated by means of a serial reaction time task the nature of the basic units of implicit learning. As a test case, we used stimulus location sequence learning. The results show that a stimulus location sequence and a response location sequence cannot be learned without interference (Experiment 2) unless one of the sequences can be coded via an alternative, nonspatial dimension (Experiment 3). These results support the notion that spatial location is one module of the implicit learning system and, consequently, that there are no separate processing units for stimulus versus response locations. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Mobile Code: The Future of the Internet
1999-01-01
code (mobile agents) to multiple proxies or servers; "Customization" (e.g., re-formatting, filtering, metasearch); Information overload; Diversified... Mobile code is necessary, rather than client-side code, since many customization features (such as information monitoring) do not work if the... economic foundation for Web sites, many Web sites earn money solely from advertisements. If these sites allow mobile agents to easily access the content
Quantum steganography and quantum error-correction
NASA Astrophysics Data System (ADS)
Shaw, Bilal A.
Quantum error-correcting codes have been the cornerstone of research in quantum information science (QIS) for more than a decade. Without their conception, quantum computers would be a footnote in the history of science. When researchers embraced the idea that we live in a world where the effects of a noisy environment cannot completely be stripped away from the operations of a quantum computer, the natural way forward was to think about importing classical coding theory into the quantum arena to give birth to quantum error-correcting codes which could help in mitigating the debilitating effects of decoherence on quantum data. We first talk about the six-qubit quantum error-correcting code and show its connections to entanglement-assisted error-correcting coding theory and then to subsystem codes. This code bridges the gap between the five-qubit (perfect) and Steane codes. We discuss two methods to encode one qubit into six physical qubits. Each of the two examples corrects an arbitrary single-qubit error. The first example is a degenerate six-qubit quantum error-correcting code. We explicitly provide the stabilizer generators, encoding circuits, codewords, logical Pauli operators, and logical CNOT operator for this code. We also show how to convert this code into a non-trivial subsystem code that saturates the subsystem Singleton bound. We then prove that a six-qubit code without entanglement assistance cannot simultaneously possess a Calderbank-Shor-Steane (CSS) stabilizer and correct an arbitrary single-qubit error. A corollary of this result is that the Steane seven-qubit code is the smallest single-error correcting CSS code. Our second example is the construction of a non-degenerate six-qubit CSS entanglement-assisted code. This code uses one bit of entanglement (an ebit) shared between the sender (Alice) and the receiver (Bob) and corrects an arbitrary single-qubit error. The code we obtain is globally equivalent to the Steane seven-qubit code and thus corrects an arbitrary error on the receiver's half of the ebit as well. We prove that this code is the smallest code with a CSS structure that uses only one ebit and corrects an arbitrary single-qubit error on the sender's side. We discuss the advantages and disadvantages for each of the two codes. In the second half of this thesis we explore the yet uncharted and relatively undiscovered area of quantum steganography. Steganography is the process of hiding secret information by embedding it in an "innocent" message. We present protocols for hiding quantum information in a codeword of a quantum error-correcting code passing through a channel. Using either a shared classical secret key or shared entanglement Alice disguises her information as errors in the channel. Bob can retrieve the hidden information, but an eavesdropper (Eve) with the power to monitor the channel, but without the secret key, cannot distinguish the message from channel noise. We analyze how difficult it is for Eve to detect the presence of secret messages, and estimate rates of steganographic communication and secret key consumption for certain protocols. We also provide an example of how Alice hides quantum information in the perfect code when the underlying channel between Bob and her is the depolarizing channel. Using this scheme Alice can hide up to four stego-qubits.
A Mixed-Methods Study of Patient-Provider E-mail Content in a Safety-Net Setting
Mirsky, Jacob B.; Tieu, Lina; Lyles, Courtney; Sarkar, Urmimala
2016-01-01
Objective To explore the content of patient-provider e-mails in a safety-net primary care clinic. Methods We conducted a content analysis using inductive and deductive coding of e-mail exchanges (n=31) collected from January through November of 2013. Participants were English-speaking adult patients with a chronic condition (or their caregivers) cared for at a single publicly-funded general internal medicine clinic and their primary care providers (attending general internist physicians, clinical fellows, internal medicine residents, and nurse practitioners). Results All e-mails were non-urgent. Patients included a medical update in 19% of all e-mails. Patients requested action in 77% of e-mails, and the most common requests overall were for action regarding medications or treatment (29%). Requests for information were less common (45% of e-mails). Patient requests (n=56) were resolved in 84% of e-mail exchanges, resulting in 63 actions. Conclusion Patients in safety-net clinics are capable of safely and effectively using electronic messaging for between-visit communication with providers. Practical Implications Safety-net systems should implement electronic communications tools as soon as possible to increase healthcare access and enhance patient involvement in their care. PMID:26332306
An approach for coupled-code multiphysics core simulations from a common input
Schmidt, Rodney; Belcourt, Kenneth; Hooper, Russell; ...
2014-12-10
This study describes an approach for coupled-code multiphysics reactor core simulations that is being developed by the Virtual Environment for Reactor Applications (VERA) project in the Consortium for Advanced Simulation of Light-Water Reactors (CASL). In this approach a user creates a single problem description, called the “VERAIn” common input file, to define and set up the desired coupled-code reactor core simulation. A preprocessing step accepts the VERAIn file and generates a set of fully consistent input files for the different physics codes being coupled. The problem is then solved using a single-executable coupled-code simulation tool applicable to the problem, which is built using VERA infrastructure software tools and the set of physics codes required for the problem of interest. The approach is demonstrated by performing an eigenvalue and power distribution calculation of a typical three-dimensional 17 × 17 assembly with thermal–hydraulic and fuel temperature feedback. All neutronics aspects of the problem (cross-section calculation, neutron transport, power release) are solved using the Insilico code suite and are fully coupled to a thermal–hydraulic analysis calculated by the Cobra-TF (CTF) code. The single-executable coupled-code (Insilico-CTF) simulation tool is created using several VERA tools, including LIME (Lightweight Integrating Multiphysics Environment for coupling codes), DTK (Data Transfer Kit), Trilinos, and TriBITS. Parallel calculations are performed on the Titan supercomputer at Oak Ridge National Laboratory using 1156 cores, and a synopsis of the solution results and code performance is presented. Finally, ongoing development of this approach is also briefly described.
Content Analysis Coding Schemes for Online Asynchronous Discussion
ERIC Educational Resources Information Center
Weltzer-Ward, Lisa
2011-01-01
Purpose: Researchers commonly utilize coding-based analysis of classroom asynchronous discussion contributions as part of studies of online learning and instruction. However, this analysis is inconsistent from study to study with over 50 coding schemes and procedures applied in the last eight years. The aim of this article is to provide a basis…
NASA Technical Reports Server (NTRS)
Barry, Matthew R.; Osborne, Richard N.
2005-01-01
The RoseDoclet computer program extends the capability of Java doclet software to automatically synthesize Unified Modeling Language (UML) content from Java language source code. [Doclets are Java-language programs that use the doclet application programming interface (API) to specify the content and format of the output of Javadoc. Javadoc is a program, originally designed to generate API documentation from Java source code, now also useful as an extensible engine for processing Java source code.] RoseDoclet takes advantage of Javadoc comments and tags already in the source code to produce a UML model of that code. RoseDoclet applies the doclet API to create a doclet passed to Javadoc. The Javadoc engine applies the doclet to the source code, emitting the output format specified by the doclet. RoseDoclet emits a Rose model file and populates it with fully documented packages, classes, methods, variables, and class diagrams identified in the source code. The way in which UML models are generated can be controlled by use of new Javadoc comment tags that RoseDoclet provides. The advantage of using RoseDoclet is that Javadoc documentation becomes leveraged for two purposes: documenting the as-built API and keeping the design documentation up to date.
What does music express? Basic emotions and beyond
Juslin, Patrik N.
2013-01-01
Numerous studies have investigated whether music can reliably convey emotions to listeners, and—if so—what musical parameters might carry this information. Far less attention has been devoted to the actual contents of the communicative process. The goal of this article is thus to consider what types of emotional content are possible to convey in music. I will argue that the content is mainly constrained by the type of coding involved, and that distinct types of content are related to different types of coding. Based on these premises, I suggest a conceptualization in terms of “multiple layers” of musical expression of emotions. The “core” layer is constituted by iconically-coded basic emotions. I attempt to clarify the meaning of this concept, dispel the myths that surround it, and provide examples of how it can be heuristic in explaining findings in this domain. However, I also propose that this “core” layer may be extended, qualified, and even modified by additional layers of expression that involve intrinsic and associative coding. These layers enable listeners to perceive more complex emotions—though the expressions are less cross-culturally invariant and more dependent on the social context and/or the individual listener. This multiple-layer conceptualization of expression in music can help to explain both similarities and differences between vocal and musical expression of emotions. PMID:24046758
Assessing Teachers' Science Content Knowledge: A Strategy for Assessing Depth of Understanding
NASA Astrophysics Data System (ADS)
McConnell, Tom J.; Parker, Joyce M.; Eberhardt, Jan
2013-06-01
One of the characteristics of effective science teachers is a deep understanding of science concepts. The ability to identify, explain and apply concepts is critical in designing, delivering and assessing instruction. Because some teachers have not completed extensive courses in some areas of science, especially in middle and elementary grades, many professional development programs attempt to strengthen teachers' content knowledge. Assessing this content knowledge is challenging. Concept inventories are reliable and efficient, but do not reveal depth of knowledge. Interviews and observations are time-consuming. The Problem Based Learning Project for Teachers implemented a strategy that includes pre-post instruments in eight content strands that permits blind coding of responses and comparison across teachers and groups of teachers. The instruments include two types of open-ended questions that assess both general knowledge and the ability to apply Big Ideas related to specific science topics. The coding scheme is useful in revealing patterns in prior knowledge and learning, and identifying ideas that are challenging or not addressed by learning activities. The strengths and limitations of the scoring scheme are identified through comparison of the findings to case studies of four participating teachers from middle and elementary schools. The cases include examples of coded pre- and post-test responses to illustrate some of the themes seen in teacher learning. The findings raise questions for future investigation that can be conducted using analyses of the coded responses.
Malt Beverage Brand Popularity Among Youth and Youth-Appealing Advertising Content.
Xuan, Ziming; DeJong, William; Siegel, Michael; Babor, Thomas F
2017-11-01
This study examined whether alcohol brands more popular among youth are more likely to have aired television advertisements that violated the alcohol industry's voluntary code by including youth-appealing content. We obtained a complete list of 288 brand-specific beer advertisements broadcast during the National Collegiate Athletic Association (NCAA) men's and women's basketball tournaments from 1999 to 2008. All ads were rated by a panel of health professionals using a modified Delphi method to assess the presence of youth-appealing content in violation of the alcohol industry's voluntary code. The ads represented 23 alcohol brands. The popularity of these brands was operationalized as the brand-specific popularity of youth alcohol consumption in the past 30 days, as determined by a 2011 to 2012 national survey of underage drinkers. Brand-level popularity was used as the exposure variable to predict the odds of having advertisements with youth-appealing content violations. Accounting for other covariates and the clustering of advertisements within brands, increased brand popularity among underage youth was associated with significantly increased odds of having youth-appeal content violations in ads televised during the NCAA basketball tournament games (adjusted odds ratio = 1.70, 95% CI: 1.38, 2.09). Alcohol brands popular among underage drinkers are more likely to air television advertising that violates the industry's voluntary code which proscribes youth-appealing content. Copyright © 2017 by the Research Society on Alcoholism.
Coding “What” and “When” in the Archer Fish Retina
Vasserman, Genadiy; Shamir, Maoz; Ben Simon, Avi; Segev, Ronen
2010-01-01
Traditionally, the information content of the neural response is quantified using statistics of the responses relative to stimulus onset time with the assumption that the brain uses onset time to infer stimulus identity. However, stimulus onset time must also be estimated by the brain, making the utility of such an approach questionable. How can stimulus onset be estimated from the neural responses with sufficient accuracy to ensure reliable stimulus identification? We address this question using the framework of colour coding by the archer fish retinal ganglion cell. We found that stimulus identity, “what”, can be estimated from the responses of best single cells with an accuracy comparable to that of the animal's psychophysical estimation. However, to extract this information, an accurate estimation of stimulus onset is essential. We show that stimulus onset time, “when”, can be estimated using a linear-nonlinear readout mechanism that requires the response of a population of 100 cells. Thus, stimulus onset time can be estimated using a relatively simple readout. However, large nerve cell populations are required to achieve sufficient accuracy. PMID:21079682
NASA Astrophysics Data System (ADS)
Nitadori, Keigo; Makino, Junichiro; Hut, Piet
2006-12-01
The main performance bottleneck of gravitational N-body codes is the force calculation between two particles. We have succeeded in speeding up this pair-wise force calculation by factors between 2 and 10, depending on the code and the processor on which the code is run. These speed-ups were obtained by writing highly fine-tuned code for x86_64 microprocessors. Any existing N-body code, running on these chips, can easily incorporate our assembly code programs. In the current paper, we present an outline of our overall approach, which we illustrate with one specific example: the use of a Hermite scheme for a direct N^2-type integration on a single 2.0 GHz Athlon 64 processor, for which we obtain an effective performance of 4.05 Gflops, for double-precision accuracy. In subsequent papers, we will discuss other variations, including the combinations of N log N codes, single-precision implementations, and performance on other microprocessors.
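The pair-wise force kernel described above is, in essence, the double loop below. This NumPy sketch (with an assumed softening length and G = 1) shows the O(N^2) structure that such codes hand-tune; it is an illustration of the computation, not the authors' assembly implementation.

import numpy as np

def accelerations(pos, mass, eps=1.0e-4):
    """Direct-summation gravitational accelerations for N particles (G = 1).

    pos: (N, 3) positions, mass: (N,) masses, eps: softening length (assumed).
    This is the pair-wise O(N^2) kernel that dominates the cost of direct N-body codes."""
    n = len(mass)
    acc = np.zeros_like(pos)
    for i in range(n):
        dr = pos - pos[i]                                   # vectors to all other particles
        r2 = np.einsum("ij,ij->i", dr, dr) + eps ** 2       # softened squared distances
        r2[i] = 1.0                                         # placeholder to avoid divide-by-zero
        inv_r3 = r2 ** -1.5
        inv_r3[i] = 0.0                                     # exclude the self-interaction
        acc[i] = np.sum((mass * inv_r3)[:, None] * dr, axis=0)
    return acc

rng = np.random.default_rng(0)
pos = rng.standard_normal((4, 3))
mass = np.ones(4)
print(accelerations(pos, mass))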
Spatial transform coding of color images.
NASA Technical Reports Server (NTRS)
Pratt, W. K.
1971-01-01
The application of the transform-coding concept to the coding of color images represented by three primary color planes of data is discussed. The principles of spatial transform coding are reviewed and the merits of various methods of color-image representation are examined. A performance analysis is presented for the color-image transform-coding system. Results of a computer simulation of the coding system are also given. It is shown that, by transform coding, the chrominance content of a color image can be coded with an average of 1.0 bits per element or less without serious degradation. If luminance coding is also employed, the average rate reduces to about 2.0 bits per element or less.
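A minimal sketch of the transform-coding idea for color images: convert RGB to a luminance/chrominance representation, transform each plane, and quantize the chrominance planes more coarsely than the luminance plane. The sketch below uses the DCT and the BT.601 color conversion purely as stand-ins; the transform and color representation actually used in the 1971 system may differ.

import numpy as np
from scipy.fft import dctn, idctn

def rgb_to_ycbcr(img):
    """Convert an RGB image (floats in [0, 1]) to Y, Cb, Cr planes (ITU-R BT.601)."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def transform_code_plane(plane, step):
    """2D DCT of one plane, uniform quantisation with the given step, inverse DCT.
    A larger step (coarser quantisation, fewer bits) is used for chrominance than luminance."""
    coeffs = dctn(plane, norm="ortho")
    quantised = np.round(coeffs / step) * step
    return idctn(quantised, norm="ortho")

rng = np.random.default_rng(0)
img = rng.random((8, 8, 3))                       # toy 8x8 RGB block
y, cb, cr = rgb_to_ycbcr(img)
y_hat  = transform_code_plane(y,  step=0.02)      # fine quantisation for luminance
cb_hat = transform_code_plane(cb, step=0.10)      # coarse quantisation for chrominance
cr_hat = transform_code_plane(cr, step=0.10)
print(np.abs(y - y_hat).max(), np.abs(cb - cb_hat).max())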
Complete plastid genome sequence of goosegrass (Eleusine indica) and comparison with other Poaceae.
Zhang, Hui; Hall, Nathan; McElroy, J Scott; Lowe, Elijah K; Goertzen, Leslie R
2017-02-05
Eleusine indica, also known as goosegrass, is a serious weed in at least 42 countries. In this paper we report the complete plastid genome sequence of goosegrass obtained by de novo assembly of paired-end and mate-paired reads generated by Illumina sequencing of total genomic DNA. The goosegrass plastome is a circular molecule of 135,151 bp in length, consisting of two single-copy regions separated by a pair of inverted repeats (IRs) of 20,919 bases. The large (LSC) and the small (SSC) single-copy regions span 80,667 bases and 12,646 bases, respectively. The plastome of goosegrass has 38.19% GC content and includes 108 unique genes, of which 76 are protein-coding, 28 are transfer RNA, and 4 are ribosomal RNA. The goosegrass plastome sequence was compared to eight other species of Poaceae. Although generally conserved with respect to Poaceae, this genomic resource will be useful for evolutionary studies within this weed species and the genus Eleusine. Copyright © 2016. Published by Elsevier B.V.
Neural signatures of attention: insights from decoding population activity patterns.
Sapountzis, Panagiotis; Gregoriou, Georgia G
2018-01-01
Understanding brain function and the computations that individual neurons and neuronal ensembles carry out during cognitive functions is one of the biggest challenges in neuroscientific research. To this end, invasive electrophysiological studies have provided important insights by recording the activity of single neurons in behaving animals. To average out noise, responses are typically averaged across repetitions and across neurons that are usually recorded on different days. However, the brain makes decisions on short time scales based on limited exposure to sensory stimulation by interpreting responses of populations of neurons on a moment to moment basis. Recent studies have employed machine-learning algorithms in attention and other cognitive tasks to decode the information content of distributed activity patterns across neuronal ensembles on a single trial basis. Here, we review results from studies that have used pattern-classification decoding approaches to explore the population representation of cognitive functions. These studies have offered significant insights into population coding mechanisms. Moreover, we discuss how such advances can aid the development of cognitive brain-computer interfaces.
Experimental implementation of the Bacon-Shor code with 10 entangled photons
NASA Astrophysics Data System (ADS)
Gimeno-Segovia, Mercedes; Sanders, Barry C.
The number of qubits that can be effectively controlled in quantum experiments is growing, reaching a regime where small quantum error-correcting codes can be tested. The Bacon-Shor code is a simple quantum code that protects against the effect of an arbitrary single-qubit error. In this work, we propose an experimental implementation of said code in a post-selected linear optical setup, similar to the recently reported 10-photon GHZ generation experiment. In the procedure we propose, an arbitrary state is encoded into the protected Shor code subspace, and after undergoing a controlled single-qubit error, is successfully decoded. BCS appreciates financial support from Alberta Innovates, NSERC, China's 1000 Talent Plan and the Institute for Quantum Information and Matter, which is an NSF Physics Frontiers Center (NSF Grant PHY-1125565) with support of the Moore Foundation (GBMF-2644).
NASA Astrophysics Data System (ADS)
Trejos, Sorayda; Fredy Barrera, John; Torroba, Roberto
2015-08-01
We present for the first time an optical encrypting-decrypting protocol for recovering messages without speckle noise. This is a digital holographic technique using a 2f scheme to process QR code entries. In the procedure, letters used to compose eventual messages are individually converted into a QR code, and then each QR code is divided into portions. Through a holographic technique, we store each processed portion. After filtering and repositioning, we add all processed data to create a single pack, thus simplifying the handling and recovery of multiple QR code images, representing the first multiplexing procedure applied to processed QR codes. All QR codes are recovered in a single step and in the same plane, showing neither cross-talk nor noise problems as in other methods. Experiments have been conducted using an interferometric configuration, and comparisons between unprocessed and recovered QR codes have been performed, showing differences between them due to the involved processing. Recovered QR codes can be successfully scanned, thanks to their noise tolerance. Finally, the appropriate sequence in the scanning of the recovered QR codes brings a noiseless retrieved message. Additionally, to procure maximum security, the multiplexed pack could be multiplied by a digital diffuser so as to encrypt it. The encrypted pack is easily decoded by multiplying it by the complex conjugate of the diffuser. As it is a digital operation, no noise is added. Therefore, this technique is threefold robust, involving multiplexing, encryption, and the need for a sequence to retrieve the outcome.
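The final encryption step described above reduces to an element-wise multiplication in the digital domain. A minimal NumPy sketch, with a random-phase diffuser and a random array standing in for the multiplexed QR-code pack used experimentally:

```python
import numpy as np

rng = np.random.default_rng(7)
pack = rng.random((256, 256))                                  # stand-in multiplexed pack
diffuser = np.exp(1j * 2 * np.pi * rng.random(pack.shape))     # random-phase digital diffuser

encrypted = pack * diffuser                # encryption: element-wise multiplication
decrypted = encrypted * np.conj(diffuser)  # decryption: multiply by the complex conjugate

# |diffuser| = 1 everywhere, so the pack is recovered exactly and no noise is added.
assert np.allclose(decrypted.real, pack)
```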
Channel coding for underwater acoustic single-carrier CDMA communication system
NASA Astrophysics Data System (ADS)
Liu, Lanjun; Zhang, Yonglei; Zhang, Pengcheng; Zhou, Lin; Niu, Jiong
2017-01-01
CDMA is an effective multiple access protocol for underwater acoustic networks, and channel coding can effectively reduce the bit error rate (BER) of the underwater acoustic communication system. For the requirements of underwater acoustic mobile networks based on CDMA, an underwater acoustic single-carrier CDMA communication system (UWA/SCCDMA) based on the direct-sequence spread spectrum is proposed, and its channel coding scheme is studied based on convolutional, RA, Turbo and LDPC coding, respectively. The implementation steps of the Viterbi algorithm for convolutional coding, the BP and minimum-sum algorithms for RA coding, the Log-MAP and SOVA algorithms for Turbo coding, and the sum-product algorithm for LDPC coding are given. A UWA/SCCDMA simulation system based on Matlab is designed. Simulation results show that the UWA/SCCDMA systems based on RA, Turbo and LDPC coding all achieve a communication BER below 10^-6 in an underwater acoustic channel with a low signal-to-noise ratio (SNR) of -12 dB to -10 dB, which is about two orders of magnitude lower than that of convolutional coding. The system based on Turbo coding with the Log-MAP algorithm has the best performance.
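As a hedged illustration of the direct-sequence spreading that underlies such a CDMA system (channel coding omitted), the sketch below spreads BPSK symbols with a user's pseudo-noise code, adds strong Gaussian noise, and despreads by correlation; the code length and noise level are arbitrary choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n_bits, spread_len = 200, 63

bits = rng.integers(0, 2, n_bits)
symbols = 2 * bits - 1                                # BPSK mapping
pn = 2 * rng.integers(0, 2, spread_len) - 1           # user's pseudo-noise spreading code

chips = np.repeat(symbols, spread_len) * np.tile(pn, n_bits)   # direct-sequence spreading
noisy = chips + rng.normal(0, 3.0, chips.size)                  # additive channel noise

# Despreading: correlate each chip block with the PN code, then hard-decide.
blocks = noisy.reshape(n_bits, spread_len)
decisions = (blocks @ pn) > 0
ber = np.mean(decisions != bits.astype(bool))
print(f"BER without channel coding: {ber:.3f}")
```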
NASA World Wind: A New Mission
NASA Astrophysics Data System (ADS)
Hogan, P.; Gaskins, T.; Bailey, J. E.
2008-12-01
Virtual Globes are well into their first generation, providing increasingly rich and beautiful visualization of more types and quantities of information. However, they are still mostly single and proprietary programs, akin to a web browser whose content and functionality are controlled and constrained largely by the browser's manufacturer. Today Google and Microsoft determine what we can and cannot see and do in these programs. NASA World Wind started out in nearly the same mode, a single program with limited functionality and information content. But as the possibilities of virtual globes became more apparent, we found that while enabling a new class of information visualization, we were also getting in the way. Many users want to provide World Wind functionality and information in their programs, not ours. They want it in their web pages. They want to include their own features. They told us that only with this kind of flexibility, could their objectives and the potential of the technology be truly realized. World Wind therefore changed its mission: from providing a single information browser to enabling a whole class of 3D geographic applications. Instead of creating one program, we create components to be used in any number of programs. World Wind is NASA open source software. With the source code being fully visible, anyone can readily use it and freely extend it to serve any use. Imagery and other information provided by the World Wind servers is also free and unencumbered, including the server technology to deliver geospatial data. World Wind developers can therefore provide exclusive and custom solutions based on user needs.
Cimino, James J.; Ayres, Elaine J.; Remennik, Lyubov; Rath, Sachi; Freedman, Robert; Beri, Andrea; Chen, Yang; Huser, Vojtech
2013-01-01
The US National Institutes of Health (NIH) has developed the Biomedical Translational Research Information System (BTRIS) to support researchers’ access to translational and clinical data. BTRIS includes a data repository, a set of programs for loading data from NIH electronic health records and research data management systems, an ontology for coding the disparate data with a single terminology, and a set of user interface tools that provide access to identified data from individual research studies and data across all studies from which individually identifiable data have been removed. This paper reports on unique design elements of the system, progress to date and user experience after five years of development and operation. PMID:24262893
SHIPMENT OF TWO DOE-STD-3013 CONTAINERS IN A 9977 TYPE B PACKAGE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abramczyk, G.; Bellamy, S.; Loftin, B.
2011-06-06
The 9977 is a certified Type B Packaging authorized to ship uranium and plutonium in metal and oxide forms. Historically, the standard container for these materials has been the DOE-STD-3013, which was specifically designed for the long-term storage of plutonium-bearing materials. The Department of Energy has used the 9975 Packaging containing a single 3013 container for the transportation and storage of these materials. In order to reduce container, shipping, and storage costs, the 9977 Packaging is being certified for transportation and storage of two 3013 containers. The challenges and risks of this content and the 9977's ability to meet the Code of Federal Regulations for the transport of these materials are presented.
Content Representation in the Human Medial Temporal Lobe
Liang, Jackson C.; Wagner, Anthony D.
2013-01-01
Current theories of medial temporal lobe (MTL) function focus on event content as an important organizational principle that differentiates MTL subregions. Perirhinal and parahippocampal cortices may play content-specific roles in memory, whereas hippocampal processing is alternately hypothesized to be content specific or content general. Despite anatomical evidence for content-specific MTL pathways, empirical data for content-based MTL subregional dissociations are mixed. Here, we combined functional magnetic resonance imaging with multiple statistical approaches to characterize MTL subregional responses to different classes of novel event content (faces, scenes, spoken words, sounds, visual words). Univariate analyses revealed that responses to novel faces and scenes were distributed across the anterior–posterior axis of MTL cortex, with face responses distributed more anteriorly than scene responses. Moreover, multivariate pattern analyses of perirhinal and parahippocampal data revealed spatially organized representational codes for multiple content classes, including nonpreferred visual and auditory stimuli. In contrast, anterior hippocampal responses were content general, with less accurate overall pattern classification relative to MTL cortex. Finally, posterior hippocampal activation patterns consistently discriminated scenes more accurately than other forms of content. Collectively, our findings indicate differential contributions of MTL subregions to event representation via a distributed code along the anterior–posterior axis of MTL that depends on the nature of event content. PMID:22275474
McGill, Susan E; Barker, Daniel
2017-07-20
" Candidatus Ruthia magnifica", "Candidatus Vesicomyosocius okutanii" and Thiomicrospira crunogena are all sulfur-oxidising bacteria found in deep-sea vent environments. Recent research suggests that the two symbiotic organisms, "Candidatus R. magnifica" and "Candidatus V. okutanii", may share common ancestry with the autonomously living species T. crunogena. We used comparative genomics to examine the genome-wide protein-coding content of all three species to explore their similarities. In particular, we used the OrthoMCL algorithm to sort proteins into groups of putative orthologs on the basis of sequence similarity. The OrthoMCL inflation parameter was tuned using biological criteria. Using the tuned value, OrthoMCL delimited 1070 protein groups. 63.5% of these groups contained one protein from each species. Two groups contained duplicate protein copies from all three species. 123 groups were unique to T. crunogena and ten groups included multiple copies of T. crunogena proteins but only single copies from the other species. "Candidatus R. magnifica" had one unique group, and had multiple copies in one group where the other species had a single copy. There were no groups unique to "Candidatus V. okutanii", and no groups in which there were multiple "Candidatus V. okutanii" proteins but only single proteins from the other species. Results align with previous suggestions that all three species share a common ancestor. However this is not definitive evidence to make taxonomic conclusions and the possibility of horizontal gene transfer was not investigated. Methodologically, the tuning of the OrthoMCL inflation parameter using biological criteria provides further methods to refine the OrthoMCL procedure.
Reduction of PAPR in coded OFDM using fast Reed-Solomon codes over prime Galois fields
NASA Astrophysics Data System (ADS)
Motazedi, Mohammad Reza; Dianat, Reza
2017-02-01
In this work, two new techniques using Reed-Solomon (RS) codes over GF(257) and GF(65,537) are proposed for peak-to-average power ratio (PAPR) reduction in coded orthogonal frequency division multiplexing (OFDM) systems. The lengths of these codes are well-matched to the length of OFDM frames. Over these fields, the block lengths of codes are powers of two and we fully exploit the radix-2 fast Fourier transform algorithms. Multiplications and additions are simple modulus operations. These codes provide desirable randomness with a small perturbation in information symbols that is essential for generation of different statistically independent candidates. Our simulations show that the PAPR reduction ability of RS codes is the same as that of conventional selected mapping (SLM), but contrary to SLM, we can get error correction capability. Also for the second proposed technique, the transmission of side information is not needed. To the best of our knowledge, this is the first work using RS codes for PAPR reduction in single-input single-output systems.
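The appeal of GF(257) noted above is that arithmetic reduces to ordinary integer operations modulo 257 and that the multiplicative group has order 256, so transform lengths that are powers of two are available. A naive (O(n^2)) discrete Fourier transform over GF(257) is sketched below, using 3 as a primitive root; a radix-2 fast version follows the same structure as the usual FFT. This is an illustrative reconstruction, not the authors' implementation.

```python
# Arithmetic over GF(257): plain integer operations modulo the prime 257.
P = 257
PRIMITIVE_ROOT = 3          # 3 generates the multiplicative group of GF(257)

def gf_dft(x, inverse=False):
    """Naive length-n DFT over GF(257); n must divide 256."""
    n = len(x)
    assert 256 % n == 0
    omega = pow(PRIMITIVE_ROOT, 256 // n, P)          # primitive n-th root of unity
    if inverse:
        omega = pow(omega, P - 2, P)                  # modular inverse via Fermat's little theorem
    out = [sum(xj * pow(omega, i * j, P) for j, xj in enumerate(x)) % P for i in range(n)]
    if inverse:
        n_inv = pow(n, P - 2, P)
        out = [(v * n_inv) % P for v in out]
    return out

data = [5, 17, 200, 3, 0, 0, 0, 0]                    # length 8 divides 256
assert gf_dft(gf_dft(data), inverse=True) == data      # round trip recovers the data
```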
NAEYC Code of Ethical Conduct. Revised = Codigo de Conducta Etica. Revisada
ERIC Educational Resources Information Center
National Association of Elementary School Principals (NAESP), 2005
2005-01-01
This document presents a code of ethics for early childhood educators that offers guidelines for responsible behavior and sets forth a common basis for resolving ethical dilemmas encountered in early education. It represents the English and Spanish versions of the revised code. Its contents were approved by the NAEYC Governing Board in April 2005…
A Computer Program for Flow-Log Analysis of Single Holes (FLASH)
Day-Lewis, F. D.; Johnson, C.D.; Paillet, Frederick L.; Halford, K.J.
2011-01-01
A new computer program, FLASH (Flow-Log Analysis of Single Holes), is presented for the analysis of borehole vertical flow logs. The code is based on an analytical solution for steady-state multilayer radial flow to a borehole. The code includes options for (1) discrete fractures and (2) multilayer aquifers. Given vertical flow profiles collected under both ambient and stressed (pumping or injection) conditions, the user can estimate fracture (or layer) transmissivities and far-field hydraulic heads. FLASH is coded in Microsoft Excel with Visual Basic for Applications routines. The code supports manual and automated model calibration. © 2011, The Author(s). Ground Water © 2011, National Ground Water Association.
A coded tracking telemetry system
Howey, P.W.; Seegar, W.S.; Fuller, M.R.; Titus, K.; Amlaner, Charles J.
1989-01-01
We describe the general characteristics of an automated radio telemetry system designed to operate for prolonged periods on a single frequency. Each transmitter sends a unique coded signal to a receiving system that decodes and records only the appropriate, pre-programmed codes. A record of the time of each reception is stored on diskettes in a micro-computer. This system enables continuous monitoring of infrequent signals (e.g. one per minute or one per hour), thus extending operation life or allowing size reduction of the transmitter, compared to conventional wildlife telemetry. Furthermore, when using unique codes transmitted on a single frequency, biologists can monitor many individuals without exceeding the radio frequency allocations for wildlife.
14 CFR 201.4 - General provisions concerning contents.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Title 14, Aeronautics and Space; 2010-01-01; Section 201.4, General provisions concerning contents. OFFICE OF THE SECRETARY, DEPARTMENT OF TRANSPORTATION... UNITED STATES CODE-[AMENDED] Application Procedures § 201.4 General provisions concerning contents. (a...
Scene-aware joint global and local homographic video coding
NASA Astrophysics Data System (ADS)
Peng, Xiulian; Xu, Jizheng; Sullivan, Gary J.
2016-09-01
Perspective motion is commonly represented in video content that is captured and compressed for various applications including cloud gaming, vehicle and aerial monitoring, etc. Existing approaches based on an eight-parameter homography motion model cannot deal with this efficiently, either due to low prediction accuracy or excessive bit rate overhead. In this paper, we consider the camera motion model and scene structure in such video content and propose a joint global and local homography motion coding approach for video with perspective motion. The camera motion is estimated by a computer vision approach, and camera intrinsic and extrinsic parameters are globally coded at the frame level. The scene is modeled as piece-wise planes, and three plane parameters are coded at the block level. Fast gradient-based approaches are employed to search for the plane parameters for each block region. In this way, improved prediction accuracy and low bit costs are achieved. Experimental results based on the HEVC test model show that up to 9.1% bit rate savings can be achieved (with equal PSNR quality) on test video content with perspective motion. Test sequences for the example applications showed a bit rate savings ranging from 3.7 to 9.1%.
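A hedged sketch of the geometric relation behind this scheme: given the globally coded camera intrinsics and frame-to-frame rotation/translation, and a per-block plane with normal n and distance d, the plane-induced homography between frames is H = K (R - t n^T / d) K^(-1). All numeric values below are invented for illustration, not parameters from the paper.

```python
import numpy as np

# Illustrative values only: intrinsics K, frame-to-frame rotation R and translation t
# (the globally coded camera motion), and one block's plane parameters (n, d)
# as the locally coded scene structure.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
theta = np.deg2rad(2.0)
R = np.array([[np.cos(theta), 0, np.sin(theta)],
              [0, 1, 0],
              [-np.sin(theta), 0, np.cos(theta)]])
t = np.array([[0.05], [0.0], [0.01]])
n = np.array([[0.0], [0.0], [1.0]])     # plane normal in the reference camera frame
d = 4.0                                 # plane distance from the reference camera

# Plane-induced homography between the two frames.
H = K @ (R - (t @ n.T) / d) @ np.linalg.inv(K)

# Predict where a pixel of the reference frame lands in the current frame.
x_ref = np.array([400.0, 300.0, 1.0])
x_pred = H @ x_ref
x_pred /= x_pred[2]
print(x_pred[:2])
```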
Coding visual features extracted from video sequences.
Baroffio, Luca; Cesana, Matteo; Redondi, Alessandro; Tagliasacchi, Marco; Tubaro, Stefano
2014-05-01
Visual features are successfully exploited in several applications (e.g., visual search, object recognition and tracking, etc.) due to their ability to efficiently represent image content. Several visual analysis tasks require features to be transmitted over a bandwidth-limited network, thus calling for coding techniques to reduce the required bit budget, while attaining a target level of efficiency. In this paper, we propose, for the first time, a coding architecture designed for local features (e.g., SIFT, SURF) extracted from video sequences. To achieve high coding efficiency, we exploit both spatial and temporal redundancy by means of intraframe and interframe coding modes. In addition, we propose a coding mode decision based on rate-distortion optimization. The proposed coding scheme can be conveniently adopted to implement the analyze-then-compress (ATC) paradigm in the context of visual sensor networks. That is, sets of visual features are extracted from video frames, encoded at remote nodes, and finally transmitted to a central controller that performs visual analysis. This is in contrast to the traditional compress-then-analyze (CTA) paradigm, in which video sequences acquired at a node are compressed and then sent to a central unit for further processing. In this paper, we compare these coding paradigms using metrics that are routinely adopted to evaluate the suitability of visual features in the context of content-based retrieval, object recognition, and tracking. Experimental results demonstrate that, thanks to the significant coding gains achieved by the proposed coding scheme, ATC outperforms CTA with respect to all evaluation metrics.
NASA Astrophysics Data System (ADS)
Hanhart, Philippe; Řeřábek, Martin; Ebrahimi, Touradj
2015-09-01
This paper reports the details and results of the subjective evaluations conducted at EPFL to evaluate the responses to the Call for Evidence (CfE) for High Dynamic Range (HDR) and Wide Color Gamut (WCG) Video Coding issued by the Moving Picture Experts Group (MPEG). The CfE on HDR/WCG Video Coding aims to explore whether the coding efficiency and/or the functionality of the current version of the HEVC standard can be significantly improved for HDR and WCG content. In total, nine submissions, five for Category 1 and four for Category 3a, were compared to the HEVC Main 10 Profile based Anchor. More particularly, five HDR video contents, compressed at four bit rates by each proponent responding to the CfE, were used in the subjective evaluations. Further, the side-by-side presentation methodology was used for the subjective experiment to discriminate small differences between the Anchor and proponents. Subjective results show that the proposals provide evidence that the coding efficiency can be improved in a statistically noticeable way over the MPEG CfE Anchors in terms of perceived quality within the investigated content. The paper further benchmarks the selected objective metrics based on their correlations with the subjective ratings. It is shown that PSNR-DE1000, HDR-VDP-2, and PSNR-Lx can reliably detect visible differences between the proposed encoding solutions and the current HEVC standard.
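The benchmarking step described above boils down to correlating each objective metric's scores with the subjective ratings. A minimal sketch with made-up numbers (the actual mean opinion scores and metric outputs are in the paper):

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

# Hypothetical mean opinion scores and one objective metric's scores for the
# same set of compressed HDR sequences (values invented for illustration).
mos = np.array([2.1, 3.4, 4.0, 4.6, 1.8, 3.0, 4.2, 2.7])
metric = np.array([31.0, 36.5, 40.2, 43.0, 29.5, 35.0, 41.5, 33.8])

plcc, _ = pearsonr(metric, mos)      # linear correlation with subjective ratings
srocc, _ = spearmanr(metric, mos)    # rank-order correlation
print(f"PLCC={plcc:.2f}, SROCC={srocc:.2f}")
```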
Li, Yun Bo; Li, Lian Lin; Xu, Bai Bing; Wu, Wei; Wu, Rui Yuan; Wan, Xiang; Cheng, Qiang; Cui, Tie Jun
2016-01-01
The programmable and digital metamaterials or metasurfaces presented recently have huge potentials in designing real-time-controlled electromagnetic devices. Here, we propose the first transmission-type 2-bit programmable coding metasurface for single-sensor and single-frequency imaging in the microwave frequency. Compared with the existing single-sensor imagers composed of active spatial modulators with their units controlled independently, we introduce randomly programmable metasurface to transform the masks of modulators, in which their rows and columns are controlled simultaneously so that the complexity and cost of the imaging system can be reduced drastically. Different from the single-sensor approach using the frequency agility, the proposed imaging system makes use of variable modulators under single frequency, which can avoid the object dispersion. In order to realize the transmission-type 2-bit programmable metasurface, we propose a two-layer binary coding unit, which is convenient for changing the voltages in rows and columns to switch the diodes in the top and bottom layers, respectively. In our imaging measurements, we generate the random codes by computer to achieve different transmission patterns, which can support enough multiple modes to solve the inverse-scattering problem in the single-sensor imaging. Simple experimental results are presented in the microwave frequency, validating our new single-sensor and single-frequency imaging system. PMID:27025907
Biometrics encryption combining palmprint with two-layer error correction codes
NASA Astrophysics Data System (ADS)
Li, Hengjian; Qiu, Jian; Dong, Jiwen; Feng, Guang
2017-07-01
To bridge the gap between the fuzziness of biometrics and the exactitude of cryptography, based on combining palmprint with two-layer error correction codes, a novel biometrics encryption method is proposed. Firstly, the randomly generated original keys are encoded by convolutional and cyclic two-layer coding. The first layer uses a convolutional code to correct burst errors. The second layer uses a cyclic code to correct random errors. Then, the palmprint features are extracted from the palmprint images. Next, they are fused together by an XOR operation. The information is stored in a smart card. Finally, to extract the original keys, the information in the smart card is XORed with the user's palmprint features and then decoded with the convolutional and cyclic two-layer code. The experimental results and security analysis show that it can recover the original keys completely. The proposed method is more secure than a single password factor, and has higher accuracy than a single biometric factor.
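A loose sketch of the key-binding idea, using a simple repetition code in place of the paper's convolutional-plus-cyclic two-layer code: the encoded key is XORed with a binary palmprint feature vector and stored; XORing the stored data with a fresh, slightly noisy feature vector and decoding recovers the key whenever the feature errors stay within the stand-in code's correction capability. All data here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(11)

def rep_encode(bits, r=5):                # stand-in ECC: r-fold repetition code
    return np.repeat(bits, r)

def rep_decode(bits, r=5):                # majority vote per r-bit group
    return (bits.reshape(-1, r).sum(axis=1) > r // 2).astype(np.uint8)

key = rng.integers(0, 2, 32, dtype=np.uint8)             # randomly generated original key
features = rng.integers(0, 2, 32 * 5, dtype=np.uint8)    # enrolled palmprint feature bits

stored = rep_encode(key) ^ features                       # data kept on the smart card

# Verification: a fresh palmprint reading differs in a few bits (sensor noise).
query = features.copy()
query[rng.choice(query.size, 6, replace=False)] ^= 1
recovered = rep_decode(stored ^ query)

print("key recovered exactly:", np.array_equal(recovered, key))
```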
Shielding from space radiations
NASA Technical Reports Server (NTRS)
Chang, C. Ken; Badavi, Forooz F.; Tripathi, Ram K.
1993-01-01
This Progress Report covering the period of December 1, 1992 to June 1, 1993 presents the development of an analytical solution to the heavy ion transport equation in terms of Green's function formalism. The mathematical development results are recast into a highly efficient computer code for space applications. The efficiency of this algorithm is accomplished by a nonperturbative technique of extending the Green's function over the solution domain. The code may also be applied to accelerator boundary conditions to allow code validation in laboratory experiments. Results from the isotopic version of the code with 59 isotopes present, for a single-layer target material, for the case of an iron beam projectile at 600 MeV/nucleon in water, are presented. A listing of the single-layer isotopic version of the code is included.
Concurrent error detecting codes for arithmetic processors
NASA Technical Reports Server (NTRS)
Lim, R. S.
1979-01-01
A method of concurrent error detection for arithmetic processors is described. Low-cost residue codes with check length l and check base m = 2^l - 1 are described for checking the arithmetic operations of addition, subtraction, multiplication, division, complement, shift, and rotate. Of the three number representations, the signed-magnitude representation is preferred for residue checking. Two methods of residue generation are described: the standard method of using modulo m adders and the method of using a self-testing residue tree. A simple single-bit parity-check code is described for checking the logical operations of XOR, OR, and AND, and also the arithmetic operations of complement, shift, and rotate. For checking complement, shift, and rotate, the single-bit parity-check code is simpler to implement than the residue codes.
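A small sketch of the low-cost residue check described above, with check length l = 3 and check base m = 2^3 - 1 = 7: the residue of a result must equal the (mod-m) combination of the operands' residues, so a mismatch flags an error in the arithmetic unit. The function names are illustrative.

```python
L = 3
M = (1 << L) - 1          # check base m = 2^l - 1 = 7

def residue(x):
    return x % M

def checked_add(a, b):
    result = a + b                                          # main arithmetic unit
    # Concurrent check: recompute the residue from the operands' residues.
    if residue(result) != (residue(a) + residue(b)) % M:
        raise RuntimeError("residue check failed: addition error detected")
    return result

def checked_mul(a, b):
    result = a * b
    if residue(result) != (residue(a) * residue(b)) % M:
        raise RuntimeError("residue check failed: multiplication error detected")
    return result

print(checked_add(1234, 5678), checked_mul(123, 456))
```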
Moving Controlled Vocabularies into the Semantic Web
NASA Astrophysics Data System (ADS)
Thomas, R.; Lowry, R. K.; Kokkinaki, A.
2015-12-01
One of the issues with legacy oceanographic data formats is that the only tool available for describing what a measurement is and how it was made is a single metadata tag known as the parameter code. The British Oceanographic Data Centre (BODC) has been helping the international oceanographic community gain maximum benefit from this through a controlled vocabulary known as the BODC Parameter Usage Vocabulary (PUV). Over time this has grown to over 34,000 entries, some of which have preferred labels with over 400 bytes of descriptive information detailing what was measured and how. A decade ago the BODC pioneered making this information available in a more useful form with the implementation of a prototype vocabulary server (NVS) that referenced each 'parameter code' as a URL. This developed into the current server (NVS V2), in which the parameter URL resolves into an RDF document based on the SKOS data model which includes a list of resource URLs mapped to the 'parameter'. For example, the parameter code for a contaminant in biota, such as 'cadmium in Mytilus edulis', carries RDF triples leading to the entry for Mytilus edulis in the WoRMS and for cadmium in the ChEBI ontologies. By providing links into these external ontologies, the information captured in a 1980s parameter code now conforms to the Linked Data paradigm of the Semantic Web, vastly increasing the descriptive information accessible to a user. This presentation will describe the next steps along the road to the Semantic Web with the development of a SPARQL end point [1] to expose the PUV plus the 190 other controlled vocabularies held in NVS. Whilst this is ideal for those fluent in SPARQL, most users require something a little more user-friendly, and so the NVS browser [2] was developed over the end point to allow less technical users to query the vocabularies and navigate the NVS ontology. This tool integrates into an editor that allows vocabulary content to be manipulated by authorised users outside BODC. Having placed Linked Data tooling over a single SPARQL end point, the obvious future development for this system is to support semantic interoperability outside NVS by the incorporation of federated SPARQL end points in the USA and Australia during the ODIP II project. [1] https://vocab.nerc.ac.uk/sparql [2] https://www.bodc.ac.uk/data/codes_and_formats/vocabulary_search/
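A hedged sketch of querying the NVS SPARQL end point mentioned above from Python, using the standard SPARQL protocol over HTTP. The specific collection URI, the use of skos:member for collection membership, and the result fields are assumptions based on the SKOS conventions described in the text and may need adjusting against the live service.

```python
import requests

ENDPOINT = "https://vocab.nerc.ac.uk/sparql"  # NVS SPARQL end point cited in the text

# Ask for a few concepts and their preferred labels from the BODC Parameter
# Usage Vocabulary (collection P01 is assumed here for illustration).
QUERY = """
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?concept ?label WHERE {
  <http://vocab.nerc.ac.uk/collection/P01/current/> skos:member ?concept .
  ?concept skos:prefLabel ?label .
} LIMIT 5
"""

resp = requests.get(ENDPOINT, params={"query": QUERY},
                    headers={"Accept": "application/sparql-results+json"}, timeout=30)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["concept"]["value"], "->", row["label"]["value"])
```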
White, Casey B.; Moyer, Cheryl A.; Stern, David T.; Katz, Steven J.
2004-01-01
Objective: E-mail use in the clinical setting has been slow to diffuse for several reasons, including providers' concerns about patients' inappropriate and inefficient use of the technology. This study examined the content of a random sample of patient–physician e-mail messages to determine the validity of those concerns. Design: A qualitative analysis of patient–physician e-mail messages was performed. Measurements: A total of 3,007 patient–physician e-mail messages were collected over 11 months as part of a randomized, controlled trial of a triage-based e-mail system in two primary care centers (including 98 physicians); 10% of messages were randomly selected for review. Messages were coded across such domains as message type, number of requests per e-mail, inclusion of sensitive content, necessity of a physician response, and message tone. Results: The majority (82.8%) of messages addressed a single issue. The most common message types included information updates to the physicians (41.4%), prescription renewals (24.2%), health questions (13.2%), questions about test results (10.9%), referrals (8.8%), “other” (including thank yous, apologies) (8.8%), appointments (5.4%), requests for non-health-related information (4.8%), and billing questions (0.3%). Overall, messages were concise, formal, and medically relevant. Very few (5.1%) included sensitive content, and none included urgent messages. Less than half (43.2%) required a physician response. Conclusion: A triage-based e-mail system promoted e-mail exchanges appropriate for primary care. Most patients adhered to guidelines aimed at focusing content, limiting the number of requests per message, and avoiding urgent requests or highly sensitive content. Thus, physicians' concerns about the content of patients' e-mails may be unwarranted. PMID:15064295
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simonen, F.A.; Khaleel, M.A.
This paper describes a statistical evaluation of the through-thickness copper variation for welds in reactor pressure vessels, and reviews the historical basis for the static and arrest fracture toughness (K_Ic and K_Ia) equations used in the VISA-II code. Copper variability in welds is due to fabrication procedures, with copper contents being randomly distributed and variable from one location to another through the thickness of the vessel. The VISA-II procedure of sampling the copper content from a statistical distribution for every 6.35- to 12.7-mm (1/4- to 1/2-in.) layer through the thickness was found to be consistent with the statistical observations. However, the parameters of the VISA-II distribution and statistical limits required further investigation. Copper contents at a few locations through the thickness were found to exceed the 0.4% upper limit of the VISA-II code. The data also suggest that the mean copper content varies systematically through the thickness. While the assumption of normality is not clearly supported by the available data, a statistical evaluation based on all the available data results in mean and standard deviations within the VISA-II code limits.
Ahmad, Muneer; Jung, Low Tan; Bhuiyan, Al-Amin
2017-10-01
Digital signal processing techniques commonly employ fixed length window filters to process the signal contents. DNA signals differ in characteristics from common digital signals since they carry nucleotides as contents. The nucleotides carry genetic code context and exhibit fuzzy behavior due to their special structure and order in the DNA strand. Employing conventional fixed length window filters for DNA signal processing produces spectral leakage and hence results in signal noise. A biological context aware adaptive window filter is required to process the DNA signals. This paper introduces a biologically inspired fuzzy adaptive window median filter (FAWMF) which computes the fuzzy membership strength of nucleotides in each slide of the window and filters nucleotides based on median filtering with a combination of s-shaped and z-shaped filters. Since coding regions cause 3-base periodicity through an unbalanced nucleotide distribution producing a relatively high bias in nucleotide usage, this fundamental characteristic of nucleotides has been exploited in FAWMF to suppress the signal noise. Along with the adaptive response of FAWMF, a strong correlation between median nucleotides and the Π-shaped filter was observed, which produced enhanced discrimination between coding and non-coding regions contrary to fixed length conventional window filters. The proposed FAWMF attains a significant enhancement in coding region identification, i.e. 40% to 125%, as compared to other conventional window filters tested over more than 250 benchmarked and randomly taken DNA datasets of different organisms. This study proves that conventional fixed length window filters applied to DNA signals do not achieve significant results since the nucleotides carry genetic code context. The proposed FAWMF algorithm is adaptive and significantly outperforms such filters in processing DNA signal contents. The algorithm applied to a variety of DNA datasets produced noteworthy discrimination between coding and non-coding regions contrary to fixed length conventional window filters. Copyright © 2017 Elsevier B.V. All rights reserved.
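For context, the conventional fixed-length window approach that FAWMF is compared against detects the 3-base periodicity of coding regions from the spectral power at period three of binary indicator sequences. A minimal sketch of that baseline (not of FAWMF itself); the window length, step, and toy sequence are arbitrary choices.

```python
import numpy as np

def period3_power(seq, window=351, step=33):
    """Classic fixed-window measure of 3-base periodicity along a DNA string."""
    scores = []
    for start in range(0, len(seq) - window + 1, step):
        win = seq[start:start + window]
        total = 0.0
        for base in "ACGT":
            indicator = np.array([1.0 if b == base else 0.0 for b in win])
            spectrum = np.fft.fft(indicator)
            total += abs(spectrum[window // 3]) ** 2   # power at period 3
        scores.append(total)
    return np.array(scores)

# Toy sequence: a random "non-coding" stretch followed by an artificial repeat
# with strong codon-position bias, standing in for a coding region.
rng = np.random.default_rng(5)
noncoding = "".join(rng.choice(list("ACGT"), 600))
coding_like = "ATG" * 200
print(period3_power(noncoding + coding_like))
```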
On the Information Content of Program Traces
NASA Technical Reports Server (NTRS)
Frumkin, Michael; Hood, Robert; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
Program traces are used for analysis of program performance, memory utilization, and communications as well as for program debugging. The trace contains records of execution events generated by monitoring units inserted into the program. The trace size limits the resolution of execution events and restricts the user's ability to analyze the program execution. We present a study of the information content of program traces and develop a coding scheme which reduces the trace size to the limit given by the trace entropy. We apply the coding to the traces of AIMS-instrumented programs executed on the IBM SP2 and the SGI Power Challenge and compare it with other coding methods. Our technique shows that the size of the trace can be reduced by more than a factor of 5.
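A small sketch of the entropy bound referred to above: the empirical entropy of the event-type stream gives the minimum average bits per trace record, which can be compared with the fixed-width encoding a naive trace format would use. The event names and counts below are invented for illustration.

```python
import math
from collections import Counter

# Hypothetical stream of trace event types emitted by instrumented code.
events = (["send"] * 500 + ["recv"] * 500 + ["loop_enter"] * 3000 +
          ["loop_exit"] * 3000 + ["io_wait"] * 50)

counts = Counter(events)
n = len(events)

# Empirical (zeroth-order) entropy: lower bound on average bits per record.
entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())

# A naive fixed-width code spends ceil(log2(#event types)) bits per record.
fixed = math.ceil(math.log2(len(counts)))
print(f"entropy bound: {entropy:.2f} bits/record vs fixed-width: {fixed} bits/record")
```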
Dong, Lu; Zhao, Xin; Ong, Stacie L; Harvey, Allison G
2017-10-01
The current study examined whether and which specific contents of patients' memory for cognitive therapy (CT) were associated with treatment adherence and outcome. Data were drawn from a pilot RCT of forty-eight depressed adults, who received either CT plus Memory Support Intervention (CT + Memory Support) or CT-as-usual. Patients' memory for treatment was measured using the Patient Recall Task and responses were coded into cognitive behavioral therapy (CBT) codes, such as CBT Model and Cognitive Restructuring, and non-CBT codes, such as individual coping strategies and no code. Treatment adherence was measured using therapist and patient ratings during treatment. Depression outcomes included treatment response, remission, and recurrence. Total number of CBT codes recalled was not significantly different comparing CT + Memory Support to CT-as-usual. Total CBT codes recalled were positively associated with adherence, while non-CBT codes recalled were negatively associated with adherence. Treatment responders (vs. non-responders) exhibited a significant increase in their recall of Cognitive Restructuring from session 7 to posttreatment. Greater recall of Cognitive Restructuring was marginally significantly associated with remission. Greater total number of CBT codes recalled (particularly CBT Model) was associated with non-recurrence of depression. Results highlight the important relationships between patients' memory for treatment and treatment adherence and outcome. Copyright © 2017 Elsevier Ltd. All rights reserved.
Kajikawa, Masato; Maruhashi, Tatsuya; Hidaka, Takayuki; Nakano, Yukiko; Kurisu, Satoshi; Matsumoto, Takeshi; Iwamoto, Yumiko; Kishimoto, Shinji; Matsui, Shogo; Aibara, Yoshiki; Yusoff, Farina Mohamad; Kihara, Yasuki; Chayama, Kazuaki; Goto, Chikara; Noma, Kensuke; Nakashima, Ayumu; Watanabe, Takuya; Tone, Hiroshi; Hibi, Masanobu; Osaki, Noriko; Katsuragi, Yoshihisa; Higashi, Yukihito
2018-01-12
The purpose of this study was to evaluate acute effects of coffee with a high content of chlorogenic acids and different hydroxyhydroquinone contents on postprandial endothelial dysfunction. This was a single-blind, randomized, placebo-controlled, crossover-within-subject clinical trial. A total of 37 patients with borderline or stage 1 hypertension were randomized to two study groups. The participants consumed a test meal with a single intake of the test coffee. Subjects in the Study 1 group were randomized to single intake of coffee with a high content of chlorogenic acids and low content of hydroxyhydroquinone or coffee with a high content of chlorogenic acids and a high content of hydroxyhydroquinone with crossover. Subjects in the Study 2 group were randomized to single intake of coffee with a high content of chlorogenic acids and low content of hydroxyhydroquinone or placebo coffee with crossover. Endothelial function assessed by flow-mediated vasodilation and plasma concentration of 8-isoprostanes were measured at baseline and at 1 and 2 h after coffee intake. Compared with baseline values, single intake of coffee with a high content of chlorogenic acids and low content of hydroxyhydroquinone, but not coffee with a high content of chlorogenic acids and high content of hydroxyhydroquinone or placebo coffee, significantly improved postprandial flow-mediated vasodilation and decreased circulating 8-isoprostane levels. These findings suggest that a single intake of coffee with a high content of chlorogenic acids and low content of hydroxyhydroquinone is effective for improving postprandial endothelial dysfunction. URL for Clinical Trial: https://upload.umin.ac.jp ; Registration Number for Clinical Trial: UMIN000013283.
ERIC Educational Resources Information Center
Foster, Catherine; McMenemy, David
2012-01-01
Thirty-six ethical codes from national professional associations were studied, the aim to test whether librarians have global shared values or if political and cultural contexts have significantly influenced the codes' content. Gorman's eight core values of stewardship, service, intellectual freedom, rationalism, literacy and learning, equity of…
Kwag, Jeehyun; Jang, Hyun Jae; Kim, Mincheol; Lee, Sujeong
2014-01-01
Rate and phase codes are believed to be important in neural information processing. Hippocampal place cells provide a good example where both coding schemes coexist during spatial information processing. Spike rate increases in the place field, whereas spike phase precesses relative to the ongoing theta oscillation. However, what intrinsic mechanism allows for a single neuron to generate spike output patterns that contain both neural codes is unknown. Using dynamic clamp, we simulate an in vivo-like subthreshold dynamics of place cells to in vitro CA1 pyramidal neurons to establish an in vitro model of spike phase precession. Using this in vitro model, we show that membrane potential oscillation (MPO) dynamics is important in the emergence of spike phase codes: blocking the slowly activating, non-inactivating K+ current (IM), which is known to control subthreshold MPO, disrupts MPO and abolishes spike phase precession. We verify the importance of adaptive IM in the generation of phase codes using both an adaptive integrate-and-fire and a Hodgkin–Huxley (HH) neuron model. Especially, using the HH model, we further show that it is the perisomatically located IM with slow activation kinetics that is crucial for the generation of phase codes. These results suggest an important functional role of IM in single neuron computation, where IM serves as an intrinsic mechanism allowing for dual rate and phase coding in single neurons. PMID:25100320
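A minimal sketch of the phase-code readout used in such analyses: given spike times and the ongoing theta oscillation, each spike is assigned the instantaneous theta phase at which it occurred. Here the theta phase is taken directly from an idealized 8 Hz rhythm, and the spike times are chosen (purely for illustration) so that each spike arrives at an earlier phase than the last, i.e., phase precession.

```python
import numpy as np

theta_freq = 8.0                                   # idealized theta rhythm (Hz)
spike_times = np.array([0.604, 0.719, 0.833, 0.948, 1.063])   # seconds, illustrative

# Instantaneous theta phase at each spike time (0 rad at each cycle start).
spike_phases = (2 * np.pi * theta_freq * spike_times) % (2 * np.pi)

# Spike phase precession appears as a systematic advance (decreasing phase)
# across successive theta cycles as the animal crosses the place field.
for t, p in zip(spike_times, spike_phases):
    print(f"spike at {t:.3f} s -> theta phase {np.degrees(p):6.1f} deg")
```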
NASA Technical Reports Server (NTRS)
Westra, Douglas G.; Lin, Jeff; West, Jeff; Tucker, Kevin
2006-01-01
This document is a viewgraph presentation of a paper that documents a continuing effort at Marshall Space Flight Center (MSFC) to use, assess, and continually improve CFD codes to the point of material utility in the design of rocket engine combustion devices. This paper describes how the code is presently being used to simulate combustion in a single element combustion chamber with shear coaxial injectors using gaseous oxygen and gaseous hydrogen propellants. The ultimate purpose of the efforts documented is to assess and further improve the Loci-CHEM code and the implementation of it. Single element shear coaxial injectors were tested as part of the Staged Combustion Injector Technology (SCIT) program, where detailed chamber wall heat fluxes were measured. Data was taken over a range of chamber pressures for propellants injected at both ambient and elevated temperatures. Several test cases are simulated as part of the effort to demonstrate use of the Loci-CHEM CFD code and to enable us to make improvements in the code as needed. The simulations presented also include a grid independence study on hybrid grids. Several two-equation eddy viscosity low Reynolds number turbulence models are also evaluated as part of the study. All calculations are presented with a comparison to the experimental data. Weaknesses of the code relative to test data are discussed and continuing efforts to improve the code are presented.
van der Mei, Sijrike F; Dijkers, Marcel P J M; Heerkens, Yvonne F
2011-12-01
To examine to what extent the concept and the domains of participation as defined in the International Classification of Functioning, Disability and Health (ICF) are represented in general cancer-specific health-related quality of life (HRQOL) instruments. Using the ICF linking rules, two coders independently extracted the meaningful concepts of ten instruments and linked these to ICF codes. The proportion of concepts that could be linked to ICF codes ranged from 68 to 95%. Although all instruments contained concepts linked to Participation (Chapters d7-d9 of the classification of 'Activities and Participation'), the instruments covered only a small part of all available ICF codes. The proportion of ICF codes in the instruments that were participation related ranged from 3 to 35%. 'Major life areas' (d8) was the most frequently used Participation Chapter, with d850 'remunerative employment' as the most used ICF code. The number of participation-related ICF codes covered in the instruments is limited. General cancer-specific HRQOL instruments only assess social life of cancer patients to a limited degree. This study's information on the content of these instruments may guide researchers in selecting the appropriate instrument for a specific research purpose.
Welding consumable selection for cryogenic (4°K) application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kane, S.F.; Siewert, T.A.
1994-12-31
Brookhaven National Laboratory (BNL) has begun construction of a large (3.8 kilometer circumference) heavy ion collider for the Department of Energy. The collider uses superconducting magnets, operating at 4°K in supercritical helium, which meets the definition of a pressure vessel. The ASME Boiler & Pressure Vessel Code grants an exemption from impact testing to certain metals, but only for operating temperatures down to 20°K. Research and the latest change to ASTM Standard E23 have invalidated Charpy testing at 4°K, thus compliance with the Code is not possible. This effort was undertaken to identify the weld process and weld material necessary to comply with the intent of the Code (impact test) requirements, that is, to design a weld joint that will assure adequate fracture toughness. We will report the results of this development and testing, and conclude that nitrogen and manganese enhanced 385L provides a superior weld metal for 4°K cryogenic applications without the exaggerated purity concerns normally associated with superaustenitic weld materials. This development has been so successful that BNL has procured 15,000 pounds of this material for magnet production. Oxygen content, manifested as inclusion density, has the single most significant effect upon fracture toughness and impact strength. Finally, we report that GMAW is a viable welding process, using off-the-shelf equipment, for 4°K cryogenic applications.
Multiplexed Detection of Cytokines Based on Dual Bar-Code Strategy and Single-Molecule Counting.
Li, Wei; Jiang, Wei; Dai, Shuang; Wang, Lei
2016-02-02
Cytokines play important roles in the immune system and have been regarded as biomarkers. Since a single cytokine is not specific and accurate enough to meet strict diagnostic requirements in practice, in this work we constructed a multiplexed detection method for cytokines based on a dual bar-code strategy and single-molecule counting. Taking interferon-γ (IFN-γ) and tumor necrosis factor-α (TNF-α) as model analytes, first, the magnetic nanobead was functionalized with the second antibody and primary bar-code strands, forming a magnetic nanoprobe. Then, through the specific reaction of the second antibody and the antigen that was fixed by the primary antibody, a sandwich-type immunocomplex was formed on the substrate. Next, the primary bar-code strands as amplification units triggered multibranched hybridization chain reaction (mHCR), producing nicked double-stranded polymers with multiple branched arms, which served as secondary bar-code strands. Finally, the secondary bar-code strands hybridized with the multimolecule labeled fluorescence probes, generating enhanced fluorescence signals. The numbers of fluorescence dots were counted one by one for quantification with an epi-fluorescence microscope. By integrating the primary and secondary bar-code-based amplification strategy and the multimolecule labeled fluorescence probes, this method displayed excellent sensitivity, with detection limits of 5 fM for both targets. Unlike the typical bar-code assay in which the bar-code strands should be released and identified on a microarray, this method is more direct. Moreover, because of the selective immune reaction and the dual bar-code mechanism, the resulting method could detect the two targets simultaneously. Multiplexed analysis in human serum was also performed, suggesting that our strategy was reliable and had a great potential application in early clinical diagnosis.
Describing the content of primary care: limitations of Canadian billing data.
Katz, Alan; Halas, Gayle; Dillon, Michael; Sloshower, Jordan
2012-02-15
Primary health care systems are designed to provide comprehensive patient care. However, the ICD-9 coding system used for billing purposes in Canada neither characterizes nor captures the scope of clinical practice or the complexity of physician-patient interactions. This study aims to describe the content of primary care clinical encounters and examine the limitations of using administrative data to capture the content of these visits. Although a number of U.S. studies have described the content of primary care encounters, this is the first Canadian study to do so. Study-specific data collection forms were completed by 16 primary care physicians in community health and family practice clinics in Winnipeg, Manitoba, Canada. The data collection forms were completed immediately following the patient encounter and included patient and visit characteristics, such as primary reason for visit, topics discussed, actions taken, degree of complexity as well as diagnosis and ICD-9 codes. Data was collected for 760 patient encounters. The diagnostic codes often did not reflect the dominant topic of the visit or the topic requiring the most amount of time. Physicians often address multiple problems and provide numerous services, thus increasing the complexity of care. This is one of the first Canadian studies to critically analyze the content of primary care clinical encounters. The data allowed a greater understanding of primary care clinical encounters and attests to the deficiencies of single ICD-9 coding, which fails to capture the comprehensiveness and complexity of the primary care encounter. As primary care reform initiatives in the U.S. and Canada attempt to transform the way family physicians deliver care, it becomes increasingly important that other tools for structuring primary care data are considered in order to help physicians, researchers and policy makers understand the breadth and complexity of primary care.
Content Coding of Psychotherapy Transcripts Using Labeled Topic Models.
Gaut, Garren; Steyvers, Mark; Imel, Zac E; Atkins, David C; Smyth, Padhraic
2017-03-01
Psychotherapy represents a broad class of medical interventions received by millions of patients each year. Unlike most medical treatments, its primary mechanisms are linguistic; i.e., the treatment relies directly on a conversation between a patient and provider. However, the evaluation of patient-provider conversation suffers from critical shortcomings, including intensive labor requirements, coder error, nonstandardized coding systems, and inability to scale up to larger data sets. To overcome these shortcomings, psychotherapy analysis needs a reliable and scalable method for summarizing the content of treatment encounters. We used a publicly available psychotherapy corpus from Alexander Street press comprising a large collection of transcripts of patient-provider conversations to compare coding performance for two machine learning methods. We used the labeled latent Dirichlet allocation (L-LDA) model to learn associations between text and codes, to predict codes in psychotherapy sessions, and to localize specific passages of within-session text representative of a session code. We compared the L-LDA model to a baseline lasso regression model using predictive accuracy and model generalizability (measured by calculating the area under the curve (AUC) from the receiver operating characteristic curve). The L-LDA model outperforms the lasso logistic regression model at predicting session-level codes with average AUC scores of 0.79 and 0.70, respectively. For fine-grained level coding, L-LDA and logistic regression are able to identify specific talk-turns representative of symptom codes. However, model performance for talk-turn identification is not yet as reliable as human coders. We conclude that the L-LDA model has the potential to be an objective, scalable method for accurate automated coding of psychotherapy sessions that performs better than comparable discriminative methods at session-level coding and can also predict fine-grained codes.
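A hedged sketch of the lasso-logistic-regression baseline described above, using scikit-learn on a toy set of session transcripts and a binary session-level code. The transcripts and labels are placeholders, not material from the Alexander Street corpus, and the feature extraction is a generic TF-IDF choice rather than the paper's exact pipeline.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Placeholder session transcripts and a binary session-level code (e.g., whether
# a depression-related topic was coded for the session).
sessions = ["i have been feeling hopeless and tired", "we talked about work stress",
            "sleep has been very poor and mood is low", "planning a family trip next month",
            "worthlessness and guilt came up again", "discussed hobbies and friendships",
            "low energy and sadness most days", "talked about a new job opportunity"]
labels = [1, 0, 1, 0, 1, 0, 1, 0]

X_train, X_test, y_train, y_test = train_test_split(sessions, labels, test_size=0.5,
                                                    stratify=labels, random_state=0)
vec = TfidfVectorizer()
clf = LogisticRegression(penalty="l1", solver="liblinear")   # lasso-style sparsity
clf.fit(vec.fit_transform(X_train), y_train)

scores = clf.predict_proba(vec.transform(X_test))[:, 1]
print("AUC on held-out sessions:", roc_auc_score(y_test, scores))
```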
Changing the Latitudes and Attitudes about Content Analysis Research
ERIC Educational Resources Information Center
Brank, Eve M.; Fox, Kathleen A.; Youstin, Tasha J.; Boeppler, Lee C.
2008-01-01
The current research employs the use of content analysis to teach research methods concepts among students enrolled in an upper division research methods course. Students coded and analyzed Jimmy Buffett song lyrics rather than using a downloadable database or collecting survey data. Students' knowledge of content analysis concepts increased after…
Error control for reliable digital data transmission and storage systems
NASA Technical Reports Server (NTRS)
Costello, D. J., Jr.; Deng, R. H.
1985-01-01
A problem in designing semiconductor memories is to provide some measure of error control without requiring excessive coding overhead or decoding time. In LSI and VLSI technology, memories are often organized on a multiple bit (or byte) per chip basis. For example, some 256K-bit DRAMs are organized as 32K x 8-bit bytes. Byte oriented codes such as Reed Solomon (RS) codes can provide efficient low overhead error control for such memories. However, the standard iterative algorithm for decoding RS codes is too slow for these applications. In this paper we present some special decoding techniques for extended single- and double-error-correcting RS codes which are capable of high speed operation. These techniques are designed to find the error locations and the error values directly from the syndrome without having to use the iterative algorithm to find the error locator polynomial. Two codes are considered: (1) a d_min = 4 single-byte-error-correcting (SBEC), double-byte-error-detecting (DBED) RS code; and (2) a d_min = 6 double-byte-error-correcting (DBEC), triple-byte-error-detecting (TBED) RS code.
Franklin, Rodney C G; Béland, Marie J; Colan, Steven D; Walters, Henry L; Aiello, Vera D; Anderson, Robert H; Bailliard, Frédérique; Boris, Jeffrey R; Cohen, Meryl S; Gaynor, J William; Guleserian, Kristine J; Houyel, Lucile; Jacobs, Marshall L; Juraszek, Amy L; Krogmann, Otto N; Kurosawa, Hiromi; Lopez, Leo; Maruszewski, Bohdan J; St Louis, James D; Seslar, Stephen P; Srivastava, Shubhika; Stellin, Giovanni; Tchervenkov, Christo I; Weinberg, Paul M; Jacobs, Jeffrey P
2017-12-01
An internationally approved and globally used classification scheme for the diagnosis of CHD has long been sought. The International Paediatric and Congenital Cardiac Code (IPCCC), which was produced and has been maintained by the International Society for Nomenclature of Paediatric and Congenital Heart Disease (the International Nomenclature Society), is used widely, but has spawned many "short list" versions that differ in content depending on the user. Thus, efforts to have a uniform identification of patients with CHD using a single up-to-date and coordinated nomenclature system continue to be thwarted, even if a common nomenclature has been used as a basis for composing various "short lists". In an attempt to solve this problem, the International Nomenclature Society has linked its efforts with those of the World Health Organization to obtain a globally accepted nomenclature tree for CHD within the 11th iteration of the International Classification of Diseases (ICD-11). The International Nomenclature Society has submitted a hierarchical nomenclature tree for CHD to the World Health Organization that is expected to serve increasingly as the "short list" for all communities interested in coding for congenital cardiology. This article reviews the history of the International Classification of Diseases and of the IPCCC, and outlines the process used in developing the ICD-11 congenital cardiac disease diagnostic list and the definitions for each term on the list. An overview of the content of the congenital heart anomaly section of the Foundation Component of ICD-11, published herein in its entirety, is also included. Future plans for the International Nomenclature Society include linking again with the World Health Organization to tackle procedural nomenclature as it relates to cardiac malformations. By doing so, the Society will continue its role in standardising nomenclature for CHD across the globe, thereby promoting research and better outcomes for fetuses, children, and adults with congenital heart anomalies.
A study of thematic content in hospital mission statements: a question of values.
Williams, Jaime; Smythe, William; Hadjistavropoulos, Thomas; Malloy, David C; Martin, Ronald
2005-01-01
We examined the content of Canadian hospital mission statements using thematic content analysis. The mission statements that we studied varied in terms of both content and length. Although there was some content related to goals designed to ensure organizational visibility, survival, and competitiveness, the domain of values predominated over our entire coding structure. The primary value-related theme that emerged concerned the importance of patient care.
Refactoring the Genetic Code for Increased Evolvability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pines, Gur; Winkler, James D.; Pines, Assaf
2017-11-14
ABSTRACT The standard genetic code is robust to mutations during transcription and translation. Point mutations are likely to be synonymous or to preserve the chemical properties of the original amino acid. Saturation mutagenesis experiments suggest that in some cases the best-performing mutant requires replacement of more than a single nucleotide within a codon. These replacements are essentially inaccessible to common error-based laboratory engineering techniques that alter a single nucleotide per mutation event, due to the extreme rarity of adjacent mutations. In this theoretical study, we suggest a radical reordering of the genetic code that maximizes the mutagenic potential of single nucleotide replacements. We explore several possible genetic codes that allow a greater degree of accessibility to the mutational landscape and may result in a hyperevolvable organism that could serve as an ideal platform for directed evolution experiments. We then conclude by evaluating the challenges of constructing such recoded organisms and their potential applications within the field of synthetic biology. IMPORTANCE The conservative nature of the genetic code prevents bioengineers from efficiently accessing the full mutational landscape of a gene via common error-prone methods. Here, we present two computational approaches to generate alternative genetic codes with increased accessibility. These new codes allow mutational transitions to a larger pool of amino acids and with a greater extent of chemical differences, based on a single nucleotide replacement within the codon, thus increasing evolvability both at the single-gene and at the genome levels. Given the widespread use of these techniques for strain and protein improvement, along with more fundamental evolutionary biology questions, the use of recoded organisms that maximize evolvability should significantly improve the efficiency of directed evolution, library generation, and fitness maximization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simunovic, Srdjan
2015-02-16
CASL's modeling and simulation technology, the Virtual Environment for Reactor Applications (VERA), incorporates coupled physics and science-based models, state-of-the-art numerical methods, modern computational science, integrated uncertainty quantification (UQ) and validation against data from operating pressurized water reactors (PWRs), single-effect experiments, and integral tests. The computational simulation component of VERA is the VERA Core Simulator (VERA-CS). The core simulator is the specific collection of multi-physics computer codes used to model and deplete an LWR core over multiple cycles. The core simulator has a single common input file that drives all of the different physics codes. The parser code, VERAIn, converts VERA input into an XML file that is used as input to the different VERA codes.
Multi-phase SPH modelling of violent hydrodynamics on GPUs
NASA Astrophysics Data System (ADS)
Mokos, Athanasios; Rogers, Benedict D.; Stansby, Peter K.; Domínguez, José M.
2015-11-01
This paper presents the acceleration of multi-phase smoothed particle hydrodynamics (SPH) using a graphics processing unit (GPU) enabling large numbers of particles (10-20 million) to be simulated on just a single GPU card. With novel hardware architectures such as a GPU, the optimum approach to implement a multi-phase scheme presents some new challenges. Many more particles must be included in the calculation and there are very different speeds of sound in each phase with the largest speed of sound determining the time step. This requires efficient computation. To take full advantage of the hardware acceleration provided by a single GPU for a multi-phase simulation, four different algorithms are investigated: conditional statements, binary operators, separate particle lists and an intermediate global function. Runtime results show that the optimum approach needs to employ separate cell and neighbour lists for each phase. The profiler shows that this approach leads to a reduction in both memory transactions and arithmetic operations giving significant runtime gains. The four different algorithms are compared to the efficiency of the optimised single-phase GPU code, DualSPHysics, for 2-D and 3-D simulations which indicate that the multi-phase functionality has a significant computational overhead. A comparison with an optimised CPU code shows a speed up of an order of magnitude over an OpenMP simulation with 8 threads and two orders of magnitude over a single thread simulation. A demonstration of the multi-phase SPH GPU code is provided by a 3-D dam break case impacting an obstacle. This shows better agreement with experimental results than an equivalent single-phase code. The multi-phase GPU code enables a convergence study to be undertaken on a single GPU with a large number of particles that otherwise would have required large high performance computing resources.
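As a rough illustration of why separate per-phase particle lists help, the Python/NumPy sketch below contrasts a branching update (testing each particle's phase inside the loop) with a separate-lists update in which the per-phase constants are hoisted out of the loop. It is a toy 1-D stand-in under assumed phase properties, not DualSPHysics code, and the acceleration term is only a placeholder.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
pos = rng.random(n)
vel = np.zeros(n)
phase = rng.integers(0, 2, n)                      # 0 = water, 1 = air (assumed labels)

sound_speed = {0: 1480.0, 1: 340.0}                # per-phase speed of sound (assumed values)
cfl, h = 0.2, 1e-3
dt = cfl * h / max(sound_speed.values())           # time step set by the fastest phase

def update_branching():
    """One loop over all particles with a phase test per particle (the conditional strategy)."""
    for i in range(n):
        c = sound_speed[phase[i]]                  # per-particle branch / lookup
        vel[i] += -c * c * pos[i] * dt             # placeholder acceleration term

# Separate-lists strategy: build one index list per phase once, then update each list
# with its constants fixed -- no per-particle branching, vectorised per phase.
phase_lists = {p: np.flatnonzero(phase == p) for p in (0, 1)}

def update_per_phase_lists():
    for p, idx in phase_lists.items():
        c = sound_speed[p]                         # constant for the whole list
        vel[idx] += -c * c * pos[idx] * dt

update_branching()
update_per_phase_lists()
```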
Zhang, Yuqin; Lin, Fanbo; Zhang, Youyu; Li, Haitao; Zeng, Yue; Tang, Hao; Yao, Shouzhuo
2011-01-01
A new method for the detection of point mutations in DNA, based on monobase-coded cadmium telluride nanoprobes and the quartz crystal microbalance (QCM) technique, is reported. A QCM sensor for point mutations in DNA (single-base mutations to adenine, thymine, cytosine, or guanine, namely A, T, C, or G, in the DNA strand) was fabricated by immobilizing single-base-mutation DNA-modified magnetic beads onto the electrode surface with an external magnetic field near the electrode. The DNA-modified magnetic beads were obtained from the biotin-avidin affinity reaction of biotinylated DNA and streptavidin-functionalized core/shell Fe(3)O(4)/Au magnetic nanoparticles, followed by a DNA hybridization reaction. Single-base coded CdTe nanoprobes (A-CdTe, T-CdTe, C-CdTe and G-CdTe, respectively) were used as the detection probes. The mutation site in DNA was distinguished by detecting the decrease in the resonance frequency of the piezoelectric quartz crystal when the coded nanoprobe was added to the test system. This proposed detection strategy for point mutations in DNA proved to be sensitive, simple, repeatable and low-cost; consequently, it has great potential for single nucleotide polymorphism (SNP) detection. 2011 © The Japan Society for Analytical Chemistry
NASA Technical Reports Server (NTRS)
Beers, B. L.; Pine, V. W.; Hwang, H. C.; Bloomberg, H. W.; Lin, D. L.; Schmidt, M. J.; Strickland, D. J.
1979-01-01
The model consists of four phases: single electron dynamics, single electron avalanche, negative streamer development, and tree formation. Numerical algorithms and computer code implementations are presented for the first three phases. An approach to developing a code description of the fourth phase is discussed. Numerical results are presented for a crude material model of Teflon.
USDA-ARS?s Scientific Manuscript database
Single-nucleotide Polymorphism (SNP) markers are by far the most common form of DNA polymorphism in a genome. The objectives of this study were to discover SNPs in common bean comparing sequences from coding and non-coding regions obtained from Genbank and genomic DNA and to compare sequencing resu...
Code of Ethics for Electrical Engineers
NASA Astrophysics Data System (ADS)
Matsuki, Junya
The Institute of Electrical Engineers of Japan (IEEJ) has established the rules of practice for its members recently, based on its code of ethics enacted in 1998. In this paper, first, the characteristics of the IEEJ 1998 ethical code are explained in detail compared to the other ethical codes for other fields of engineering. Secondly, the contents which shall be included in the modern code of ethics for electrical engineers are discussed. Thirdly, the newly-established rules of practice and the modified code of ethics are presented. Finally, results of questionnaires on the new ethical code and rules which were answered on May 23, 2007, by 51 electrical and electronic students of the University of Fukui are shown.
Application of grammar-based codes for lossless compression of digital mammograms
NASA Astrophysics Data System (ADS)
Li, Xiaoli; Krishnan, Srithar; Ma, Ngok-Wah
2006-01-01
A newly developed grammar-based lossless source coding theory and its implementation was proposed in 1999 and 2000, respectively, by Yang and Kieffer. The code first transforms the original data sequence into an irreducible context-free grammar, which is then compressed using arithmetic coding. In the study of grammar-based coding for mammography applications, we encountered two issues: processing time and limited number of single-character grammar G variables. For the first issue, we discover a feature that can simplify the matching subsequence search in the irreducible grammar transform process. Using this discovery, an extended grammar code technique is proposed and the processing time of the grammar code can be significantly reduced. For the second issue, we propose to use double-character symbols to increase the number of grammar variables. Under the condition that all the G variables have the same probability of being used, our analysis shows that the double- and single-character approaches have the same compression rates. By using the methods proposed, we show that the grammar code can outperform three other schemes: Lempel-Ziv-Welch (LZW), arithmetic, and Huffman on compression ratio, and has similar error tolerance capabilities as LZW coding under similar circumstances.
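To make the grammar-transform idea concrete, the sketch below builds a small context-free grammar by greedy digram replacement (a Re-Pair-style simplification, not the Yang-Kieffer irreducible grammar transform or the extended scheme discussed in the paper); in a complete coder the resulting start string and rules would then be entropy coded, e.g. arithmetically.

```python
from collections import Counter

def grammar_transform(seq, max_rules=26):
    """Greedy digram replacement: repeatedly replace the most frequent adjacent
    pair with a new grammar variable, yielding a start string plus rules."""
    rules = {}
    next_var = 0
    seq = list(seq)
    while next_var < max_rules:
        pairs = Counter(zip(seq, seq[1:]))
        if not pairs:
            break
        pair, count = pairs.most_common(1)[0]
        if count < 2:
            break                              # no digram repeats; grammar is irreducible here
        var = f"G{next_var}"
        next_var += 1
        rules[var] = pair
        out, i = [], 0                         # rewrite, replacing non-overlapping occurrences
        while i < len(seq):
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                out.append(var)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
    return seq, rules

start, rules = grammar_transform("abcabcabcabd")
print(start)   # compressed start-symbol string
print(rules)   # grammar variables -> digrams
```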
Wang, Nuohan; Ma, Jianjiang; Pei, Wenfeng; Wu, Man; Li, Haijing; Li, Xingli; Yu, Shuxun; Zhang, Jinfa; Yu, Jiwen
2017-03-01
Lysophosphatidic acid acyltransferase (LPAAT), encoded by a multigene family, is a rate-limiting enzyme in the Kennedy pathway in higher plants. Cotton is the most important natural fiber crop and one of the most important oilseed crops. However, little is known about the genes coding for LPAATs involved in oil biosynthesis with regard to their genome organization, diversity, expression, natural genetic variation, and association with fiber development and oil content in cotton. In this study, a comprehensive genome-wide analysis in four Gossypium species with genome sequences, i.e., the tetraploids G. hirsutum (AD1) and G. barbadense (AD2) and their possible ancestral diploids G. raimondii (D5) and G. arboreum (A2), identified 13, 10, 8, and 9 LPAAT genes, respectively, which were divided into four subfamilies. RNA-seq analyses of the LPAAT genes in the widely grown G. hirsutum suggest their differential expression at the transcriptional level in developing cottonseeds and fibers. Although 10 LPAAT genes were co-localised with quantitative trait loci (QTL) for cottonseed oil or protein content within a 25-cM region, only one single-strand conformation polymorphic (SSCP) marker, developed from a synonymous single nucleotide polymorphism (SNP) of the At-Gh13LPAAT5 gene, was significantly correlated with cottonseed oil and protein contents in one of the three field tests. Moreover, yeasts transformed with the At-Gh13LPAAT5 gene carrying either of the two SNP sequences gave similar results, i.e., a 25-31% increase in palmitic acid and oleic acid, and a 16-29% increase in total triacylglycerol (TAG). The results of this study demonstrate that the natural variation in the LPAAT genes available for improving cottonseed oil content and fiber quality is limited; therefore, traditional cross breeding using marker-assisted selection for the LPAAT genes cannot be expected to make much progress in improving cottonseed oil content or fiber quality. However, enhancing the expression of one of the LPAAT genes, such as At-Gh13LPAAT5, can significantly increase the production of total TAG and other fatty acids, providing an incentive for further studies into the use of LPAAT genes to increase cottonseed oil content through biotechnology.
Variable presence of the inverted repeat and plastome stability in Erodium
Blazier, John C.; Jansen, Robert K.; Mower, Jeffrey P.; Govindu, Madhu; Zhang, Jin; Weng, Mao-Lun; Ruhlman, Tracey A.
2016-01-01
Background and Aims Several unrelated lineages such as plastids, viruses and plasmids, have converged on quadripartite genomes of similar size with large and small single copy regions and a large inverted repeat (IR). Except for Erodium (Geraniaceae), saguaro cactus and some legumes, the plastomes of all photosynthetic angiosperms display this structure. The functional significance of the IR is not understood and Erodium provides a system to examine the role of the IR in the long-term stability of these genomes. We compared the degree of genomic rearrangement in plastomes of Erodium that differ in the presence and absence of the IR. Methods We sequenced 17 new Erodium plastomes. Using 454, Illumina, PacBio and Sanger sequences, 16 genomes were assembled and categorized along with one incomplete and two previously published Erodium plastomes. We conducted phylogenetic analyses among these species using a dataset of 19 protein-coding genes and determined if significantly higher evolutionary rates had caused the long branch seen previously in phylogenetic reconstructions within the genus. Bioinformatic comparisons were also performed to evaluate plastome evolution across the genus. Key Results Erodium plastomes fell into four types (Type 1–4) that differ in their substitution rates, short dispersed repeat content and degree of genomic rearrangement, gene and intron content and GC content. Type 4 plastomes had significantly higher rates of synonymous substitutions (dS) for all genes and for 14 of the 19 genes non-synonymous substitutions (dN) were significantly accelerated. We evaluated the evidence for a single IR loss in Erodium and in doing so discovered that Type 4 plastomes contain a novel IR. Conclusions The presence or absence of the IR does not affect plastome stability in Erodium. Rather, the overall repeat content shows a negative correlation with genome stability, a pattern in agreement with other angiosperm groups and recent findings on genome stability in bacterial endosymbionts. PMID:27192713
Variable presence of the inverted repeat and plastome stability in Erodium.
Blazier, John C; Jansen, Robert K; Mower, Jeffrey P; Govindu, Madhu; Zhang, Jin; Weng, Mao-Lun; Ruhlman, Tracey A
2016-06-01
Several unrelated lineages such as plastids, viruses and plasmids, have converged on quadripartite genomes of similar size with large and small single copy regions and a large inverted repeat (IR). Except for Erodium (Geraniaceae), saguaro cactus and some legumes, the plastomes of all photosynthetic angiosperms display this structure. The functional significance of the IR is not understood and Erodium provides a system to examine the role of the IR in the long-term stability of these genomes. We compared the degree of genomic rearrangement in plastomes of Erodium that differ in the presence and absence of the IR. We sequenced 17 new Erodium plastomes. Using 454, Illumina, PacBio and Sanger sequences, 16 genomes were assembled and categorized along with one incomplete and two previously published Erodium plastomes. We conducted phylogenetic analyses among these species using a dataset of 19 protein-coding genes and determined if significantly higher evolutionary rates had caused the long branch seen previously in phylogenetic reconstructions within the genus. Bioinformatic comparisons were also performed to evaluate plastome evolution across the genus. Erodium plastomes fell into four types (Type 1-4) that differ in their substitution rates, short dispersed repeat content and degree of genomic rearrangement, gene and intron content and GC content. Type 4 plastomes had significantly higher rates of synonymous substitutions (dS) for all genes and for 14 of the 19 genes non-synonymous substitutions (dN) were significantly accelerated. We evaluated the evidence for a single IR loss in Erodium and in doing so discovered that Type 4 plastomes contain a novel IR. The presence or absence of the IR does not affect plastome stability in Erodium. Rather, the overall repeat content shows a negative correlation with genome stability, a pattern in agreement with other angiosperm groups and recent findings on genome stability in bacterial endosymbionts. © The Author 2016. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Optimizing fusion PIC code performance at scale on Cori Phase 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koskela, T. S.; Deslippe, J.
In this paper we present the results of optimizing the performance of the gyrokinetic full-f fusion PIC code XGC1 on the Cori Phase Two Knights Landing system. The code has undergone substantial development to enable the use of vector instructions in its most expensive kernels within the NERSC Exascale Science Applications Program. We study the single-node performance of the code on an absolute scale using the roofline methodology to guide optimization efforts. We have obtained 2x speedups in single node performance due to enabling vectorization and performing memory layout optimizations. On multiple nodes, the code is shown to scale well up to 4000 nodes, near half the size of the machine. We discuss some communication bottlenecks that were identified and resolved during the work.
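A minimal sketch of the roofline reasoning used to guide such optimisation is shown below; the peak and bandwidth numbers are illustrative assumptions, not measured Cori Phase Two figures.

```python
# Roofline-model sketch with illustrative (not measured) per-node numbers.
peak_flops_vector = 3000.0   # GFLOP/s ceiling with vector instructions (assumed)
peak_flops_scalar = 400.0    # GFLOP/s ceiling for scalar-only code (assumed)
mem_bandwidth = 450.0        # GB/s of high-bandwidth memory (assumed)

def attainable_gflops(arithmetic_intensity, peak):
    """Roofline: performance is bounded by memory bandwidth at low arithmetic
    intensity (FLOP per byte moved) and by the compute peak at high intensity."""
    return min(peak, arithmetic_intensity * mem_bandwidth)

for ai in (0.5, 2.0, 10.0):
    scalar = attainable_gflops(ai, peak_flops_scalar)
    vector = attainable_gflops(ai, peak_flops_vector)
    print(f"AI={ai:5.1f} FLOP/byte  scalar bound={scalar:7.1f}  vector bound={vector:7.1f} GFLOP/s")
```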
Phillips, Yvonne F; Towsey, Michael; Roe, Paul
2018-01-01
Audio recordings of the environment are an increasingly important technique to monitor biodiversity and ecosystem function. While the acquisition of long-duration recordings is becoming easier and cheaper, the analysis and interpretation of that audio remains a significant research area. The issue addressed in this paper is the automated reduction of environmental audio data to facilitate ecological investigations. We describe a method that first reduces environmental audio to vectors of acoustic indices, which are then clustered. This can reduce the audio data by six to eight orders of magnitude yet retain useful ecological information. We describe techniques to visualise sequences of cluster occurrence (using for example, diel plots, rose plots) that assist interpretation of environmental audio. Colour coding acoustic clusters allows months and years of audio data to be visualised in a single image. These techniques are useful in identifying and indexing the contents of long-duration audio recordings. They could also play an important role in monitoring long-term changes in species abundance brought about by habitat degradation and/or restoration.
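A minimal sketch of the reduce-then-cluster idea follows: each minute of audio is reduced to a short vector of acoustic indices and the vectors are clustered, yielding one cluster label per minute that can be colour coded in diel or rose plots. The index set, sample rate and synthetic signal are assumptions for the example, not the indices or clustering used in the study.

```python
import numpy as np
from sklearn.cluster import KMeans

def acoustic_indices(segment, sr):
    """Three illustrative per-segment indices: RMS energy, spectral centroid,
    and normalised spectral entropy."""
    spectrum = np.abs(np.fft.rfft(segment)) + 1e-12
    p = spectrum / spectrum.sum()
    freqs = np.fft.rfftfreq(len(segment), d=1.0 / sr)
    rms = np.sqrt(np.mean(segment ** 2))
    centroid = np.sum(freqs * p)
    entropy = -np.sum(p * np.log2(p)) / np.log2(len(p))
    return [rms, centroid, entropy]

sr = 2000                                       # heavily downsampled stand-in sample rate
minutes = 60
frames = np.random.default_rng(0).standard_normal((minutes, sr * 60))   # fake one-hour recording

features = np.array([acoustic_indices(f, sr) for f in frames])
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(features)
print(labels)   # one cluster id per minute: the sequence visualised in diel/rose plots
```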
Bustamante, Luis; Sáez, Vania; Hinrichsen, Patricio; Castro, María H; Vergara, Carola; von Baer, Dietrich; Mardones, Claudia
2017-04-05
A novel 'Red Globe' (RG)-derived grape variety, 'Pink Globe' (PG), was described and registered as a new genotype, with earlier ripening and sweeter taste than those of RG. Microsatellite analysis revealed that PG and RG are undifferentiable; however, the PG VvmybA1c contains six single-nucleotide polymorphisms within the coding and noncoding region, possibly related to the reduced VvmybA1 expression levels. Conversely, HPLC-DAD-ESI-MS/MS analysis showed significantly lower anthocyanin content in PG skin than in RG skin, and PG had no detectable trihydroxylated anthocyanins. Total flavonols did not differ between the variants, although some quercetin derivate concentrations were lower in PG. HPLC-FLD analysis revealed slightly higher concentrations of epicatechin and a procyanidin dimer in PG seeds, although the antioxidant capacity of crude extracts from either variety did not differ significantly. These differences, particularly in monomeric anthocyanin content, can be attributed to altered activity of a MYB-type transcription factor, reducing Vvufgt expression.
Towsey, Michael; Roe, Paul
2018-01-01
Audio recordings of the environment are an increasingly important technique to monitor biodiversity and ecosystem function. While the acquisition of long-duration recordings is becoming easier and cheaper, the analysis and interpretation of that audio remains a significant research area. The issue addressed in this paper is the automated reduction of environmental audio data to facilitate ecological investigations. We describe a method that first reduces environmental audio to vectors of acoustic indices, which are then clustered. This can reduce the audio data by six to eight orders of magnitude yet retain useful ecological information. We describe techniques to visualise sequences of cluster occurrence (using for example, diel plots, rose plots) that assist interpretation of environmental audio. Colour coding acoustic clusters allows months and years of audio data to be visualised in a single image. These techniques are useful in identifying and indexing the contents of long-duration audio recordings. They could also play an important role in monitoring long-term changes in species abundance brought about by habitat degradation and/or restoration. PMID:29494629
Reliability in Cross-National Content Analysis.
ERIC Educational Resources Information Center
Peter, Jochen; Lauf, Edmund
2002-01-01
Investigates how coder characteristics such as language skills, political knowledge, coding experience, and coding certainty affected inter-coder and coder-training reliability. Shows that language skills influenced both reliability types. Suggests that cross-national researchers should pay more attention to cross-national assessments of…
Code of Federal Regulations, 2010 CFR
... 49 U.S.C. United States Code, 2009 Edition Title 49 - TRANSPORTATION SUBTITLE V - RAIL PROGRAMS PART B - ASSISTANCE CHAPTER 227 - STATE RAIL PLANS Sec. 22705 - Content §22705. Content (a) In General .—Each State rail plan shall, at a minimum, contain the following: (1) An inventory of the existing overall rail transportation system an...
Code of Federal Regulations, 2010 CFR
... 49 U.S.C. United States Code, 2011 Edition Title 49 - TRANSPORTATION SUBTITLE V - RAIL PROGRAMS PART B - ASSISTANCE CHAPTER 227 - STATE RAIL PLANS Sec. 22705 - Content §22705. Content (a) In General .—Each State rail plan shall, at a minimum, contain the following: (1) An inventory of the existing overall rail transportation system an...
Code of Federal Regulations, 2010 CFR
... 49 U.S.C. United States Code, 2014 Edition Title 49 - TRANSPORTATION SUBTITLE V - RAIL PROGRAMS PART B - ASSISTANCE CHAPTER 227 - STATE RAIL PLANS Sec. 22705 - Content §22705. Content (a) In General .—Each State rail plan shall, at a minimum, contain the following: (1) An inventory of the existing overall rail transportation system an...
Michalaki, M; Oulis, C J; Pandis, N; Eliades, G
2016-12-01
The aim of this in vitro study was to classify occlusal surfaces of permanent teeth that were questionable for caries (QCOS) according to ICDAS codes 1, 2, and 3 and to compare them, in terms of enamel mineral composition, with areas of sound tissue on the same tooth. Sixty partially impacted human molars with QCOS, extracted for therapeutic reasons, were used in the study; they were photographed via a polarised light microscope and classified according to ICDAS II (into codes 1, 2, or 3). The crowns were embedded in clear self-cured acrylic resin, longitudinally sectioned at the levels of the characterised lesions and studied by SEM/EDX to assess the enamel mineral composition of the QCOS. Univariate and multivariate random effect regressions were used for Ca (wt%), P (wt%), and Ca/P (wt%). The EDX analysis indicated changes in the Ca and P contents that were more prominent in ICDAS-II code 3 lesions compared to codes 1 and 2 lesions. In these lesions, Ca (wt%) and P (wt%) concentrations were significantly decreased (p = 0.01) in comparison with sound areas. Ca and P (wt%) contents were significantly lower (p = 0.02 and p = 0.01 respectively) for code 3 areas in comparison with codes 1 and 2 areas. Significantly higher (p = 0.01) Ca (wt%) and P (wt%) contents were found on sound areas compared to the lesion areas. The enamel of occlusal surfaces of permanent teeth with ICDAS 1, 2, and 3 lesions was found to have different Ca/P compositions, necessitating further investigation into whether these altered surfaces might behave differently on etching preparation before fissure sealant placement, compared to sound surfaces.
Louder than words: power and conflict in interprofessional education articles, 1954–2013
Paradis, Elise; Whitehead, Cynthia R
2015-01-01
Context Interprofessional education (IPE) aspires to enable collaborative practice. Current IPE offerings, although rapidly proliferating, lack evidence of efficacy and theoretical grounding. Objectives Our research aimed to explore the historical emergence of the field of IPE and to analyse the positioning of this academic field of inquiry. In particular, we sought to investigate the extent to which power and conflict – elements central to interprofessional care – figure in the IPE literature. Methods We used a combination of deductive and inductive automated coding and manual coding to explore the contents of 2191 articles in the IPE literature published between 1954 and 2013. Inductive coding focused on the presence and use of the sociological (rather than statistical) version of power, which refers to hierarchies and asymmetries among the professions. Articles found to be centrally about power were then analysed using content analysis. Results Publications on IPE have grown exponentially in the past decade. Deductive coding of identified articles showed an emphasis on students, learning, programmes and practice. Automated inductive coding of titles and abstracts identified 129 articles potentially about power, but manual coding found that only six articles put power and conflict at the centre. Content analysis of these six articles revealed that two provided tentative explorations of power dynamics, one skirted around this issue, and three explicitly theorised and integrated power and conflict. Conclusions The lack of attention to power and conflict in the IPE literature suggests that many educators do not foreground these issues. Education programmes are expected to transform individuals into effective collaborators, without heed to structural, organisational and institutional factors. In so doing, current constructions of IPE veil the problems that IPE attempts to solve. PMID:25800300
Coding Local and Global Binary Visual Features Extracted From Video Sequences.
Baroffio, Luca; Canclini, Antonio; Cesana, Matteo; Redondi, Alessandro; Tagliasacchi, Marco; Tubaro, Stefano
2015-11-01
Binary local features represent an effective alternative to real-valued descriptors, leading to comparable results for many visual analysis tasks while being characterized by significantly lower computational complexity and memory requirements. When dealing with large collections, a more compact representation based on global features is often preferred, which can be obtained from local features by means of, e.g., the bag-of-visual word model. Several applications, including, for example, visual sensor networks and mobile augmented reality, require visual features to be transmitted over a bandwidth-limited network, thus calling for coding techniques that aim at reducing the required bit budget while attaining a target level of efficiency. In this paper, we investigate a coding scheme tailored to both local and global binary features, which aims at exploiting both spatial and temporal redundancy by means of intra- and inter-frame coding. In this respect, the proposed coding scheme can conveniently be adopted to support the analyze-then-compress (ATC) paradigm. That is, visual features are extracted from the acquired content, encoded at remote nodes, and finally transmitted to a central controller that performs the visual analysis. This is in contrast with the traditional approach, in which visual content is acquired at a node, compressed and then sent to a central unit for further processing, according to the compress-then-analyze (CTA) paradigm. In this paper, we experimentally compare the ATC and the CTA by means of rate-efficiency curves in the context of two different visual analysis tasks: 1) homography estimation and 2) content-based retrieval. Our results show that the novel ATC paradigm based on the proposed coding primitives can be competitive with the CTA, especially in bandwidth limited scenarios.
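The sketch below illustrates the intra- versus inter-frame trade-off for binary local features: coding the XOR residual against the matched descriptor from the previous frame exploits temporal redundancy, so fewer bits are needed when only a few bits flip. Descriptor length, flip probability and the ideal entropy-coder cost are assumptions for the example, not the coding primitives proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 512                                        # descriptor length in bits (assumed)

def entropy_bits(bits):
    """Ideal cost, in bits, of coding a binary vector with an entropy coder
    that knows the empirical probability of a '1'."""
    p = bits.mean()
    if p in (0.0, 1.0):
        return 0.0
    return len(bits) * (-p * np.log2(p) - (1 - p) * np.log2(1 - p))

prev = rng.integers(0, 2, D)                   # matched descriptor in frame t-1
flips = rng.random(D) < 0.05                   # ~5% of bits change between frames (toy assumption)
curr = np.where(flips, 1 - prev, prev)         # descriptor in frame t

intra_cost = entropy_bits(curr)                # intra-frame: code the descriptor on its own
residual = curr ^ prev
inter_cost = entropy_bits(residual)            # inter-frame: code only the sparse XOR residual

print(f"intra-frame ~ {intra_cost:.0f} bits, inter-frame ~ {inter_cost:.0f} bits")
```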
Coding Local and Global Binary Visual Features Extracted From Video Sequences
NASA Astrophysics Data System (ADS)
Baroffio, Luca; Canclini, Antonio; Cesana, Matteo; Redondi, Alessandro; Tagliasacchi, Marco; Tubaro, Stefano
2015-11-01
Binary local features represent an effective alternative to real-valued descriptors, leading to comparable results for many visual analysis tasks, while being characterized by significantly lower computational complexity and memory requirements. When dealing with large collections, a more compact representation based on global features is often preferred, which can be obtained from local features by means of, e.g., the Bag-of-Visual-Word (BoVW) model. Several applications, including for example visual sensor networks and mobile augmented reality, require visual features to be transmitted over a bandwidth-limited network, thus calling for coding techniques that aim at reducing the required bit budget, while attaining a target level of efficiency. In this paper we investigate a coding scheme tailored to both local and global binary features, which aims at exploiting both spatial and temporal redundancy by means of intra- and inter-frame coding. In this respect, the proposed coding scheme can be conveniently adopted to support the Analyze-Then-Compress (ATC) paradigm. That is, visual features are extracted from the acquired content, encoded at remote nodes, and finally transmitted to a central controller that performs visual analysis. This is in contrast with the traditional approach, in which visual content is acquired at a node, compressed and then sent to a central unit for further processing, according to the Compress-Then-Analyze (CTA) paradigm. In this paper we experimentally compare ATC and CTA by means of rate-efficiency curves in the context of two different visual analysis tasks: homography estimation and content-based retrieval. Our results show that the novel ATC paradigm based on the proposed coding primitives can be competitive with CTA, especially in bandwidth limited scenarios.
Louder than words: power and conflict in interprofessional education articles, 1954-2013.
Paradis, Elise; Whitehead, Cynthia R
2015-04-01
Interprofessional education (IPE) aspires to enable collaborative practice. Current IPE offerings, although rapidly proliferating, lack evidence of efficacy and theoretical grounding. Our research aimed to explore the historical emergence of the field of IPE and to analyse the positioning of this academic field of inquiry. In particular, we sought to investigate the extent to which power and conflict - elements central to interprofessional care - figure in the IPE literature. We used a combination of deductive and inductive automated coding and manual coding to explore the contents of 2191 articles in the IPE literature published between 1954 and 2013. Inductive coding focused on the presence and use of the sociological (rather than statistical) version of power, which refers to hierarchies and asymmetries among the professions. Articles found to be centrally about power were then analysed using content analysis. Publications on IPE have grown exponentially in the past decade. Deductive coding of identified articles showed an emphasis on students, learning, programmes and practice. Automated inductive coding of titles and abstracts identified 129 articles potentially about power, but manual coding found that only six articles put power and conflict at the centre. Content analysis of these six articles revealed that two provided tentative explorations of power dynamics, one skirted around this issue, and three explicitly theorised and integrated power and conflict. The lack of attention to power and conflict in the IPE literature suggests that many educators do not foreground these issues. Education programmes are expected to transform individuals into effective collaborators, without heed to structural, organisational and institutional factors. In so doing, current constructions of IPE veil the problems that IPE attempts to solve. © 2015 The Authors Medical Education Published by John Wiley & Sons Ltd.
Genomics dataset of unidentified disclosed isolates.
Rekadwad, Bhagwan N
2016-09-01
Analysis of DNA sequences is necessary for higher hierarchical classification of the organisms. It gives clues about the characteristics of organisms and their taxonomic position. This dataset was chosen to explore the complexity of the unidentified DNA in the disclosed patents. A total of 17 unidentified DNA sequences were thoroughly analyzed. Quick response (QR) codes were generated, and analysis of the AT/GC content of the DNA sequences was carried out. The QR codes are helpful for quick identification of the isolates, and the AT/GC content is helpful for studying their stability at different temperatures. Additionally, a dataset of cleavage codes and enzyme codes from the restriction digestion study was reported, which is helpful for performing studies using short DNA sequences. The dataset disclosed here provides new data for the exploration of unique DNA sequences for evaluation, identification, comparison and analysis.
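A minimal sketch of the two dataset computations described above (AT/GC content and a QR code per isolate) is given below; the sequence is a hypothetical stand-in, since the actual patent sequences are not reproduced here.

```python
def at_gc_content(seq):
    """AT and GC fractions of a DNA sequence (simple illustrative calculation)."""
    seq = seq.upper()
    gc = sum(seq.count(base) for base in "GC")
    at = sum(seq.count(base) for base in "AT")
    total = gc + at
    return at / total, gc / total

# Hypothetical sequence standing in for one of the 17 undisclosed isolates.
dna = "ATGCGCGTATTAGCGCGCATCGATCGGGCTA"
at, gc = at_gc_content(dna)
print(f"AT = {at:.2%}, GC = {gc:.2%}")

# A QR code encoding the sequence (or its accession) can then be generated,
# e.g. with the third-party 'qrcode' package: qrcode.make(dna).save("isolate.png")
```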
Exact solutions for rate and synchrony in recurrent networks of coincidence detectors.
Mikula, Shawn; Niebur, Ernst
2008-11-01
We provide analytical solutions for mean firing rates and cross-correlations of coincidence detector neurons in recurrent networks with excitatory or inhibitory connectivity, with rate-modulated steady-state spiking inputs. We use discrete-time finite-state Markov chains to represent network state transition probabilities, which are subsequently used to derive exact analytical solutions for mean firing rates and cross-correlations. As illustrated in several examples, the method can be used for modeling cortical microcircuits and clarifying single-neuron and population coding mechanisms. We also demonstrate that increasing firing rates do not necessarily translate into increasing cross-correlations, though our results do support the contention that firing rates and cross-correlations are likely to be coupled. Our analytical solutions underscore the complexity of the relationship between firing rates and cross-correlations.
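The sketch below mirrors the overall recipe on a toy two-neuron example: write down a state-transition matrix, obtain the stationary distribution, and read off mean rates and the zero-lag cross-correlation. The transition probabilities are arbitrary assumed numbers, not those implied by the authors' coincidence-detector update rule.

```python
import numpy as np

# Toy two-neuron network; each network state is (x1, x2) with x = 1 meaning "spike".
states = [(0, 0), (0, 1), (1, 0), (1, 1)]

# Hypothetical one-step transition matrix P[s, s'] (rows sum to 1).
P = np.array([
    [0.70, 0.10, 0.10, 0.10],
    [0.30, 0.40, 0.10, 0.20],
    [0.30, 0.10, 0.40, 0.20],
    [0.10, 0.20, 0.20, 0.50],
])

# Stationary distribution: left eigenvector of P associated with eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmin(np.abs(evals - 1))])
pi /= pi.sum()

r1 = sum(pi[i] for i, (x1, _) in enumerate(states) if x1)   # mean rate of neuron 1
r2 = sum(pi[i] for i, (_, x2) in enumerate(states) if x2)   # mean rate of neuron 2
joint = sum(pi[i] for i, (x1, x2) in enumerate(states) if x1 and x2)
cross_corr = joint - r1 * r2                                # zero-lag cross-covariance

print(f"r1={r1:.3f}  r2={r2:.3f}  cross-correlation={cross_corr:+.3f}")
```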
Uranus' cloud structure and scattering particle properties from IRTF SpeX observations
NASA Astrophysics Data System (ADS)
Tice, D. S.; Irwin, P. G. J.; Fletcher, L. N.; Teanby, N. A.; Orton, G. S.; Davis, G. R.
2011-10-01
Observations of Uranus were made in August 2009 with the SpeX spectrograph at the NASA Infrared Telescope Facility (IRTF). Analysed spectra range from 0.8 to 1.8 μm at a spatial resolution of 0.5" and a spectral resolution of R = 1,200. Spectra from 0.818 to 0.834 μm, a region characterised by both strong hydrogen quadrupole and methane absorptions, are considered to determine methane content. Evidence indicates that methane abundance varies with latitude. NEMESIS, an optimal estimation retrieval code with full-scattering capability, is employed to analyse the full range of data. Cloud and haze properties in the upper troposphere and stratosphere are characterised, and are consistent with other current literature. New information on single scattering albedos and particle size distributions is inferred.
Graphical user interfaces for symbol-oriented database visualization and interaction
NASA Astrophysics Data System (ADS)
Brinkschulte, Uwe; Siormanolakis, Marios; Vogelsang, Holger
1997-04-01
In this approach, two basic services designed for the engineering of computer based systems are combined: a symbol-oriented man-machine-service and a high speed database-service. The man-machine service is used to build graphical user interfaces (GUIs) for the database service; these interfaces are stored using the database service. The idea is to create a GUI-builder and a GUI-manager for the database service based upon the man-machine service using the concept of symbols. With user-definable and predefined symbols, database contents can be visualized and manipulated in a very flexible and intuitive way. Using the GUI-builder and GUI-manager, users can build and operate their own graphical user interfaces for a given database according to their needs without writing a single line of code.
Hanford Facility Dangerous Waste Permit Application for T Plant Complex
DOE Office of Scientific and Technical Information (OSTI.GOV)
BARNES, B.M.
2002-09-01
The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. The scope of the Unit-Specific Portion is limited to Part B permit application documentation submitted for individual, operating treatment, storage, and/or disposal units, such as the T Plant Complex (this document, DOE/RL-95-36). Both the General Information and Unit-Specific portions of the Hanford Facility Dangerous Waste Permit Application address the content of the Part B permit application guidance prepared by the Washington State Department of Ecology (Ecology 1996) and the U.S. Environmental Protection Agency (40 Code of Federal Regulations 270), with additional information needs defined by the Hazardous and Solid Waste Amendments and revisions of Washington Administrative Code 173-303. For ease of reference, the Washington State Department of Ecology alpha-numeric section identifiers from the permit application guidance documentation (Ecology 1996) follow, in brackets, the chapter headings and subheadings. A checklist indicating where information is contained in the T Plant Complex permit application documentation, in relation to the Washington State Department of Ecology guidance, is located in the Contents Section. Documentation contained in the General Information Portion is broader in nature and could be used by multiple treatment, storage, and/or disposal units (e.g., the glossary provided in the General Information Portion). Wherever appropriate, the T Plant Complex permit application documentation makes cross-reference to the General Information Portion, rather than duplicating text.
Study on multiple-hops performance of MOOC sequences-based optical labels for OPS networks
NASA Astrophysics Data System (ADS)
Zhang, Chongfu; Qiu, Kun; Ma, Chunli
2009-11-01
In this paper, we use a new analytical method, under the assumption that the multiple optical orthogonal codes are independent, to derive the probability function of MOOCS-OPS networks, discuss the performance characteristics for a variety of parameters, and compare some characteristics of systems employing optical labels based on a single optical orthogonal code or on multiple optical orthogonal code sequences. The performance of the system is also calculated, and our results verify that the method is effective. Additionally, it is found that the performance of MOOCS-OPS networks is worse than that of the single optical orthogonal code-based optical label scheme for optical packet switching (SOOC-OPS); however, MOOCS-OPS networks can greatly enlarge the scalability of optical packet switching networks.
Zhang, Fangzheng; Ge, Xiaozhong; Gao, Bindong; Pan, Shilong
2015-08-24
A novel scheme for photonic generation of a phase-coded microwave signal is proposed and its application in one-dimension distance measurement is demonstrated. The proposed signal generator has a simple and compact structure based on a single dual-polarization modulator. Besides, the generated phase-coded signal is stable and free from the DC and low-frequency backgrounds. An experiment is carried out. A 2 Gb/s phase-coded signal at 20 GHz is successfully generated, and the recovered phase information agrees well with the input 13-bit Barker code. To further investigate the performance of the proposed signal generator, its application in one-dimension distance measurement is demonstrated. The measurement accuracy is less than 1.7 centimeters within a measurement range of ~2 meters. The experimental results can verify the feasibility of the proposed phase-coded microwave signal generator and also provide strong evidence to support its practical applications.
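A simple numerical sketch of how a 13-bit Barker phase code supports distance measurement by pulse compression is shown below; the sample rate, noise level and target distance are assumptions for the example, and the processing is idealised correlation rather than the actual experimental processing chain.

```python
import numpy as np

c = 3e8                                   # speed of light, m/s
chip_rate = 2e9                           # 2 Gb/s phase-code rate, as in the experiment
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1])   # 13-bit Barker code

fs = 20e9                                 # simulation sample rate (assumed)
samples_per_chip = int(fs / chip_rate)    # 10 samples per chip
tx = np.repeat(barker13, samples_per_chip).astype(float)

true_distance = 1.85                      # metres (one-way), toy target
delay_samples = int(round(2 * true_distance / c * fs))        # round-trip delay in samples

rx = np.concatenate([np.zeros(delay_samples), tx])            # delayed echo
rx += 0.1 * np.random.default_rng(1).standard_normal(rx.size) # additive noise

# Pulse compression: correlate the echo with the transmitted code and locate the peak.
corr = np.correlate(rx, tx, mode="full")
lag = np.argmax(corr) - (tx.size - 1)
measured = lag / fs * c / 2
print(f"measured distance ~ {measured:.3f} m (true {true_distance} m)")
```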
Production Level CFD Code Acceleration for Hybrid Many-Core Architectures
NASA Technical Reports Server (NTRS)
Duffy, Austen C.; Hammond, Dana P.; Nielsen, Eric J.
2012-01-01
In this work, a novel graphics processing unit (GPU) distributed sharing model for hybrid many-core architectures is introduced and employed in the acceleration of a production-level computational fluid dynamics (CFD) code. The latest generation graphics hardware allows multiple processor cores to simultaneously share a single GPU through concurrent kernel execution. This feature has allowed the NASA FUN3D code to be accelerated in parallel with up to four processor cores sharing a single GPU. For codes to scale and fully use resources on these and the next generation machines, codes will need to employ some type of GPU sharing model, as presented in this work. Findings include the effects of GPU sharing on overall performance. A discussion of the inherent challenges that parallel unstructured CFD codes face in accelerator-based computing environments is included, with considerations for future generation architectures. This work was completed by the author in August 2010, and reflects the analysis and results of the time.
Towards a European code of medical ethics. Ethical and legal issues.
Patuzzo, Sara; Pulice, Elisabetta
2017-01-01
The feasibility of a common European code of medical ethics is discussed, with consideration and evaluation of the difficulties such a project is going to face, from both the legal and ethical points of view. On the one hand, the analysis will underline the limits of a common European code of medical ethics as an instrument for harmonising national professional rules in the European context; on the other hand, we will highlight some of the potentials of this project, which could be increased and strengthened through a proper rulemaking process and through adequate and careful choice of content. We will also stress specific elements and devices that should be taken into consideration during the establishment of the code, from both procedural and content perspectives. Regarding methodological issues, the limits and potentialities of a common European code of medical ethics will be analysed from an ethical point of view and then from a legal perspective. The aim of this paper is to clarify the framework for the potential but controversial role of the code in the European context, showing the difficulties in enforcing and harmonising national ethical rules into a European code of medical ethics. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Wavelet-based reversible watermarking for authentication
NASA Astrophysics Data System (ADS)
Tian, Jun
2002-04-01
In the digital information age, digital content (audio, image, and video) can be easily copied, manipulated, and distributed. Copyright protection and content authentication of digital content has become an urgent problem to content owners and distributors. Digital watermarking has provided a valuable solution to this problem. Based on its application scenario, most digital watermarking methods can be divided into two categories: robust watermarking and fragile watermarking. As a special subset of fragile watermark, reversible watermark (which is also called lossless watermark, invertible watermark, erasable watermark) enables the recovery of the original, unwatermarked content after the watermarked content has been detected to be authentic. Such reversibility to get back unwatermarked content is highly desired in sensitive imagery, such as military data and medical data. In this paper we present a reversible watermarking method based on an integer wavelet transform. We look into the binary representation of each wavelet coefficient and embed an extra bit to expandable wavelet coefficient. The location map of all expanded coefficients will be coded by JBIG2 compression and these coefficient values will be losslessly compressed by arithmetic coding. Besides these two compressed bit streams, an SHA-256 hash of the original image will also be embedded for authentication purpose.
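The core of such expansion embedding can be illustrated on a single integer Haar (S-transform) coefficient pair, as in the minimal sketch below: the difference coefficient is expanded to carry one payload bit and the step is exactly invertible. The full method additionally compresses a location map of expandable coefficients (JBIG2) and the displaced bits (arithmetic coding) and embeds a hash, none of which is shown; the overflow/expandability check is also omitted.

```python
def embed_bit(a, b, bit):
    """Embed one bit into an integer pair via difference expansion (integer Haar / S-transform).
    Returns the watermarked pair; the transform is exactly invertible."""
    low = (a + b) // 2            # integer average (low-pass coefficient)
    diff = a - b                  # difference (high-pass coefficient)
    diff_marked = 2 * diff + bit  # expand the difference and append the payload bit
    a_marked = low + (diff_marked + 1) // 2
    b_marked = a_marked - diff_marked
    return a_marked, b_marked

def extract_bit(a_marked, b_marked):
    """Recover the embedded bit and restore the original pair exactly."""
    low = (a_marked + b_marked) // 2
    diff_marked = a_marked - b_marked
    bit = diff_marked & 1
    diff = diff_marked >> 1
    a = low + (diff + 1) // 2
    b = a - diff
    return bit, (a, b)

pair = (103, 100)
marked = embed_bit(*pair, 1)
bit, restored = extract_bit(*marked)
print(marked, bit, restored)   # restored == (103, 100), bit == 1
```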
Sanctions Connected to Dress Code Violations in Secondary School Handbooks
ERIC Educational Resources Information Center
Workman, Jane E.; Freeburg, Elizabeth W.; Lentz-Hees, Elizabeth S.
2004-01-01
This study identifies and evaluates sanctions for dress code violations in secondary school handbooks. Sanctions, or consequences for breaking rules, vary along seven interrelated dimensions: source, formality, retribution, obtrusiveness, magnitude, severity, and pervasiveness. A content analysis of handbooks from 155 public secondary schools…
National Geocoding Converter File 1 : Volume 1. Structure & Content.
DOT National Transportation Integrated Search
1974-01-01
This file contains a record for each county, county equivalent (as defined by the Census Bureau), SMSA county segment and SPLC county segment in the U.S. A record identifies for an area all major county codes and the associated county aggregate codes
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Coding. 106.90 Section 106.90 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION INFANT FORMULA QUALITY CONTROL PROCEDURES Quality Control Procedures for Assuring Nutrient Content...
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Coding. 106.90 Section 106.90 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION INFANT FORMULA QUALITY CONTROL PROCEDURES Quality Control Procedures for Assuring Nutrient Content...
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Coding. 106.90 Section 106.90 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION INFANT FORMULA QUALITY CONTROL PROCEDURES Quality Control Procedures for Assuring Nutrient Content...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dritz, K.W.; Boyle, J.M.
This paper addresses the problem of measuring and analyzing the performance of fine-grained parallel programs running on shared-memory multiprocessors. Such processors use locking (either directly in the application program, or indirectly in a subroutine library or the operating system) to serialize accesses to global variables. Given sufficiently high rates of locking, the chief factor preventing linear speedup (besides lack of adequate inherent parallelism in the application) is lock contention - the blocking of processes that are trying to acquire a lock currently held by another process. We show how a high-resolution, low-overhead clock may be used to measure both lock contention and lack of parallel work. Several ways of presenting the results are covered, culminating in a method for calculating, in a single multiprocessing run, both the speedup actually achieved and the speedup lost to contention for each lock and to lack of parallel work. The speedup losses are reported in the same units, ''processor-equivalents,'' as the speedup achieved. Both are obtained without having to perform the usual one-process comparison run. We chronicle also a variety of experiments motivated by actual results obtained with our measurement method. The insights into program performance that we gained from these experiments helped us to refine the parts of our programs concerned with communication and synchronization. Ultimately these improvements reduced lock contention to a negligible amount and yielded nearly linear speedup in applications not limited by lack of parallel work. We describe two generally applicable strategies (''code motion out of critical regions'' and ''critical-region fissioning'') for reducing lock contention and one (''lock/variable fusion'') applicable only on certain architectures.
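A minimal Python sketch of the measurement idea follows: each lock is wrapped so that the time threads spend blocked on it is accumulated with a high-resolution clock and then reported in processor-equivalents (blocked time divided by wall time). It is a toy illustration of the bookkeeping, not the original shared-memory multiprocessor instrumentation, and the counter-increment workload is an assumed example.

```python
import threading
import time
from collections import defaultdict

blocked = defaultdict(float)                  # seconds spent waiting, per lock name

class InstrumentedLock:
    """Wraps a lock and accumulates the time threads spend blocked on it."""
    def __init__(self, name):
        self.name = name
        self._lock = threading.Lock()
    def __enter__(self):
        t0 = time.perf_counter()              # high-resolution clock
        self._lock.acquire()
        blocked[self.name] += time.perf_counter() - t0   # updated while holding the lock
        return self
    def __exit__(self, *exc):
        self._lock.release()

counter_lock = InstrumentedLock("global_counter")
counter = 0

def worker(n):
    global counter
    for _ in range(n):
        with counter_lock:                    # serialised critical region
            counter += 1

threads = [threading.Thread(target=worker, args=(50_000,)) for _ in range(4)]
wall0 = time.perf_counter()
for t in threads:
    t.start()
for t in threads:
    t.join()
wall = time.perf_counter() - wall0

# Report contention in "processor-equivalents": blocked time divided by wall time.
for name, sec in blocked.items():
    print(f"lock {name}: {sec:.3f} s blocked ~ {sec / wall:.2f} processor-equivalents lost")
```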
Energy drinks available in Ireland: a description of caffeine and sugar content.
Keaver, Laura; Gilpin, Susannah; Fernandes da Silva, Joana Caldeira; Buckley, Claire; Foley-Nolan, Cliodhna
2017-06-01
To describe the caffeine and sugar content of all energy drinks available on the island of Ireland. Two retail outlets were selected from each of: multinational, convenience and discount stores in Northern Ireland and the Republic of Ireland, and all available single-serve energy drinks were purchased. The cross-sectional survey was conducted in February 2015 and brand name, price, volume, caffeine and sugar content were recorded for each product. Descriptive analysis was performed. Seventy-eight products were identified on the island of Ireland (regular, n 59; diet/sugar-free/light, n 19). Caffeine and sugar content was in the range of 14-35 mg and 2·9-15·6 g per 100 ml, respectively. Mean caffeine content of 102·2 mg per serving represents 25·6 % of the maximum intake advised for adults by the European Food Safety Authority. Per serving, mean sugar content of regular energy drinks was 37 g. This exceeds WHO recommendations for maximum daily sugar intake of <5 % of total energy intake (25 g for adults consuming 8368 kJ (2000 kcal) diet). If displaying front-of-pack labelling, fifty-seven of the fifty-nine regular energy drinks would receive a Food Standards Agency 'red' colour-coded label for sugar. Energy drinks are freely available on the island of Ireland and all products surveyed can be defined as highly caffeinated products. This has potential health issues particularly for children and adolescents where safe limits of caffeine have not been determined. Energy drinks surveyed also contained high levels of sugar and could potentially contribute to weight gain and adverse dental health effects.
Analysis of hybrid subcarrier multiplexing of OCDMA based on single photodiode detection
NASA Astrophysics Data System (ADS)
Ahmad, N. A. A.; Junita, M. N.; Aljunid, S. A.; Rashidi, C. B. M.; Endut, R.
2017-11-01
This paper analyzes the performance of subcarrier multiplexing (SCM) of spectral amplitude coding optical code division multiple access (SAC-OCDMA) by applying a Recursive Combinatorial (RC) code based on single photodiode detection (SPD). SPD is used in the receiver part to reduce the effect of multiple access interference (MAI), which contributes as the dominant noise in incoherent SAC-OCDMA systems. Results indicate that the SCM OCDMA network performance can be improved by using lower data rates and a higher code weight. The total number of users can also be increased by using lower data rates and a higher number of subcarriers.
Code inspection instructional validation
NASA Technical Reports Server (NTRS)
Orr, Kay; Stancil, Shirley
1992-01-01
The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine if this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive), for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. Following are conclusions of this study. (1) The course is instructionally effective. (2) The simulation has a positive effect on student's confidence in his ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option which involves creation of new source code and data files, but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes this option will produce the most effective results.
Gigaflop performance on a CRAY-2: Multitasking a computational fluid dynamics application
NASA Technical Reports Server (NTRS)
Tennille, Geoffrey M.; Overman, Andrea L.; Lambiotte, Jules J.; Streett, Craig L.
1991-01-01
The methodology is described for converting a large, long-running applications code that executed on a single processor of a CRAY-2 supercomputer to a version that executed efficiently on multiple processors. Although the conversion of every application is different, a discussion of the types of modification used to achieve gigaflop performance is included to assist others in the parallelization of applications for CRAY computers, especially those that were developed for other computers. An existing application, from the discipline of computational fluid dynamics, that had utilized over 2000 hrs of CPU time on CRAY-2 during the previous year was chosen as a test case to study the effectiveness of multitasking on a CRAY-2. The nature of dominant calculations within the application indicated that a sustained computational rate of 1 billion floating-point operations per second, or 1 gigaflop, might be achieved. The code was first analyzed and modified for optimal performance on a single processor in a batch environment. After optimal performance on a single CPU was achieved, the code was modified to use multiple processors in a dedicated environment. The results of these two efforts were merged into a single code that had a sustained computational rate of over 1 gigaflop on a CRAY-2. Timings and analysis of performance are given for both single- and multiple-processor runs.
Kim, Kyung Lock; Park, Kyeng Min; Murray, James; Kim, Kimoon; Ryu, Sung Ho
2018-05-23
Combinatorial post-translational modifications (PTMs), which can serve as dynamic "molecular barcodes", have been proposed to regulate distinct protein functions. However, studies of combinatorial PTMs on single protein molecules have been hindered by a lack of suitable analytical methods. Here, we describe erasable single-molecule blotting (eSiMBlot) for combinatorial PTM profiling. This assay is performed in a highly multiplexed manner and leverages the benefits of covalent protein immobilization, cyclic probing with different antibodies, and single molecule fluorescence imaging. Especially, facile and efficient covalent immobilization on a surface using Cu-free click chemistry permits multiple rounds (>10) of antibody erasing/reprobing without loss of antigenicity. Moreover, cumulative detection of coregistered multiple data sets for immobilized single-epitope molecules, such as HA peptide, can be used to increase the antibody detection rate. Finally, eSiMBlot enables direct visualization and quantitative profiling of combinatorial PTM codes at the single-molecule level, as we demonstrate by revealing the novel phospho-codes of ligand-induced epidermal growth factor receptor. Thus, eSiMBlot provides an unprecedentedly simple, rapid, and versatile platform for analyzing the vast number of combinatorial PTMs in biological pathways.
2018-01-01
Combinatorial post-translational modifications (PTMs), which can serve as dynamic “molecular barcodes”, have been proposed to regulate distinct protein functions. However, studies of combinatorial PTMs on single protein molecules have been hindered by a lack of suitable analytical methods. Here, we describe erasable single-molecule blotting (eSiMBlot) for combinatorial PTM profiling. This assay is performed in a highly multiplexed manner and leverages the benefits of covalent protein immobilization, cyclic probing with different antibodies, and single molecule fluorescence imaging. Especially, facile and efficient covalent immobilization on a surface using Cu-free click chemistry permits multiple rounds (>10) of antibody erasing/reprobing without loss of antigenicity. Moreover, cumulative detection of coregistered multiple data sets for immobilized single-epitope molecules, such as HA peptide, can be used to increase the antibody detection rate. Finally, eSiMBlot enables direct visualization and quantitative profiling of combinatorial PTM codes at the single-molecule level, as we demonstrate by revealing the novel phospho-codes of ligand-induced epidermal growth factor receptor. Thus, eSiMBlot provides an unprecedentedly simple, rapid, and versatile platform for analyzing the vast number of combinatorial PTMs in biological pathways.
Digitising legacy zoological taxonomic literature: Processes, products and using the output
Lyal, Christopher H. C.
2016-01-01
Abstract By digitising legacy taxonomic literature using XML mark-up the contents become accessible to other taxonomic and nomenclatural information systems. Appropriate schemas need to be interoperable with other sectorial schemas, atomise to appropriate content elements and carry appropriate metadata to, for example, enable algorithmic assessment of availability of a name under the Code. Legacy (and new) literature delivered in this fashion will become part of a global taxonomic resource from which users can extract tailored content to meet their particular needs, be they nomenclatural, taxonomic, faunistic or other. To date, most digitisation of taxonomic literature has led to a more or less simple digital copy of a paper original – the output of the many efforts has effectively been an electronic copy of a traditional library. While this has increased accessibility of publications through internet access, the means by which many scientific papers are indexed and located is much the same as with traditional libraries. OCR and born-digital papers allow use of web search engines to locate instances of taxon names and other terms, but OCR efficiency in recognising taxonomic names is still relatively poor, people’s ability to use search engines effectively is mixed, and many papers cannot be searched directly. Instead of building digital analogues of traditional publications, we should consider what properties we require of future taxonomic information access. Ideally the content of each new digital publication should be accessible in the context of all previous published data, and the user able to retrieve nomenclatural, taxonomic and other data / information in the form required without having to scan all of the original papers and extract target content manually. This opens the door to dynamic linking of new content with extant systems: automatic population and updating of taxonomic catalogues, ZooBank and faunal lists, all descriptions of a taxon and its children instantly accessible with a single search, comparison of classifications used in different publications, and so on. A means to do this is through marking up content into XML, and the more atomised the mark-up the greater the possibilities for data retrieval and integration. Mark-up requires XML that accommodates the required content elements and is interoperable with other XML schemas, and there are now several written to do this, particularly TaxPub, taxonX and taXMLit, the last of these being the most atomised. We now need to automate this process as far as possible. Manual and automatic data and information retrieval is demonstrated by projects such as INOTAXA and Plazi. As we move to creating and using taxonomic products through the power of the internet, we need to ensure the output, while satisfying in its production the requirements of the Code, is fit for purpose in the future. PMID:26877659
Atkinson, Mark J; Lohs, Jan; Kuhagen, Ilka; Kaufman, Julie; Bhaidani, Shamsu
2006-01-01
Objectives: This proof of concept (POC) study was designed to evaluate the use of an Internet-based bulletin board technology to aid parallel cross-cultural development of thematic content for a new set of patient-reported outcome measures (PROs). Methods: The POC study, conducted in Germany and the United States, utilized Internet Focus Groups (IFGs) to assure the validity of new PRO items across the two cultures – all items were designed to assess the impact of excess facial oil on individuals' lives. The on-line IFG activities were modeled after traditional face-to-face focus groups and organized by a common 'Topic' Guide designed with input from thought leaders in dermatology and health outcomes research. The two sets of IFGs were professionally moderated in the native language of each country. IFG moderators coded the thematic content of transcripts, and a frequency analysis of code endorsement was used to identify areas of content similarity and difference between the two countries. Based on this information, draft PRO items were designed and a majority (80%) of the original participants returned to rate the relative importance of the newly designed questions. Findings: The use of parallel cross-cultural content analysis of IFG transcripts permitted identification of the major content themes in each country as well as exploration of the possible reasons for any observed differences between the countries. Results from coded frequency counts and transcript reviews informed the design and wording of the test questions for the future PRO instrument(s). Subsequent ratings of item importance also deepened our understanding of potential areas of cross-cultural difference, differences that would be explored over the course of future validation studies involving these PROs. Conclusion: The use of IFGs for cross-cultural content development received positive reviews from participants and was found to be both cost and time effective. The novel thematic coding methodology provided an empirical platform on which to develop culturally sensitive questionnaire content using the natural language of participants. Overall, the IFG responses and thematic analyses provided a thorough evaluation of similarities and differences in cross-cultural themes, which in turn acted as a sound base for the development of new PRO questionnaires. PMID:16995935
Atkinson, Mark J; Lohs, Jan; Kuhagen, Ilka; Kaufman, Julie; Bhaidani, Shamsu
2006-09-22
This proof of concept (POC) study was designed to evaluate the use of an Internet-based bulletin board technology to aid parallel cross-cultural development of thematic content for a new set of patient-reported outcome measures (PROs). The POC study, conducted in Germany and the United States, utilized Internet Focus Groups (IFGs) to assure the validity of new PRO items across the two cultures--all items were designed to assess the impact of excess facial oil on individuals' lives. The on-line IFG activities were modeled after traditional face-to-face focus groups and organized by a common 'Topic' Guide designed with input from thought leaders in dermatology and health outcomes research. The two sets of IFGs were professionally moderated in the native language of each country. IFG moderators coded the thematic content of transcripts, and a frequency analysis of code endorsement was used to identify areas of content similarity and difference between the two countries. Based on this information, draft PRO items were designed and a majority (80%) of the original participants returned to rate the relative importance of the newly designed questions. The use of parallel cross-cultural content analysis of IFG transcripts permitted identification of the major content themes in each country as well as exploration of the possible reasons for any observed differences between the countries. Results from coded frequency counts and transcript reviews informed the design and wording of the test questions for the future PRO instrument(s). Subsequent ratings of item importance also deepened our understanding of potential areas of cross-cultural difference, differences that would be explored over the course of future validation studies involving these PROs. The use of IFGs for cross-cultural content development received positive reviews from participants and was found to be both cost and time effective. The novel thematic coding methodology provided an empirical platform on which to develop culturally sensitive questionnaire content using the natural language of participants. Overall, the IFG responses and thematic analyses provided a thorough evaluation of similarities and differences in cross-cultural themes, which in turn acted as a sound base for the development of new PRO questionnaires.
The Multitheoretical List of Therapeutic Interventions - 30 items (MULTI-30).
Solomonov, Nili; McCarthy, Kevin S; Gorman, Bernard S; Barber, Jacques P
2018-01-16
To develop a brief version of the Multitheoretical List of Therapeutic Interventions (MULTI-60) in order to decrease completion time burden by approximately half, while maintaining content coverage. Study 1 aimed to select 30 items. Study 2 aimed to examine the reliability and internal consistency of the MULTI-30. Study 3 aimed to validate the MULTI-30 and ensure content coverage. In Study 1, the sample included 186 therapist and 255 patient MULTI ratings, and 164 ratings of sessions coded by trained observers. Internal consistency (Cronbach's alpha and McDonald's omega) was calculated and confirmatory factor analysis was conducted. Psychotherapy experts rated content relevance. Study 2 included a sample of 644 patient and 522 therapist ratings, and 793 codings of psychotherapy sessions. In Study 3, the sample included 33 codings of sessions. A series of regression analyses was conducted to examine replication of previously published findings using the MULTI-30. The MULTI-30 was found valid, reliable, and internally consistent across the 2564 ratings examined across the three studies presented. The MULTI-30 is a brief and reliable process measure. Future studies are required for further validation.
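For readers unfamiliar with the internal-consistency statistic named above, the short sketch below computes Cronbach's alpha from a respondents-by-items rating matrix. It is a generic illustration with invented toy data, not the analysis code used in these studies.

import numpy as np

def cronbach_alpha(ratings):
    # ratings: 2-D array, rows = respondents, columns = items.
    ratings = np.asarray(ratings, dtype=float)
    n_items = ratings.shape[1]
    item_vars = ratings.var(axis=0, ddof=1)       # variance of each item
    total_var = ratings.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (n_items / (n_items - 1)) * (1 - item_vars.sum() / total_var)

# Toy example: 5 respondents rating 4 items on a 1-5 scale (invented data).
toy = np.array([[4, 5, 4, 4],
                [2, 3, 2, 3],
                [5, 5, 4, 5],
                [3, 3, 3, 2],
                [1, 2, 2, 1]])
print(round(cronbach_alpha(toy), 3))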
Kullback Leibler divergence in complete bacterial and phage genomes
Akhter, Sajia; Kashef, Mona T.; Ibrahim, Eslam S.; Bailey, Barbara
2017-01-01
The amino acid content of the proteins encoded by a genome may predict the coding potential of that genome and may reflect lifestyle restrictions of the organism. Here, we calculated the Kullback–Leibler divergence from the mean amino acid content as a metric to compare the amino acid composition for a large set of bacterial and phage genome sequences. Using these data, we demonstrate that (i) there is a significant difference between amino acid utilization in different phylogenetic groups of bacteria and phages; (ii) many of the bacteria with the most skewed amino acid utilization profiles, or the bacteria that host phages with the most skewed profiles, are endosymbionts or parasites; (iii) the skews in the distribution are not restricted to certain metabolic processes but are common across all bacterial genomic subsystems; (iv) amino acid utilization profiles strongly correlate with GC content in bacterial genomes but very weakly correlate with the G+C percent in phage genomes. These findings might be exploited to distinguish coding from non-coding sequences in large data sets, such as metagenomic sequence libraries, to help in prioritizing subsequent analyses. PMID:29204318
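As a rough illustration of the metric described above, the sketch below computes the Kullback-Leibler divergence of one proteome's amino-acid frequency vector from a background (mean) distribution. This is a generic sketch rather than the authors' pipeline; the sequences and the uniform background used here are placeholders.

import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def aa_frequencies(protein_seqs):
    # Relative amino-acid frequencies pooled over a list of protein sequences.
    counts = np.array([sum(seq.count(aa) for seq in protein_seqs) for aa in AMINO_ACIDS], dtype=float)
    return counts / counts.sum()

def kl_divergence(p, q, eps=1e-12):
    # D_KL(p || q) in bits, with a small epsilon to avoid log(0).
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log2(p / q)))

# Hypothetical usage: compare one genome's proteome with a background profile.
genome_proteins = ["MKTAYIAKQR", "MLLFVAGG"]                   # placeholder sequences
background = np.full(len(AMINO_ACIDS), 1 / len(AMINO_ACIDS))   # uniform stand-in for the mean profile
print(kl_divergence(aa_frequencies(genome_proteins), background))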
Kullback Leibler divergence in complete bacterial and phage genomes.
Akhter, Sajia; Aziz, Ramy K; Kashef, Mona T; Ibrahim, Eslam S; Bailey, Barbara; Edwards, Robert A
2017-01-01
The amino acid content of the proteins encoded by a genome may predict the coding potential of that genome and may reflect lifestyle restrictions of the organism. Here, we calculated the Kullback-Leibler divergence from the mean amino acid content as a metric to compare the amino acid composition for a large set of bacterial and phage genome sequences. Using these data, we demonstrate that (i) there is a significant difference between amino acid utilization in different phylogenetic groups of bacteria and phages; (ii) many of the bacteria with the most skewed amino acid utilization profiles, or the bacteria that host phages with the most skewed profiles, are endosymbionts or parasites; (iii) the skews in the distribution are not restricted to certain metabolic processes but are common across all bacterial genomic subsystems; (iv) amino acid utilization profiles strongly correlate with GC content in bacterial genomes but very weakly correlate with the G+C percent in phage genomes. These findings might be exploited to distinguish coding from non-coding sequences in large data sets, such as metagenomic sequence libraries, to help in prioritizing subsequent analyses.
BeiDou Geostationary Satellite Code Bias Modeling Using Fengyun-3C Onboard Measurements.
Jiang, Kecai; Li, Min; Zhao, Qile; Li, Wenwen; Guo, Xiang
2017-10-27
This study validated and investigated elevation- and frequency-dependent systematic biases observed in ground-based code measurements of the Chinese BeiDou navigation satellite system, using the onboard BeiDou code measurement data from the Chinese meteorological satellite Fengyun-3C. Particularly for geostationary earth orbit satellites, sky-view coverage can be achieved over the entire elevation and azimuth angle ranges with the available onboard tracking data, which is more favorable to modeling code biases. Apart from the BeiDou-satellite-induced biases, the onboard BeiDou code multipath effects also indicate pronounced near-field systematic biases that depend only on signal frequency and the line-of-sight directions. To correct these biases, we developed a proposed code correction model by estimating the BeiDou-satellite-induced biases as linear piece-wise functions in different satellite groups and the near-field systematic biases in a grid approach. To validate the code bias model, we carried out orbit determination using single-frequency BeiDou data with and without code bias corrections applied. Orbit precision statistics indicate that those code biases can seriously degrade single-frequency orbit determination. After the correction model was applied, the orbit position errors, 3D root mean square, were reduced from 150.6 to 56.3 cm.
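To make the idea of a piece-wise linear, elevation-dependent code-bias correction concrete, the sketch below interpolates a correction from a small table of node values and subtracts it from a raw pseudorange. The node elevations, bias values, and observation are invented for illustration; they are not the model estimated in the study.

import numpy as np

# Hypothetical correction nodes: elevation angle (degrees) -> code bias (metres).
node_elev = np.array([0.0, 15.0, 30.0, 45.0, 60.0, 90.0])
node_bias = np.array([0.0, -0.15, -0.35, -0.55, -0.70, -0.80])   # placeholder values

def code_bias_correction(elevation_deg):
    # Piece-wise linear interpolation of the code bias at a given elevation.
    return float(np.interp(elevation_deg, node_elev, node_bias))

# Apply the correction to a raw code observation (all numbers are illustrative).
raw_pseudorange = 21532104.37   # metres
elevation = 37.5                # degrees
corrected = raw_pseudorange - code_bias_correction(elevation)
print(corrected)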
BeiDou Geostationary Satellite Code Bias Modeling Using Fengyun-3C Onboard Measurements
Jiang, Kecai; Li, Min; Zhao, Qile; Li, Wenwen; Guo, Xiang
2017-01-01
This study validated and investigated elevation- and frequency-dependent systematic biases observed in ground-based code measurements of the Chinese BeiDou navigation satellite system, using the onboard BeiDou code measurement data from the Chinese meteorological satellite Fengyun-3C. Particularly for geostationary earth orbit satellites, sky-view coverage can be achieved over the entire elevation and azimuth angle ranges with the available onboard tracking data, which is more favorable to modeling code biases. Apart from the BeiDou-satellite-induced biases, the onboard BeiDou code multipath effects also indicate pronounced near-field systematic biases that depend only on signal frequency and the line-of-sight directions. To correct these biases, we developed a proposed code correction model by estimating the BeiDou-satellite-induced biases as linear piece-wise functions in different satellite groups and the near-field systematic biases in a grid approach. To validate the code bias model, we carried out orbit determination using single-frequency BeiDou data with and without code bias corrections applied. Orbit precision statistics indicate that those code biases can seriously degrade single-frequency orbit determination. After the correction model was applied, the orbit position errors, 3D root mean square, were reduced from 150.6 to 56.3 cm. PMID:29076998
MHD code using multi graphical processing units: SMAUG+
NASA Astrophysics Data System (ADS)
Gyenge, N.; Griffiths, M. K.; Erdélyi, R.
2018-01-01
This paper introduces the Sheffield Magnetohydrodynamics Algorithm Using GPUs (SMAUG+), an advanced numerical code for solving magnetohydrodynamic (MHD) problems, using multi-GPU systems. Multi-GPU systems facilitate the development of accelerated codes and enable us to investigate larger model sizes and/or more detailed computational domain resolutions. This is a significant advancement over the parent single-GPU MHD code, SMAUG (Griffiths et al., 2015). Here, we demonstrate the validity of the SMAUG+ code, describe the parallelisation techniques and investigate performance benchmarks. The initial configuration of the Orszag-Tang vortex simulations is distributed among 4, 16, 64 and 100 GPUs. Furthermore, different simulation box resolutions are applied: 1000 × 1000, 2044 × 2044, 4000 × 4000 and 8000 × 8000. We also tested the code with the Brio-Wu shock tube simulations with a model size of 800 employing up to 10 GPUs. Based on the test results, we observed speed ups and slow downs, depending on the granularity and the communication overhead of certain parallel tasks. The main aim of the code development is to provide massively parallel code without the memory limitation of a single GPU. By using our code, the applied model size could be significantly increased. We demonstrate that we are able to successfully compute numerically valid and large 2D MHD problems.
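The parallelisation strategy described above rests on splitting the computational domain among devices and exchanging boundary (halo) cells each time step. The following is a purely conceptual NumPy sketch of a 1-D domain split with periodic halo exchange between neighbouring subdomains; it is not SMAUG+ code and involves no GPUs or MHD physics.

import numpy as np

def split_with_halos(field, n_parts, halo=1):
    # Split a 1-D field into equal subdomains, each padded with halo cells.
    return [np.pad(chunk, halo) for chunk in np.split(field, n_parts)]

def exchange_halos(subdomains, halo=1):
    # Copy edge cells between neighbouring subdomains (periodic boundaries).
    n = len(subdomains)
    for i, sub in enumerate(subdomains):
        left = subdomains[(i - 1) % n]
        right = subdomains[(i + 1) % n]
        sub[:halo] = left[-2 * halo:-halo]    # receive from the left neighbour
        sub[-halo:] = right[halo:2 * halo]    # receive from the right neighbour

# Toy usage: 16 cells split across 4 "devices".
field = np.arange(16.0)
parts = split_with_halos(field, 4)
exchange_halos(parts)
print(parts[0])   # halo cells now hold the neighbours' edge values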
How to Link to Official Documents from the Government Publishing Office (GPO)
The most consistent way to present up-to-date content from the Federal Register, US Code, Code of Federal Regulations (CFR), and so on is to link to the official version of the document on the GPO's Federal Digital System (FDSys) website.
2 CFR 1.100 - Content of this title.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 2 Grants and Agreements 1 2012-01-01 2012-01-01 false Content of this title. 1.100 Section 1.100 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1.100 Content of...
2 CFR 1.100 - Content of this title.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 2 Grants and Agreements 1 2014-01-01 2014-01-01 false Content of this title. 1.100 Section 1.100 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1.100 Content of...
2 CFR 1.100 - Content of this title.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 2 Grants and Agreements 1 2013-01-01 2013-01-01 false Content of this title. 1.100 Section 1.100 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1.100 Content of...
2 CFR 1.105 - Organization and subtitle content.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 2 Grants and Agreements 1 2010-01-01 2010-01-01 false Organization and subtitle content. 1.105 Section 1.105 Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1.105 Organization and subtitle content. (a) This title is organized into...
2 CFR 1.100 - Content of this title.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 2 Grants and Agreements 1 2011-01-01 2011-01-01 false Content of this title. 1.100 Section 1.100 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1.100 Content of...
2 CFR 1.100 - Content of this title.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 2 Grants and Agreements 1 2010-01-01 2010-01-01 false Content of this title. 1.100 Section 1.100 Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1.100 Content of this title. This title contains— (a) Office of Management and...
Content Analysis of the "Professional School Counseling" Journal: The First Ten Years
ERIC Educational Resources Information Center
Falco, Lia D.; Bauman, Sheri; Sumnicht, Zachary; Engelstad, Alicia
2011-01-01
The authors conducted a content analysis of the articles published in the first 10 volumes of the "Professional School Counseling" (PSC) journal, beginning in October 1997 when "The School Counselor" merged with "Elementary School Counseling and Guidance". The analysis coded a total of 571 articles into 20 content categories. Findings address the…
An Examination of the Reliability of the Organizational Assessment Package (OAP).
1981-07-01
reactivity or pretest sensitization (Bracht and Glass, 1968) may occur. In this case, the change from pretest to posttest can be caused just by the... content items. The blocks for supervisor’s code were left blank, work group code was coded as all ones, and each person’s seminar number was coded in... [Table fragment: reliability coefficients for OAP scales, including Work Group Effectiveness (822), Job Related Satisfaction (823), and Job Related...]
Industry self-regulation of alcohol marketing: a systematic review of content and exposure research.
Noel, Jonathan K; Babor, Thomas F; Robaina, Katherine
2017-01-01
With governments relying increasingly upon the alcohol industry's self-regulated marketing codes to restrict alcohol marketing activity, there is a need to summarize the findings of research relevant to alcohol marketing controls. This paper provides a systematic review of studies investigating the content of, and exposure to, alcohol marketing in relation to self-regulated guidelines. Peer-reviewed papers were identified through four literature search engines: SCOPUS, Web of Science, PubMed and PsychINFO. Non-peer-reviewed reports produced by public health agencies, alcohol research centers, non-governmental organizations and government research centers were also identified. Ninety-six publications met the inclusion criteria. Of the 19 studies evaluating a specific marketing code and 25 content analysis studies reviewed, all detected content that could be considered potentially harmful to children and adolescents, including themes that appeal strongly to young men. Of the 57 studies of alcohol advertising exposure, high levels of youth exposure and high awareness of alcohol advertising were found for television, radio, print, digital and outdoor advertisements. Youth exposure to alcohol advertising has increased over time, even as greater compliance with exposure thresholds has been documented. Violations of the content guidelines within self-regulated alcohol marketing codes are highly prevalent in certain media. Exposure to alcohol marketing, particularly among youth, is also prevalent. Taken together, the findings suggest that the current self-regulatory systems that govern alcohol marketing practices are not meeting their intended goal of protecting vulnerable populations. © 2016 Society for the Study of Addiction.
Webber, C J
2001-05-01
This article shows analytically that single-cell learning rules that give rise to oriented and localized receptive fields, when their synaptic weights are randomly and independently initialized according to a plausible assumption of zero prior information, will generate visual codes that are invariant under two-dimensional translations, rotations, and scale magnifications, provided that the statistics of their training images are sufficiently invariant under these transformations. Such codes span different image locations, orientations, and size scales with equal economy. Thus, single-cell rules could account for the spatial scaling property of the cortical simple-cell code. This prediction is tested computationally by training with natural scenes; it is demonstrated that a single-cell learning rule can give rise to simple-cell receptive fields spanning the full range of orientations, image locations, and spatial frequencies (except at the extreme high and low frequencies at which the scale invariance of the statistics of digitally sampled images must ultimately break down, because of the image boundary and the finite pixel resolution). Thus, no constraint on completeness, or any other coupling between cells, is necessary to induce the visual code to span wide ranges of locations, orientations, and size scales. This prediction is made using the theory of spontaneous symmetry breaking, which we have previously shown can also explain the data-driven self-organization of a wide variety of transformation invariances in neurons' responses, such as the translation invariance of complex cell response.
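For readers unfamiliar with single-cell learning rules of the kind analysed above, the sketch below runs Oja's rule, a standard normalised Hebbian rule, on random input vectors. It is only a generic member of that class of rules, trained here on placeholder noise rather than the natural scenes used in the article.

import numpy as np

rng = np.random.default_rng(0)

def oja_update(w, x, lr=0.01):
    # One step of Oja's rule: Hebbian growth with implicit weight normalisation.
    y = float(w @ x)
    return w + lr * y * (x - y * w)

# Train a single unit on random 8x8 "patches" (stand-ins for whitened natural-scene patches).
patch_dim = 64
w = rng.normal(scale=0.1, size=patch_dim)   # random initialisation, i.e. zero prior information
for _ in range(5000):
    x = rng.normal(size=patch_dim)          # placeholder input vector
    w = oja_update(w, x)

print(np.linalg.norm(w))   # Oja's rule drives the weight norm towards 1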
Comparative Genomics and Phylogenomics of East Asian Tulips (Amana, Liliaceae)
Li, Pan; Lu, Rui-Sen; Xu, Wu-Qin; Ohi-Toma, Tetsuo; Cai, Min-Qi; Qiu, Ying-Xiong; Cameron, Kenneth M.; Fu, Cheng-Xin
2017-01-01
The genus Amana Honda (Liliaceae), when it is treated as separate from Tulipa, comprises six perennial herbaceous species that are restricted to China, Japan and the Korean Peninsula. Although all six Amana species have important medicinal and horticultural uses, studies focused on species identification and molecular phylogenetics are few. Here we report the nucleotide sequences of six complete Amana chloroplast (cp) genomes. The cp genomes of Amana range from 150,613 bp to 151,136 bp in length, all including a pair of inverted repeats (25,629–25,859 bp) separated by the large single-copy (81,482–82,218 bp) and small single-copy (17,366–17,465 bp) regions. Each cp genome equivalently contains 112 unique genes consisting of 30 transfer RNA genes, four ribosomal RNA genes, and 78 protein coding genes. Gene content, gene order, AT content, and IR/SC boundary structure are nearly identical among all Amana cp genomes. However, the relative contraction and expansion of the IR/SC borders among the six Amana cp genomes results in length variation among them. Simple sequence repeat (SSR) analyses of these Amana cp genomes indicate that the richest SSRs are A/T mononucleotides. The number of repeats among the six Amana species varies from 54 (A. anhuiensis) to 69 (Amana kuocangshanica) with palindromic (28–35) and forward repeats (23–30) as the most common types. Phylogenomic analyses based on these complete cp genomes and 74 common protein-coding genes strongly support the monophyly of the genus, and a sister relationship between Amana and Erythronium, rather than a shared common ancestor with Tulipa. Nine DNA markers (rps15–ycf1, accD–psaI, petA–psbJ, rpl32–trnL, atpH–atpI, petD–rpoA, trnS–trnG, psbM–trnD, and ycf4–cemA) with number of variable sites greater than 0.9% were identified, and these may be useful for future population genetic and phylogeographic studies of Amana species. PMID:28421090
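A simple way to see how mononucleotide SSRs such as the A/T runs mentioned above can be located is a regular-expression scan. The sketch below is a generic illustration rather than the software used in the study, and the 10 bp threshold and example fragment are arbitrary placeholders.

import re

def find_mononucleotide_ssrs(sequence, min_len=10):
    # Return (start, base, length) for runs of a single base at least min_len long.
    ssrs = []
    for match in re.finditer(r"A+|C+|G+|T+", sequence.upper()):
        run = match.group(0)
        if len(run) >= min_len:
            ssrs.append((match.start(), run[0], len(run)))
    return ssrs

# Invented fragment containing an A10 and a T12 run.
fragment = "GGC" + "A" * 10 + "CGTAC" + "T" * 12 + "GGA"
print(find_mononucleotide_ssrs(fragment))   # [(3, 'A', 10), (18, 'T', 12)]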
Yi, Dong-Keun; Lee, Hae-Lim; Sun, Byung-Yun; Chung, Mi Yoon; Kim, Ki-Joong
2012-05-01
This study reports the complete chloroplast (cp) DNA sequence of Eleutherococcus senticosus (GenBank: JN 637765), an endangered endemic species. The genome is 156,768 bp in length, and contains a pair of inverted repeat (IR) regions of 25,930 bp each, a large single copy (LSC) region of 86,755 bp and a small single copy (SSC) region of 18,153 bp. The structural organization, gene and intron contents, gene order, AT content, codon usage, and transcription units of the E. senticosus chloroplast genome are similar to that of typical land plant cp DNA. We aligned and analyzed the sequences of 86 coding genes, 19 introns and 113 intergenic spacers (IGS) in three different taxonomic hierarchies; Eleutherococcus vs. Panax, Eleutherococcus vs. Daucus, and Eleutherococcus vs. Nicotiana. The distribution of indels, the number of polymorphic sites and nucleotide diversity indicate that positional constraint is more important than functional constraint for the evolution of cp genome sequences in Asterids. For example, the intron sequences in the LSC region exhibited base substitution rates 5-11-times higher than that of the IR regions, while the intron sequences in the SSC region evolved 7-14-times faster than those in the IR region. Furthermore, the Ka/Ks ratio of the gene coding sequences supports a stronger evolutionary constraint in the IR region than in the LSC or SSC regions. Therefore, our data suggest that selective sweeps by base collection mechanisms more frequently eliminate polymorphisms in the IR region than in other regions. Chloroplast genome regions that have high levels of base substitutions also show higher incidences of indels. Thirty-five simple sequence repeat (SSR) loci were identified in the Eleutherococcus chloroplast genome. Of these, 27 are homopolymers, while six are di-polymers and two are tri-polymers. In addition to the SSR loci, we also identified 18 medium size repeat units ranging from 22 to 79 bp, 11 of which are distributed in the IGS or intron regions. These medium size repeats may contribute to developing a cp genome-specific gene introduction vector because the region may use for specific recombination sites.
A korarchaeal genome reveals insights into the evolution of the Archaea
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Iain J; Elkins, James G.; Podar, Mircea
2008-06-05
The candidate division Korarchaeota comprises a group of uncultivated microorganisms that, by their small subunit rRNA phylogeny, may have diverged early from the major archaeal phyla Crenarchaeota and Euryarchaeota. Here, we report the initial characterization of a member of the Korarchaeota with the proposed name, "Candidatus Korarchaeum cryptofilum," which exhibits an ultrathin filamentous morphology. To investigate possible ancestral relationships between deep-branching Korarchaeota and other phyla, we used whole-genome shotgun sequencing to construct a complete composite korarchaeal genome from enriched cells. The genome was assembled into a single contig 1.59 Mb in length with a G + C content of 49%. Of the 1,617 predicted protein-coding genes, 1,382 (85%) could be assigned to a revised set of archaeal Clusters of Orthologous Groups (COGs). The predicted gene functions suggest that the organism relies on a simple mode of peptide fermentation for carbon and energy and lacks the ability to synthesize de novo purines, CoA, and several other cofactors. Phylogenetic analyses based on conserved single genes and concatenated protein sequences positioned the korarchaeote as a deep archaeal lineage with an apparent affinity to the Crenarchaeota. However, the predicted gene content revealed that several conserved cellular systems, such as cell division, DNA replication, and tRNA maturation, resemble the counterparts in the Euryarchaeota. In light of the known composition of archaeal genomes, the Korarchaeota might have retained a set of cellular features that represents the ancestral archaeal form.
A Korarchaeal Genome Reveals Insights into the Evolution of the Archaea
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lapidus, Alla; Elkins, James G.; Podar, Mircea
2008-01-07
The candidate division Korarchaeota comprises a group of uncultivated microorganisms that, by their small subunit rRNA phylogeny, may have diverged early from the major archaeal phyla Crenarchaeota and Euryarchaeota. Here, we report the initial characterization of a member of the Korarchaeota with the proposed name, "Candidatus Korarchaeum cryptofilum," which exhibits an ultrathin filamentous morphology. To investigate possible ancestral relationships between deep-branching Korarchaeota and other phyla, we used whole-genome shotgun sequencing to construct a complete composite korarchaeal genome from enriched cells. The genome was assembled into a single contig 1.59 Mb in length with a G + C content of 49%. Of the 1,617 predicted protein-coding genes, 1,382 (85%) could be assigned to a revised set of archaeal Clusters of Orthologous Groups (COGs). The predicted gene functions suggest that the organism relies on a simple mode of peptide fermentation for carbon and energy and lacks the ability to synthesize de novo purines, CoA, and several other cofactors. Phylogenetic analyses based on conserved single genes and concatenated protein sequences positioned the korarchaeote as a deep archaeal lineage with an apparent affinity to the Crenarchaeota. However, the predicted gene content revealed that several conserved cellular systems, such as cell division, DNA replication, and tRNA maturation, resemble the counterparts in the Euryarchaeota. In light of the known composition of archaeal genomes, the Korarchaeota might have retained a set of cellular features that represents the ancestral archaeal form.
Chen, Chien P; Braunstein, Steve; Mourad, Michelle; Hsu, I-Chow J; Haas-Kogan, Daphne; Roach, Mack; Fogh, Shannon E
2015-01-01
Accurate International Classification of Diseases (ICD) diagnosis coding is critical for patient care, billing purposes, and research endeavors. In this single-institution study, we evaluated our baseline ICD-9 (9th revision) diagnosis coding accuracy, identified the most common errors contributing to inaccurate coding, and implemented a multimodality strategy to improve radiation oncology coding. We prospectively studied ICD-9 coding accuracy in our radiation therapy--specific electronic medical record system. Baseline ICD-9 coding accuracy was obtained from chart review targeting ICD-9 coding accuracy of all patients treated at our institution between March and June of 2010. To improve performance an educational session highlighted common coding errors, and a user-friendly software tool, RadOnc ICD Search, version 1.0, for coding radiation oncology specific diagnoses was implemented. We then prospectively analyzed ICD-9 coding accuracy for all patients treated from July 2010 to June 2011, with the goal of maintaining 80% or higher coding accuracy. Data on coding accuracy were analyzed and fed back monthly to individual providers. Baseline coding accuracy for physicians was 463 of 661 (70%) cases. Only 46% of physicians had coding accuracy above 80%. The most common errors involved metastatic cases, whereby primary or secondary site ICD-9 codes were either incorrect or missing, and special procedures such as stereotactic radiosurgery cases. After implementing our project, overall coding accuracy rose to 92% (range, 86%-96%). The median accuracy for all physicians was 93% (range, 77%-100%) with only 1 attending having accuracy below 80%. Incorrect primary and secondary ICD-9 codes in metastatic cases showed the most significant improvement (10% vs 2% after intervention). Identifying common coding errors and implementing both education and systems changes led to significantly improved coding accuracy. This quality assurance project highlights the potential problem of ICD-9 coding accuracy by physicians and offers an approach to effectively address this shortcoming. Copyright © 2015. Published by Elsevier Inc.
Hearing sounds, understanding actions: action representation in mirror neurons.
Kohler, Evelyne; Keysers, Christian; Umiltà, M Alessandra; Fogassi, Leonardo; Gallese, Vittorio; Rizzolatti, Giacomo
2002-08-02
Many object-related actions can be recognized by their sound. We found neurons in monkey premotor cortex that discharge when the animal performs a specific action and when it hears the related sound. Most of the neurons also discharge when the monkey observes the same action. These audiovisual mirror neurons code actions independently of whether these actions are performed, heard, or seen. This discovery in the monkey homolog of Broca's area might shed light on the origin of language: audiovisual mirror neurons code abstract contents-the meaning of actions-and have the auditory access typical of human language to these contents.
The complete mitochondrial genome of Chrysopa pallens (Insecta, Neuroptera, Chrysopidae).
He, Kun; Chen, Zhe; Yu, Dan-Na; Zhang, Jia-Yong
2012-10-01
The complete mitochondrial genome of Chrysopa pallens (Neuroptera, Chrysopidae) was sequenced. It consists of 13 protein-coding genes, 22 transfer RNA genes, 2 ribosomal RNA (rRNA) genes, and a control region (AT-rich region). The total length of C. pallens mitogenome is 16,723 bp with 79.5% AT content, and the length of control region is 1905 bp with 89.1% AT content. The non-coding regions of C. pallens include control region between 12S rRNA and trnI genes, and a 75-bp space region between trnI and trnQ genes.
Biederman, Michelle K; Nelson, Megan M; Asalone, Kathryn C; Pedersen, Alyssa L; Saldanha, Colin J; Bracht, John R
2018-05-21
Developmentally programmed genome rearrangements are rare in vertebrates, but have been reported in scattered lineages including the bandicoot, hagfish, lamprey, and zebra finch (Taeniopygia guttata) [1]. In the finch, a well-studied animal model for neuroendocrinology and vocal learning [2], one such programmed genome rearrangement involves a germline-restricted chromosome, or GRC, which is found in germlines of both sexes but eliminated from mature sperm [3, 4]. Transmitted only through the oocyte, it displays uniparental female-driven inheritance, and early in embryonic development is apparently eliminated from all somatic tissue in both sexes [3, 4]. The GRC comprises the longest finch chromosome at over 120 million base pairs [3], and previously the only known GRC-derived sequence was repetitive and non-coding [5]. Because the zebra finch genome project was sourced from male muscle (somatic) tissue [6], the remaining genomic sequence and protein-coding content of the GRC remain unknown. Here we report the first protein-coding gene from the GRC: a member of the α-soluble N-ethylmaleimide sensitive fusion protein (NSF) attachment protein (α-SNAP) family hitherto missing from zebra finch gene annotations. In addition to the GRC-encoded α-SNAP, we find an additional paralogous α-SNAP residing in the somatic genome (a somatolog)-making the zebra finch the first example in which α-SNAP is not a single-copy gene. We show divergent, sex-biased expression for the paralogs and also that positive selection is detectable across the bird α-SNAP lineage, including the GRC-encoded α-SNAP. This study presents the identification and evolutionary characterization of the first protein-coding GRC gene in any organism. Copyright © 2018 Elsevier Ltd. All rights reserved.
A Brief Report on the Ethical and Legal Guides For Technology Use in Marriage and Family Therapy.
Pennington, Michael; Patton, Rikki; Ray, Amber; Katafiasz, Heather
2017-10-01
Marriage and family therapists (MFTs) use ethical codes and state licensure laws/rules as guidelines for best clinical practice. It is important that professional codes reflect the potential exponential use of technology in therapy. However, current standards regarding technology use lack clarity. To explore this gap, a summative content analysis was conducted on state licensure laws/rules and professional ethical codes to find themes and subthemes among the many aspects of therapy in which technology can be utilized. Findings from the content analysis indicated that while there have been efforts by both state and professional organizations to incorporate guidance for technology use in therapy, a clear and comprehensive "roadmap" is still missing. Future scholarship is needed that develops clearer guidelines for therapists. © 2017 American Association for Marriage and Family Therapy.
Comparison of SPHC Hydrocode Results with Penetration Equations and Results of Other Codes
NASA Technical Reports Server (NTRS)
Evans, Steven W.; Stallworth, Roderick; Stellingwerf, Robert F.
2004-01-01
The SPHC hydrodynamic code was used to simulate impacts of spherical aluminum projectiles on a single-wall aluminum plate and on a generic Whipple shield. Simulations were carried out in two and three dimensions. Projectile speeds ranged from 2 kilometers per second to 10 kilometers per second for the single-wall runs, and from 3 kilometers per second to 40 kilometers per second for the Whipple shield runs. Spallation limit results of the single-wall simulations are compared with predictions from five standard penetration equations, and are shown to fall comfortably within the envelope of these analytical relations. Ballistic limit results of the Whipple shield simulations are compared with results from the AUTODYN-2D and PAM-SHOCK-3D codes presented in a paper at the Hypervelocity Impact Symposium 2000 and the Christiansen formulation of 2003.
Spoken Narrative Assessment: A Supplementary Measure of Children's Creativity
ERIC Educational Resources Information Center
Wong, Miranda Kit-Yi; So, Wing Chee
2016-01-01
This study developed a spoken narrative (i.e., storytelling) assessment as a supplementary measure of children's creativity. Both spoken and gestural contents of children's spoken narratives were coded to assess their verbal and nonverbal creativity. The psychometric properties of the coding system for the spoken narrative assessment were…
2001-01-01
from airports and hotels, Internet cafes, libraries, and even cellular phones. This unmatched versatility has made e-mail the preferred method of... contention, however, was not that the Franchise Tax Board applied the wrong section of the code; it was that the code “unfairly taxed the wife’s
Ethical Challenges in the Teaching of Multicultural Course Work
ERIC Educational Resources Information Center
Fier, Elizabeth Boyer; Ramsey, MaryLou
2005-01-01
The authors explore the ethical issues and challenges frequently encountered by counselor educators of multicultural course work. Existing ethics codes are examined, and the need for greater specificity with regard to teaching courses of multicultural content is addressed. Options for revising existing codes to better address the challenges of…
Detecting Runtime Anomalies in AJAX Applications through Trace Analysis
2011-08-10
statements by adding the instrumentation to the GWT UI classes, leaving the user code untouched. Some content management frameworks such as Drupal [12]... “Google web toolkit.” http://code.google.com/webtoolkit/. [12] “Form generation – drupal api.” http://api.drupal.org/api/group/form_api/6.
Standards for Evaluation of Instructional Materials with Respect to Social Content. 1986 Edition.
ERIC Educational Resources Information Center
California State Dept. of Education, Sacramento. Curriculum Framework and Textbook Development Unit.
The California Legislature recognized the significant place of instructional materials in the formation of a child's attitudes and beliefs when it adopted "Education Code" sections 60040 through 60044. The "Education Code" sections referred to in this document are intended to help dispel negative stereotypes by emphasizing…
Turbomachinery Heat Transfer and Loss Modeling for 3D Navier-Stokes Codes
NASA Technical Reports Server (NTRS)
DeWitt, Kenneth; Ameri, Ali
2005-01-01
This report's contents focus on making use of NASA Glenn on-site computational facilities, to develop, validate, and apply models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes to enhance the capability to compute heat transfer and losses in turbomachinery.
A Coding Scheme to Analyse the Online Asynchronous Discussion Forums of University Students
ERIC Educational Resources Information Center
Biasutti, Michele
2017-01-01
The current study describes the development of a content analysis coding scheme to examine transcripts of online asynchronous discussion groups in higher education. The theoretical framework comprises the theories regarding knowledge construction in computer-supported collaborative learning (CSCL) based on a sociocultural perspective. The coding…
Social media use by community-based organizations conducting health promotion: a content analysis.
Ramanadhan, Shoba; Mendez, Samuel R; Rao, Megan; Viswanath, Kasisomayajula
2013-12-05
Community-based organizations (CBOs) are critical channels for the delivery of health promotion programs. Much of their influence comes from the relationships they have with community members and other key stakeholders and they may be able to harness the power of social media tools to develop and maintain these relationships. There are limited data describing if and how CBOs are using social media. This study assesses the extent to which CBOs engaged in health promotion use popular social media channels, the types of content typically shared, and the extent to which the interactive aspects of social media tools are utilized. We assessed the social media presence and patterns of usage of CBOs engaged in health promotion in Boston, Lawrence, and Worcester, Massachusetts. We coded content on three popular channels: Facebook, Twitter, and YouTube. We used content analysis techniques to quantitatively summarize posts, tweets, and videos on these channels, respectively. For each organization, we coded all content put forth by the CBO on the three channels in a 30-day window. Two coders were trained and conducted the coding. Data were collected between November 2011 and January 2012. A total of 166 organizations were included in our census. We found that 42% of organizations used at least one of the channels of interest. Across the three channels, organization promotion was the most common theme for content (66% of posts, 63% of tweets, and 93% of videos included this content). Most organizations updated Facebook and Twitter content at rates close to recommended frequencies. We found limited interaction/engagement with audience members. Much of the use of social media tools appeared to be uni-directional, a flow of information from the organization to the audience. By better leveraging opportunities for interaction and user engagement, these organizations can reap greater benefits from the non-trivial investment required to use social media well. Future research should assess links between use patterns and organizational characteristics, staff perspectives, and audience engagement.
Mistranslation: from adaptations to applications.
Hoffman, Kyle S; O'Donoghue, Patrick; Brandl, Christopher J
2017-11-01
The conservation of the genetic code indicates that there was a single origin, but like all genetic material, the cell's interpretation of the code is subject to evolutionary pressure. Single nucleotide variations in tRNA sequences can modulate codon assignments by altering codon-anticodon pairing or tRNA charging. Either can increase translation errors and even change the code. The frozen accident hypothesis argued that changes to the code would destabilize the proteome and reduce fitness. In studies of model organisms, mistranslation often acts as an adaptive response. These studies reveal evolutionary conserved mechanisms to maintain proteostasis even during high rates of mistranslation. This review discusses the evolutionary basis of altered genetic codes, how mistranslation is identified, and how deviations to the genetic code are exploited. We revisit early discoveries of genetic code deviations and provide examples of adaptive mistranslation events in nature. Lastly, we highlight innovations in synthetic biology to expand the genetic code. The genetic code is still evolving. Mistranslation increases proteomic diversity that enables cells to survive stress conditions or suppress a deleterious allele. Genetic code variants have been identified by genome and metagenome sequence analyses, suppressor genetics, and biochemical characterization. Understanding the mechanisms of translation and genetic code deviations enables the design of new codes to produce novel proteins. Engineering the translation machinery and expanding the genetic code to incorporate non-canonical amino acids are valuable tools in synthetic biology that are impacting biomedical research. This article is part of a Special Issue entitled "Biochemistry of Synthetic Biology - Recent Developments" Guest Editor: Dr. Ilka Heinemann and Dr. Patrick O'Donoghue. Copyright © 2017 Elsevier B.V. All rights reserved.
LDPC coded OFDM over the atmospheric turbulence channel.
Djordjevic, Ivan B; Vasic, Bane; Neifeld, Mark A
2007-05-14
Low-density parity-check (LDPC) coded optical orthogonal frequency division multiplexing (OFDM) is shown to significantly outperform LDPC coded on-off keying (OOK) over the atmospheric turbulence channel in terms of both coding gain and spectral efficiency. In the regime of strong turbulence at a bit-error rate of 10(-5), the coding gain improvement of the LDPC coded single-side band unclipped-OFDM system with 64 sub-carriers is larger than the coding gain of the LDPC coded OOK system by 20.2 dB for quadrature-phase-shift keying (QPSK) and by 23.4 dB for binary-phase-shift keying (BPSK).
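As a minimal illustration of the OFDM side of such a system, the sketch below maps random bits to QPSK symbols on 64 sub-carriers and forms one OFDM symbol with an inverse FFT plus cyclic prefix. It is a textbook toy under an ideal, noiseless channel, not the LDPC-coded, single-side-band system evaluated in the paper; the cyclic-prefix length is an arbitrary choice.

import numpy as np

rng = np.random.default_rng(1)
n_subcarriers = 64
cp_len = 16   # cyclic prefix length (placeholder choice)

# Map pairs of random bits to unit-energy QPSK symbols.
bits = rng.integers(0, 2, size=2 * n_subcarriers)
symbols = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

# One OFDM symbol: IFFT across sub-carriers, then prepend the cyclic prefix.
time_domain = np.fft.ifft(symbols)
ofdm_symbol = np.concatenate([time_domain[-cp_len:], time_domain])

# Receiver side over an ideal channel: strip the prefix and FFT back.
recovered = np.fft.fft(ofdm_symbol[cp_len:])
print(np.allclose(recovered, symbols))   # True in this noiseless sketch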
New double-byte error-correcting codes for memory systems
NASA Technical Reports Server (NTRS)
Feng, Gui-Liang; Wu, Xinen; Rao, T. R. N.
1996-01-01
Error-correcting or error-detecting codes have been used in the computer industry to increase reliability, reduce service costs, and maintain data integrity. The single-byte error-correcting and double-byte error-detecting (SbEC-DbED) codes have been successfully used in computer memory subsystems. There are many methods to construct double-byte error-correcting (DBEC) codes. In the present paper we construct a class of double-byte error-correcting codes, which are more efficient than those known to be optimum, and a decoding procedure for our codes is also considered.
Genomics dataset on unclassified published organism (patent US 7547531).
Khan Shawan, Mohammad Mahfuz Ali; Hasan, Md Ashraful; Hossain, Md Mozammel; Hasan, Md Mahmudul; Parvin, Afroza; Akter, Salina; Uddin, Kazi Rasel; Banik, Subrata; Morshed, Mahbubul; Rahman, Md Nazibur; Rahman, S M Badier
2016-12-01
Nucleotide (DNA) sequence analysis provides important clues regarding the characteristics and taxonomic position of an organism. Accordingly, DNA sequence analysis is crucial for learning about the hierarchical classification of a particular organism. This dataset (patent US 7547531) was chosen to simplify the complex raw data buried in undisclosed DNA sequences, which helps to open doors for new collaborations. In this data, a total of 48 unidentified DNA sequences from patent US 7547531 were selected and their complete sequences were retrieved from the NCBI BioSample database. A quick response (QR) code for those DNA sequences was constructed with the DNA BarID tool. The QR code is useful for the identification and comparison of isolates with other organisms. The AT/GC content of the DNA sequences was determined using the ENDMEMO GC Content Calculator, which indicates their stability at different temperatures. The highest GC content was observed in GP445188 (62.5%), followed by GP445198 (61.8%) and GP445189 (59.44%), while the lowest was in GP445178 (24.39%). In addition, the New England BioLabs (NEB) database was used to identify the cleavage code indicating the 5′, 3′, or blunt ends, and the enzyme code indicating the methylation site of the DNA sequences was also shown. These data will be helpful for the construction of the organisms' hierarchical classification, determination of their phylogenetic and taxonomic position, and revelation of their molecular characteristics.
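The AT/GC content figures quoted above can be reproduced for any sequence with a few lines of code. The following is a minimal sketch (not the ENDMEMO calculator itself), and the example sequence is invented rather than taken from the patent.

def gc_content(sequence):
    # Percent G+C in a DNA sequence.
    seq = sequence.upper()
    gc = seq.count("G") + seq.count("C")
    return 100.0 * gc / len(seq)

# Invented example; real input would be one of the 48 patent sequences.
example = "ATGCGCGATCGATTACG"
print(f"GC content: {gc_content(example):.2f}%")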
NASA Technical Reports Server (NTRS)
Lin, Shu; Rhee, Dojun
1996-01-01
This paper is concerned with construction of multilevel concatenated block modulation codes using a multi-level concatenation scheme for the frequency non-selective Rayleigh fading channel. In the construction of multilevel concatenated modulation code, block modulation codes are used as the inner codes. Various types of codes (block or convolutional, binary or nonbinary) are being considered as the outer codes. In particular, we focus on the special case for which Reed-Solomon (RS) codes are used as the outer codes. For this special case, a systematic algebraic technique for constructing q-level concatenated block modulation codes is proposed. Codes have been constructed for certain specific values of q and compared with the single-level concatenated block modulation codes using the same inner codes. A multilevel closest coset decoding scheme for these codes is proposed.
Abrahams, Sheryl W
2012-08-01
The advent of social networking sites and other online communities presents new opportunities and challenges for the promotion, protection, and support of breastfeeding. This study examines the presence of infant formula marketing on popular US social media sites, using the World Health Organization International Code of Marketing of Breast-milk Substitutes (the Code) as a framework. We examined to what extent each of 11 infant formula brands that are widely available in the US had established a social media presence in popular social media venues likely to be visited by expectant parents and families with young children. We then examined current marketing practices, using the Code as a basis for ethical marketing. Infant formula manufacturers have established a social media presence primarily through Facebook pages, interactive features on their own Web sites, mobile apps for new and expecting parents, YouTube videos, sponsored reviews on parenting blogs, and other financial relationships with parenting blogs. Violations of the Code as well as promotional practices unforeseen by the Code were identified. These practices included enabling user-generated content that promotes the use of infant formula, financial relationships between manufacturers and bloggers, and creation of mobile apps for use by parents. An additional concern identified for Code enforcement is lack of transparency in social media-based marketing. The use of social media for formula marketing may demand new strategies for monitoring and enforcing the Code in light of emerging challenges, including suggested content for upcoming consideration for World Health Assembly resolutions.
Single-shot secure quantum network coding on butterfly network with free public communication
NASA Astrophysics Data System (ADS)
Owari, Masaki; Kato, Go; Hayashi, Masahito
2018-01-01
Quantum network coding on the butterfly network has been studied as a typical example of a quantum multiple-cast network. We propose a secure quantum network code for the butterfly network with free public classical communication in the multiple unicast setting, under restrictions on the eavesdropper’s power. This protocol transmits quantum states with certainty when there is no attack. We also show secrecy, with shared randomness as an additional resource, when the eavesdropper wiretaps one of the channels in the butterfly network and also obtains the information sent through public classical communication. Our protocol does not require a verification process, which ensures single-shot security.
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Srivastava, R.
1996-01-01
This guide describes the input data required for using the MSAP2D (Multi Stage Aeroelastic analysis Program - Two Dimensional) computer code. MSAP2D can be used for steady and unsteady aerodynamic and aeroelastic (flutter and forced response) analysis of bladed disks arranged in multiple blade rows, such as those found in compressors, turbines, counter-rotating propellers or propfans. The code can also be run for a single blade row. MSAP2D is an extension of the original NPHASE code for multi-blade-row aerodynamic and aeroelastic analysis. Euler equations are used to obtain aerodynamic forces. The structural dynamic equations are written for a rigid typical section undergoing pitching (torsion) and plunging (bending) motion. The aeroelastic equations are solved in the time domain. For single-blade-row analysis, a frequency domain analysis is also provided to obtain the unsteady aerodynamic coefficients required in an eigen analysis for flutter. In this manual, sample input and output are provided for a single-blade-row example and a two-blade-row example with equal and unequal numbers of blades in the blade rows.
Dharmaraj, Christopher D; Thadikonda, Kishan; Fletcher, Anthony R; Doan, Phuc N; Devasahayam, Nallathamby; Matsumoto, Shingo; Johnson, Calvin A; Cook, John A; Mitchell, James B; Subramanian, Sankaran; Krishna, Murali C
2009-01-01
Three-dimensional Oximetric Electron Paramagnetic Resonance Imaging using the Single Point Imaging modality generates unpaired spin density and oxygen images that can readily distinguish between normal and tumor tissues in small animals. It is also possible with fast imaging to track the changes in tissue oxygenation in response to the oxygen content in the breathing air. However, this involves dealing with gigabytes of data for each 3D oximetric imaging experiment involving digital band pass filtering and background noise subtraction, followed by 3D Fourier reconstruction. This process is rather slow in a conventional uniprocessor system. This paper presents a parallelization framework using OpenMP runtime support and parallel MATLAB to execute such computationally intensive programs. The Intel compiler is used to develop a parallel C++ code based on OpenMP. The code is executed on four Dual-Core AMD Opteron shared memory processors, to reduce the computational burden of the filtration task significantly. The results show that the parallel code for filtration has achieved a speed up factor of 46.66 as against the equivalent serial MATLAB code. In addition, a parallel MATLAB code has been developed to perform 3D Fourier reconstruction. Speedup factors of 4.57 and 4.25 have been achieved during the reconstruction process and oximetry computation, for a data set with 23 x 23 x 23 gradient steps. The execution time has been computed for both the serial and parallel implementations using different dimensions of the data and presented for comparison. The reported system has been designed to be easily accessible even from low-cost personal computers through local internet (NIHnet). The experimental results demonstrate that the parallel computing provides a source of high computational power to obtain biophysical parameters from 3D EPR oximetric imaging, almost in real-time.
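As a loose analogue of the parallel filtration step (the study itself used OpenMP/C++ and parallel MATLAB, not the code below), the following Python sketch band-pass filters a stack of hypothetical 1-D projections across worker processes; all sizes and filter bounds are assumptions for illustration.

    # Illustrative Python analogue of data-parallel filtration, not the authors' OpenMP code.
    import numpy as np
    from multiprocessing import Pool

    def bandpass(signal, low=5, high=50):
        # zero out FFT bins outside [low, high); a crude digital band-pass filter
        spec = np.fft.rfft(signal)
        mask = np.zeros_like(spec)
        mask[low:high] = 1.0
        return np.fft.irfft(spec * mask, n=signal.size)

    if __name__ == "__main__":
        projections = np.random.randn(1000, 1024)        # hypothetical raw EPR projections
        with Pool(processes=4) as pool:
            filtered = pool.map(bandpass, list(projections))
        print(len(filtered), filtered[0].shape)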
Rusu, Adina C; Hallner, Dirk
2018-06-06
Depression is a common feature of chronic pain, but there is only limited research into the content and frequency of depressed cognitions in pain patients. This study describes the development of the Sentence Completion Test for Chronic Pain (SCP), an idiographic measure for assessing depressive thinking in chronic pain patients. The sentence completion task requires participants to finish incomplete sentences using their own words to a set of predefined stems that include negative, positive and neutral valenced self-referenced words. In addition, the stems include past, future and world stems, which reflect the theoretical negative triad typical to depression. Complete responses are coded by valence (negative, positive and neutral), pain and health-related content. A total of 89 participants were included in this study. Forty seven adult out-patients formed the depressed pain group and were compared to a non-clinical control sample of 42 healthy control participants. This study comprised several phases: (1) theory-driven generation of coding rules; (2) the development of a coding manual by a panel of experts (3) comparing reliability of coding by expert raters without the use of the coding manual and with the use of the coding manual; (4) preliminary analyses of the construct validity of the SCP. The internal consistency of the SCP was tested using the Kuder-Richardson coefficient (KR-20). Inter-rater agreement was assessed by intra-class correlations (ICC). The content and construct validity of the SCP was investigated by correlation coefficients between SCP negative completions, the Hospital Anxiety and Depression Scale (HADS) depression scores and the number of symptoms on the Structured Clinical Interview for DSM-IV-TR (SCID). As predicted for content validity, the number of SCP negative statements was significantly greater in the depressed pain group and this group also produced significantly fewer positive statements, compared to the healthy control group. The number of negative pain completions and negative health completions was significantly greater in the depressed pain group. As expected, in the depressed pain group, the correlation between SCP negatives and the HADS Depression score was r=0.60 and the correlation between SCP negatives and the number of symptoms on the SCID was r=0.56. The SCP demonstrated good content validity, internal consistency and inter-rater reliability. Uses for this measure, such as complementing questionnaire measures by an idiographic assessment of depressive thinking and generating hypotheses about key problems within a cognitive-behavioural case-formulation, are suggested.
ALEGRA -- A massively parallel h-adaptive code for solid dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Summers, R.M.; Wong, M.K.; Boucheron, E.A.
1997-12-31
ALEGRA is a multi-material, arbitrary-Lagrangian-Eulerian (ALE) code for solid dynamics designed to run on massively parallel (MP) computers. It combines the features of modern Eulerian shock codes, such as CTH, with modern Lagrangian structural analysis codes using an unstructured grid. ALEGRA is being developed for use on the teraflop supercomputers to conduct advanced three-dimensional (3D) simulations of shock phenomena important to a variety of systems. ALEGRA was designed with the Single Program Multiple Data (SPMD) paradigm, in which the mesh is decomposed into sub-meshes so that each processor gets a single sub-mesh with approximately the same number of elements. Using this approach the authors have been able to produce a single code that can scale from one processor to thousands of processors. A current major effort is to develop efficient, high precision simulation capabilities for ALEGRA, without the computational cost of using a global highly resolved mesh, through flexible, robust h-adaptivity of finite elements. H-adaptivity is the dynamic refinement of the mesh by subdividing elements, thus changing the characteristic element size and reducing numerical error. The authors are working on several major technical challenges that must be met to make effective use of HAMMER on MP computers.
Deep Hashing for Scalable Image Search.
Lu, Jiwen; Liong, Venice Erin; Zhou, Jie
2017-05-01
In this paper, we propose a new deep hashing (DH) approach to learn compact binary codes for scalable image search. Unlike most existing binary codes learning methods, which usually seek a single linear projection to map each sample into a binary feature vector, we develop a deep neural network to seek multiple hierarchical non-linear transformations to learn these binary codes, so that the non-linear relationship of samples can be well exploited. Our model is learned under three constraints at the top layer of the developed deep network: 1) the loss between the compact real-valued code and the learned binary vector is minimized, 2) the binary codes distribute evenly on each bit, and 3) different bits are as independent as possible. To further improve the discriminative power of the learned binary codes, we extend DH into supervised DH (SDH) and multi-label SDH by including a discriminative term into the objective function of DH, which simultaneously maximizes the inter-class variations and minimizes the intra-class variations of the learned binary codes with the single-label and multi-label settings, respectively. Extensive experimental results on eight widely used image search data sets show that our proposed methods achieve very competitive results with the state-of-the-arts.
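A minimal numpy sketch of the three constraints listed above is given below; it only evaluates the penalties for a random batch of real-valued codes, whereas the actual method optimizes them through a deep network, and all names and sizes here are illustrative assumptions.

    # Sketch of the three DH constraints (quantization loss, bit balance, bit decorrelation).
    import numpy as np

    def dh_penalties(H):
        """H: (n_samples, n_bits) real-valued codes from the top layer."""
        B = np.sign(H)                               # learned binary codes in {-1, +1}
        quant_loss = np.mean((H - B) ** 2)           # 1) keep real codes close to binary
        balance = np.mean(np.mean(B, axis=0) ** 2)   # 2) each bit roughly half +1 / half -1
        C = (H - H.mean(0)).T @ (H - H.mean(0)) / len(H)
        independence = np.sum((C - np.diag(np.diag(C))) ** 2)  # 3) bits as uncorrelated as possible
        return quant_loss, balance, independence

    print(dh_penalties(np.random.randn(128, 48)))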
McCoy, Thomas H; Castro, Victor M; Snapper, Leslie A; Hart, Kamber L; Perlis, Roy H
2017-08-31
Biobanks and national registries represent a powerful tool for genomic discovery, but rely on diagnostic codes that may be unreliable and fail to capture the relationship between related diagnoses. We developed an efficient means of conducting genome-wide association studies using combinations of diagnostic codes from electronic health records (EHR) for 10,845 participants in a biobanking program at two large academic medical centers. Specifically, we applied latent Dirichlet allocation to fit 50 disease topics based on diagnostic codes, then conducted genome-wide common-variant association for each topic. In sensitivity analysis, these results were contrasted with those obtained from traditional single-diagnosis phenome-wide association analysis, as well as those in which only a subset of diagnostic codes were included per topic. In meta-analysis across three biobank cohorts, we identified 23 disease-associated loci with p<1e-15, including previously associated autoimmune disease loci. In all cases, observed significant associations were of greater magnitude than for single phenome-wide diagnostic codes, and incorporation of less strongly loading diagnostic codes enhanced association. This strategy provides a more efficient means of phenome-wide association in biobanks with coded clinical data.
McCoy, Thomas H; Castro, Victor M; Snapper, Leslie A; Hart, Kamber L; Perlis, Roy H
2017-01-01
Biobanks and national registries represent a powerful tool for genomic discovery, but rely on diagnostic codes that can be unreliable and fail to capture relationships between related diagnoses. We developed an efficient means of conducting genome-wide association studies using combinations of diagnostic codes from electronic health records for 10,845 participants in a biobanking program at two large academic medical centers. Specifically, we applied latent Dirichlet allocation to fit 50 disease topics based on diagnostic codes, then conducted a genome-wide common-variant association for each topic. In sensitivity analysis, these results were contrasted with those obtained from traditional single-diagnosis phenome-wide association analysis, as well as those in which only a subset of diagnostic codes were included per topic. In meta-analysis across three biobank cohorts, we identified 23 disease-associated loci with p < 1e-15, including previously associated autoimmune disease loci. In all cases, observed significant associations were of greater magnitude than single phenome-wide diagnostic codes, and incorporation of less strongly loading diagnostic codes enhanced association. This strategy provides a more efficient means of identifying phenome-wide associations in biobanks with coded clinical data. PMID:28861588
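The topic-modelling step can be illustrated with the following hedged sketch (an assumed workflow using scikit-learn, not the authors' code): fit 50 disease topics over a hypothetical patient-by-diagnosis-code count matrix and take the per-patient topic loadings as quantitative phenotypes for subsequent association testing.

    # Illustrative phenotyping step: 50 "disease topics" fit over diagnosis-code counts.
    import numpy as np
    from sklearn.decomposition import LatentDirichletAllocation

    counts = np.random.poisson(0.05, size=(500, 2000))   # hypothetical patients x ICD codes
    lda = LatentDirichletAllocation(n_components=50, random_state=0)
    topic_loadings = lda.fit_transform(counts)            # (500, 50) phenotype matrix
    print(topic_loadings.shape)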
Public sentiment and discourse about Zika virus on Instagram.
Seltzer, E K; Horst-Martz, E; Lu, M; Merchant, R M
2017-09-01
Social media have strongly influenced the awareness and perceptions of public health emergencies, and a considerable amount of social media content is now shared through images, rather than text alone. This content can impact preparedness and response due to the popularity and real-time nature of social media platforms. We sought to explore how the image-sharing platform Instagram is used for information dissemination and conversation during the current Zika outbreak. This was a retrospective review of publicly posted images about Zika on Instagram. Using the keyword '#zika' we identified 500 images posted on Instagram from May to August 2016. Images were coded by three reviewers and contextual information was collected for each image about sentiment, image type, content, audience, geography, reliability, and engagement. Of 500 images tagged with #zika, 342 (68%) contained content actually related to Zika. Of the 342 Zika-specific images, 299 were coded as 'health' and 193 were coded 'public interest'. Some images had multiple 'health' and 'public interest' codes. Health images tagged with #zika were primarily related to transmission (43%, 129/299) and prevention (48%, 145/299). Transmission-related posts were more often about mosquito-human transmission (73%, 94/129) than human-human transmission (27%, 35/129). Mosquito bite prevention posts (84%, 122/145) outnumbered safe sex prevention posts (16%, 23/145). Images with a target audience were primarily aimed at women (95%, 36/38). Many posts (60%, 61/101) included misleading, incomplete, or unclear information about the virus. Additionally, many images expressed fear and negative sentiment (79/156, 51%). Instagram can be used to characterize public sentiment and highlight areas of focus for public health, such as correcting misleading or incomplete information or expanding messages to reach diverse audiences. Copyright © 2017 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
Code-Switching and Competition: An Examination of a Situational Response
ERIC Educational Resources Information Center
Bernstein, Eve; Herman, Ariela
2014-01-01
Code switching is primarily a linguistic term that refers to the use of two or more languages within the same conversation, or same sentence, to convey a single message. One field of linguistics, sociocultural linguistics, is broad and interdisciplinary, a mixture of language, culture, and society. In sociocultural linguistics, the code, or…
Puckett, Larry; Hitt, Kerie; Alexander, Richard
1998-01-01
names that correspond to the FIPS codes. 2. Tabular component - Nine tab-delimited ASCII lookup tables of animal counts and nutrient estimates organized by 5-digit state/county FIPS (Federal Information Processing Standards) code. Another table lists the county names that correspond to the FIPS codes. The use of trade names is for identification purposes only and does not constitute endorsement by the U.S. Geological Survey.
Optimum Vessel Performance in Evolving Nonlinear Wave Fields
2012-11-01
This effort concerns TEMPEST, the new, nonlinear, time-domain ship motion code being developed by the Navy. The radiation and diffraction forces in the level 3.0 version of TEMPEST will be computed by body-exact strip theory, and the nonlinear responses of a ship to a seaway are being incorporated into version 3 of the code.
Exact Solutions for Rate and Synchrony in Recurrent Networks of Coincidence Detectors
Mikula, Shawn; Niebur, Ernst
2009-01-01
We provide analytical solutions for mean firing rates and cross-correlations of coincidence detector neurons in recurrent networks with excitatory or inhibitory connectivity with rate-modulated steady-state spiking inputs. We use discrete-time finite-state Markov chains to represent network state transition probabilities, which are subsequently used to derive exact analytical solutions for mean firing rates and cross-correlations. As illustrated in several examples, the method can be used for modeling cortical microcircuits and clarifying single-neuron and population coding mechanisms. We also demonstrate that increasing firing rates do not necessarily translate into increasing cross-correlations, though our results do support the contention that firing rates and cross-correlations are likely to be coupled. Our analytical solutions underscore the complexity of the relationship between firing rates and cross-correlations. PMID:18439133
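A toy sketch of the underlying machinery (not the paper's derivation) is shown below: a two-state discrete-time Markov chain over network states, its stationary distribution obtained from the transition matrix, and a mean firing rate read off as the stationary probability of the spiking state; the transition probabilities are hypothetical.

    # Toy discrete-time Markov chain: stationary distribution and a mean "firing rate".
    import numpy as np

    P = np.array([[0.90, 0.10],      # state 0 = quiescent, state 1 = spiking (hypothetical)
                  [0.60, 0.40]])     # row-stochastic transition matrix
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    pi = pi / pi.sum()               # stationary distribution
    mean_rate = pi[1]                # probability of being in the spiking state per time step
    print(pi, mean_rate)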
Shuttle Case Study Collection Website Development
NASA Technical Reports Server (NTRS)
Ransom, Khadijah S.; Johnson, Grace K.
2012-01-01
As a continuation from summer 2012, the Shuttle Case Study Collection has been developed using lessons learned documented by NASA engineers, analysts, and contractors. Decades of information related to processing and launching the Space Shuttle is gathered into a single database to provide educators with an alternative means to teach real-world engineering processes. The goal is to provide additional engineering materials that enhance critical thinking, decision making, and problem solving skills. During this second phase of the project, the Shuttle Case Study Collection website was developed. Extensive HTML coding to link downloadable documents, videos, and images was required, as was training to learn NASA's Content Management System (CMS) for website design. As the final stage of the collection development, the website is designed to allow for distribution of information to the public as well as for case study report submissions from other educators online.
2011-01-01
Background The melon belongs to the Cucurbitaceae family, whose economic importance among vegetable crops is second only to Solanaceae. The melon has a small genome size (454 Mb), which makes it suitable for molecular and genetic studies. Despite similar nuclear and chloroplast genome sizes, cucurbits show great variation when their mitochondrial genomes are compared. The melon possesses the largest plant mitochondrial genome, as much as eight times larger than that of other cucurbits. Results The nucleotide sequences of the melon chloroplast and mitochondrial genomes were determined. The chloroplast genome (156,017 bp) included 132 genes, with 98 single-copy genes dispersed between the small (SSC) and large (LSC) single-copy regions and 17 duplicated genes in the inverted repeat regions (IRa and IRb). A comparison of the cucumber and melon chloroplast genomes showed differences in only approximately 5% of nucleotides, mainly due to short indels and SNPs. Additionally, 2.74 Mb of mitochondrial sequence, accounting for 95% of the estimated mitochondrial genome size, were assembled into five scaffolds and four additional unscaffolded contigs. An 84% of the mitochondrial genome is contained in a single scaffold. The gene-coding region accounted for 1.7% (45,926 bp) of the total sequence, including 51 protein-coding genes, 4 conserved ORFs, 3 rRNA genes and 24 tRNA genes. Despite the differences observed in the mitochondrial genome sizes of cucurbit species, Citrullus lanatus (379 kb), Cucurbita pepo (983 kb) and Cucumis melo (2,740 kb) share 120 kb of sequence, including the predicted protein-coding regions. Nevertheless, melon contained a high number of repetitive sequences and a high content of DNA of nuclear origin, which represented 42% and 47% of the total sequence, respectively. Conclusions Whereas the size and gene organisation of chloroplast genomes are similar among the cucurbit species, mitochondrial genomes show a wide variety of sizes, with a non-conserved structure both in gene number and organisation, as well as in the features of the noncoding DNA. The transfer of nuclear DNA to the melon mitochondrial genome and the high proportion of repetitive DNA appear to explain the size of the largest mitochondrial genome reported so far. PMID:21854637
The feasibility of QR-code prescription in Taiwan.
Lin, C-H; Tsai, F-Y; Tsai, W-L; Wen, H-W; Hu, M-L
2012-12-01
An ideal health care service is a service system that focuses on patients. Patients in Taiwan have the freedom to fill their prescriptions at any pharmacy contracted with National Health Insurance, and each of these pharmacies uses its own computer system; so far, there are at least ten different systems on the market in Taiwan. Transmitting prescription information from the hospital to the pharmacy accurately and efficiently therefore presents a significant issue. This study used two-dimensional QR-codes to capture patient identification and prescription information from the hospitals, together with a webcam to read the QR-code and transfer all data to the pharmacy computer system. Two hospitals and 85 community pharmacies participated in the study. During the trial, all participating pharmacies rated the accuracy of the prescription transmission highly. The contents of QR-code prescriptions issued in the Taipei area were read efficiently and accurately by pharmacies in the Taichung area (central Taiwan), without software system or geographic limitations. The QR-code device received a patent (No. M376844, March 2010) from the Intellectual Property Office, Ministry of Economic Affairs, China. Our trial has shown that QR-code prescriptions can provide community pharmacists with an efficient, accurate and inexpensive means of digitizing prescription contents. Consequently, pharmacists can offer a better quality of pharmacy service to patients. © 2012 Blackwell Publishing Ltd.
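For illustration only, a prescription string can be packed into a QR image with a few lines of Python using the open-source qrcode package (an assumption about tooling; the study used its own hospital and pharmacy software, and the record below is hypothetical).

    # Minimal sketch: encode prescription fields as a QR image (requires qrcode + pillow).
    import qrcode

    prescription = "PATIENT:A123456789|DRUG:Amoxicillin 500mg|DOSE:1 cap tid x7d"  # hypothetical record
    img = qrcode.make(prescription)
    img.save("prescription_qr.png")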
Kassam, Aliya; Sharma, Nishan; Harvie, Margot; O’Beirne, Maeve; Topps, Maureen
2016-01-01
Abstract Objective To conduct a thematic analysis of the College of Family Physicians of Canada’s (CFPC’s) Red Book accreditation standards and the Triple C Competency-based Curriculum objectives with respect to patient safety principles. Design Thematic content analysis of the CFPC’s Red Book accreditation standards and the Triple C curriculum. Setting Canada. Main outcome measures Coding frequency of the patient safety principles (ie, patient engagement; respectful, transparent relationships; complex systems; a just and trusting culture; responsibility and accountability for actions; and continuous learning and improvement) found in the analyzed CFPC documents. Results Within the analyzed CFPC documents, the most commonly found patient safety principle was patient engagement (n = 51 coding references); the least commonly found patient safety principles were a just and trusting culture (n = 5 coding references) and complex systems (n = 5 coding references). Other patient safety principles that were uncommon included responsibility and accountability for actions (n = 7 coding references) and continuous learning and improvement (n = 12 coding references). Conclusion Explicit inclusion of patient safety content such as the use of patient safety principles is needed for residency training programs across Canada to ensure the full spectrum of care is addressed, from community-based care to acute hospital-based care. This will ensure a patient safety culture can be cultivated from residency and sustained into primary care practice. PMID:27965349
A new framework for interactive quality assessment with application to light field coding
NASA Astrophysics Data System (ADS)
Viola, Irene; Ebrahimi, Touradj
2017-09-01
In recent years, light field has experienced a surge of popularity, mainly due to the recent advances in acquisition and rendering technologies that have made it more accessible to the public. Thanks to image-based rendering techniques, light field contents can be rendered in real time on common 2D screens, allowing virtual navigation through the captured scenes in an interactive fashion. However, this richer representation of the scene poses the problem of reliable quality assessments for light field contents. In particular, while subjective methodologies that enable interaction have already been proposed, no work has been done on assessing how users interact with light field contents. In this paper, we propose a new framework to subjectively assess the quality of light field contents in an interactive manner and simultaneously track users behaviour. The framework is successfully used to perform subjective assessment of two coding solutions. Moreover, statistical analysis performed on the results shows interesting correlation between subjective scores and average interaction time.
A finite-temperature Hartree-Fock code for shell-model Hamiltonians
NASA Astrophysics Data System (ADS)
Bertsch, G. F.; Mehlhaff, J. M.
2016-10-01
The codes HFgradZ.py and HFgradT.py find axially symmetric minima of a Hartree-Fock energy functional for a Hamiltonian supplied in a shell model basis. The functional to be minimized is the Hartree-Fock energy for zero-temperature properties or the Hartree-Fock grand potential for finite-temperature properties (thermal energy, entropy). The minimization may be subjected to additional constraints besides axial symmetry and nucleon numbers. A single-particle operator can be used to constrain the minimization by adding it to the single-particle Hamiltonian with a Lagrange multiplier. One can also constrain its expectation value in the zero-temperature code. Also the orbital filling can be constrained in the zero-temperature code, fixing the number of nucleons having given Kπ quantum numbers. This is particularly useful to resolve near-degeneracies among distinct minima.
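The Lagrange-multiplier constraint described above can be illustrated with a generic toy sketch (not HFgradZ.py or HFgradT.py themselves): add a multiple of a constraint operator Q to a random symmetric single-particle Hamiltonian and scan the multiplier until the expectation value of Q over the lowest filled orbitals reaches a chosen target; all matrices and the target value are assumptions.

    # Toy illustration of constraining a mean field via a Lagrange multiplier.
    import numpy as np

    rng = np.random.default_rng(1)
    H = rng.normal(size=(8, 8)); H = (H + H.T) / 2       # hypothetical single-particle Hamiltonian
    Q = np.diag(np.arange(8, dtype=float))               # constraint operator (illustrative)
    n_occ, target = 4, 6.0

    def q_expectation(lam):
        w, v = np.linalg.eigh(H + lam * Q)               # diagonalize the constrained Hamiltonian
        occ = v[:, :n_occ]                               # fill the lowest orbitals
        return np.trace(occ.T @ Q @ occ)

    lams = np.linspace(-5, 5, 201)
    best = min(lams, key=lambda lam: abs(q_expectation(lam) - target))
    print(best, q_expectation(best))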
Multi-processing on supercomputers for computational aerodynamics
NASA Technical Reports Server (NTRS)
Yarrow, Maurice; Mehta, Unmeel B.
1990-01-01
The MIMD concept is applied, through multitasking, with relatively minor modifications to an existing code for a single processor. This approach maps the available memory to multiple processors, exploiting the C-FORTRAN-Unix interface. An existing single processor algorithm is mapped without the need for developing a new algorithm. The procedure of designing a code utilizing this approach is automated with the Unix stream editor. A Multiple Processor Multiple Grid (MPMG) code is developed as a demonstration of this approach. This code solves the three-dimensional, Reynolds-averaged, thin-layer and slender-layer Navier-Stokes equations with an implicit, approximately factored and diagonalized method. This solver is applied to a generic, oblique-wing aircraft problem on a four-processor computer using one process for data management and nonparallel computations and three processes for pseudotime advance on three different grid systems.
Fourier phase retrieval with a single mask by Douglas-Rachford algorithms.
Chen, Pengwen; Fannjiang, Albert
2018-05-01
The Fourier-domain Douglas-Rachford (FDR) algorithm is analyzed for phase retrieval with a single random mask. Since the uniqueness of phase retrieval solution requires more than a single oversampled coded diffraction pattern, the extra information is imposed in either of the following forms: 1) the sector condition on the object; 2) another oversampled diffraction pattern, coded or uncoded. For both settings, the uniqueness of projected fixed point is proved and for setting 2) the local, geometric convergence is derived with a rate given by a spectral gap condition. Numerical experiments demonstrate global, power-law convergence of FDR from arbitrary initialization for both settings as well as for 3 or more coded diffraction patterns without oversampling. In practice, the geometric convergence can be recovered from the power-law regime by a simple projection trick, resulting in highly accurate reconstruction from generic initialization.
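A simplified Douglas-Rachford phase-retrieval loop is sketched below for a single random phase mask; it is a generic illustration under assumed constraints (non-negative real object, oversampling by zero-padding), not the exact FDR variant analyzed in the paper, and the toy problem may need more iterations or several restarts to converge.

    # Generic Douglas-Rachford sketch for coded diffraction phase retrieval.
    import numpy as np

    rng = np.random.default_rng(0)
    x_true = np.pad(rng.random((16, 16)), 8)                # oversampled non-negative object
    mask = np.exp(2j * np.pi * rng.random(x_true.shape))    # single random phase mask
    y = np.abs(np.fft.fft2(mask * x_true))                  # measured coded diffraction magnitudes

    def P_fourier(x):                                       # impose the measured magnitudes
        F = np.fft.fft2(mask * x)
        return np.fft.ifft2(y * np.exp(1j * np.angle(F))) / mask

    def P_object(x):                                        # impose a real, non-negative object
        return np.maximum(x.real, 0)

    x = rng.random(x_true.shape)
    for _ in range(200):                                    # DR: x <- x + P_f(2 P_o(x) - x) - P_o(x)
        x = x + P_fourier(2 * P_object(x) - x) - P_object(x)
    print(np.linalg.norm(P_object(x) - x_true) / np.linalg.norm(x_true))  # relative error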
A Struggle for Dominance: Relational Communication Messages in Television Programming.
ERIC Educational Resources Information Center
Barbatsis, Gretchen S.; And Others
Television's messages about sex role behavior were analyzed by collecting and coding spot samples of the ten top ranked programs in prime viewing time and proportionate numbers of daytime soap operas and Saturday morning children's programs. The content analysis was based on a relational coding system developed to assess interpersonal…
ERIC Educational Resources Information Center
Bantum, Erin O'Carroll; Owen, Jason E.
2009-01-01
Psychological interventions provide linguistic data that are particularly useful for testing mechanisms of action and improving intervention methodologies. For this study, emotional expression in an Internet-based intervention for women with breast cancer (n = 63) was analyzed via rater coding and 2 computerized coding methods (Linguistic Inquiry…
Teachers' Code-Switching in Bilingual Classrooms: Exploring Pedagogical and Sociocultural Functions
ERIC Educational Resources Information Center
Cahyani, Hilda; de Courcy, Michele; Barnett, Jenny
2018-01-01
The pedagogical and sociocultural functions of teachers' code-switching are an important factor in achieving the dual goals of content learning and language learning in bilingual programmes. This paper reports on an ethnographic case study investigating how and why teachers switched between languages in tertiary bilingual classrooms in Indonesia,…
ERIC Educational Resources Information Center
Piehler, Timothy F.; Dishion, Thomas J.
2014-01-01
In a sample of 711 ethnically diverse adolescents, the observed interpersonal dynamics of dyadic adolescent friendship interactions were coded to predict early adulthood tobacco, alcohol, and marijuana use. Deviant discussion content within the interactions was coded along with dyadic coregulation (i.e., interpersonal coordination, attention…
Server-Side Includes Made Simple.
ERIC Educational Resources Information Center
Fagan, Jody Condit
2002-01-01
Describes server-side include (SSI) codes which allow Webmasters to insert content into Web pages without programming knowledge. Explains how to enable the codes on a Web server, provides a step-by-step process for implementing them, discusses tags and syntax errors, and includes examples of their use on the Web site for Southern Illinois…
FOG: Fighting the Achilles' Heel of Gossip Protocols with Fountain Codes
NASA Astrophysics Data System (ADS)
Champel, Mary-Luc; Kermarrec, Anne-Marie; Le Scouarnec, Nicolas
Gossip protocols are well known to provide reliable and robust dissemination in highly dynamic systems. Yet, they suffer from high redundancy in the last phase of the dissemination. In this paper, we combine fountain codes (rateless erasure-correcting codes) with gossip protocols for robust and fast content dissemination in large-scale dynamic systems. The use of fountain codes eliminates the unnecessary redundancy of gossip protocols. We propose the design of FOG, which fully exploits the first exponential growth phase (where the data is disseminated exponentially fast) of gossip protocols while avoiding the need for the shrinking phase by using fountain codes. FOG voluntarily increases the number of disseminations but limits those disseminations to the exponential growth phase. In addition, FOG creates a split-graph overlay that splits the peers between encoders and forwarders. Forwarder peers become encoders as soon as they have received the whole content. In order to benefit further and more quickly from encoders, FOG biases the dissemination towards the most advanced peers to make them complete earlier.
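The rateless-coding ingredient can be illustrated with a toy LT-style fountain encoder (a generic sketch of fountain codes, not FOG's protocol or degree distribution): each droplet is the XOR of a random subset of source blocks, and a receiver that collects slightly more droplets than there are blocks can usually decode.

    # Toy LT-style fountain encoder: droplets are XORs of random subsets of source blocks.
    import os, random

    blocks = [os.urandom(64) for _ in range(32)]        # source content split into 32 blocks

    def make_droplet(blocks, rng):
        degree = rng.randint(1, 4)                      # toy degree distribution
        idx = rng.sample(range(len(blocks)), degree)
        payload = bytes(len(blocks[0]))
        for i in idx:
            payload = bytes(a ^ b for a, b in zip(payload, blocks[i]))
        return idx, payload                             # receiver needs the index set + payload

    rng = random.Random(0)
    droplets = [make_droplet(blocks, rng) for _ in range(40)]
    print(len(droplets), droplets[0][0])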
Size principle and information theory.
Senn, W; Wyler, K; Clamann, H P; Kleinle, J; Lüscher, H R; Müller, L
1997-01-01
The motor units of a skeletal muscle may be recruited according to different strategies. From all possible recruitment strategies nature selected the simplest one: in most actions of vertebrate skeletal muscles the recruitment of its motor units is by increasing size. This so-called size principle permits a high precision in muscle force generation since small muscle forces are produced exclusively by small motor units. Larger motor units are activated only if the total muscle force has already reached certain critical levels. We show that this recruitment by size is not only optimal in precision but also optimal in an information theoretical sense. We consider the motoneuron pool as an encoder generating a parallel binary code from a common input to that pool. The generated motoneuron code is sent down through the motoneuron axons to the muscle. We establish that an optimization of this motoneuron code with respect to its information content is equivalent to the recruitment of motor units by size. Moreover, maximal information content of the motoneuron code is equivalent to a minimal expected error in muscle force generation.
Berry, Nina J; Gribble, Karleen D
2017-10-01
The use of health and nutrition content claims in infant formula advertising is restricted by many governments in response to WHO policies and WHA resolutions. The purpose of this study was to determine whether such prohibited claims could be observed in Australian websites that advertise infant formula products. A comprehensive internet search was conducted to identify websites that advertise infant formula available for purchase in Australia. Content analysis was used to identify prohibited claims. The coding frame was closely aligned with the provisions of the Australian and New Zealand Food Standard Code, which prohibits these claims. The outcome measures were the presence of health claims, nutrition content claims, or references to the nutritional content of human milk. Web pages advertising 25 unique infant formula products available for purchase in Australia were identified. Every advertisement (100%) contained at least one health claim. Eighteen (72%) also contained at least one nutrition content claim. Three web pages (12%) advertising brands associated with infant formula products referenced the nutritional content of human milk. All of these claims appear in spite of national regulations prohibiting them indicating a failure of monitoring and/or enforcement. Where countries have enacted instruments to prohibit health and other claims in infant formula advertising, the marketing of infant formula must be actively monitored to be effective. © 2016 John Wiley & Sons Ltd.
Multi-stage decoding of multi-level modulation codes
NASA Technical Reports Server (NTRS)
Lin, Shu; Kasami, Tadao; Costello, Daniel J., Jr.
1991-01-01
Various types of multi-stage decoding for multi-level modulation codes are investigated. It is shown that if the component codes of a multi-level modulation code and the types of decoding at the various stages are chosen properly, high spectral efficiency and large coding gain can be achieved with reduced decoding complexity. In particular, it is shown that the difference in performance between the suboptimum multi-stage soft-decision maximum likelihood decoding of a modulation code and the single-stage optimum soft-decision decoding of the code is very small: only a fraction of a dB loss in signal-to-noise ratio at a bit error rate (BER) of 10^-6.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duwel, D; Lamba, M; Elson, H
Purpose: Various cancers of the eye are successfully treated with radiotherapy utilizing one anterior-posterior (A/P) beam that encompasses the entire content of the orbit. In such cases, a hanging lens shield can be used to spare dose to the radiosensitive lens of the eye to prevent cataracts. Methods: This research focused on Monte Carlo characterization of dose distributions resulting from a single A-P field to the orbit with a hanging shield in place. Monte Carlo codes were developed which calculated dose distributions for various electron radiation energies, hanging lens shield radii, shield heights above the eye, and beam spoiler configurations. Film dosimetry was used to benchmark the coding to ensure it was calculating relative dose accurately. Results: The Monte Carlo dose calculations indicated that lateral and depth dose profiles are insensitive to changes in shield height and electron beam energy. Dose deposition was sensitive to shield radius and beam spoiler composition and height above the eye. Conclusion: The use of a single A/P electron beam to treat cancers of the eye while maintaining adequate lens sparing is feasible. Shield radius should be customized to have the same radius as the patient's lens. A beam spoiler should be used if it is desired to substantially dose the eye tissues lying posterior to the lens in the shadow of the lens shield. The compromise between lens sparing and dose to diseased tissues surrounding the lens can be modulated by varying the beam spoiler thickness, spoiler material composition, and spoiler height above the eye. The sparing ratio is a metric that can be used to evaluate the compromise between lens sparing and dose to surrounding tissues. The higher the ratio, the more dose received by the tissues immediately posterior to the lens relative to the dose received by the lens.
Fu, Cheng-Jie; Sheikh, Sanea; Miao, Wei; Andersson, Siv G E; Baldauf, Sandra L
2014-08-21
Discoba (Excavata) is an ancient group of eukaryotes with great morphological and ecological diversity. Unlike the other major divisions of Discoba (Jakobida and Euglenozoa), little is known about the mitochondrial DNAs (mtDNAs) of Heterolobosea. We have assembled a complete mtDNA genome from the aggregating heterolobosean amoeba, Acrasis kona, which consists of a single circular highly AT-rich (83.3%) molecule of 51.5 kb. Unexpectedly, A. kona mtDNA is missing roughly 40% of the protein-coding genes and nearly half of the transfer RNAs found in the only other sequenced heterolobosean mtDNAs, those of Naegleria spp. Instead, over a quarter of A. kona mtDNA consists of novel open reading frames. Eleven of the 16 protein-coding genes missing from A. kona mtDNA were identified in its nuclear DNA and polyA RNA, and phylogenetic analyses indicate that at least 10 of these 11 putative nuclear-encoded mitochondrial (NcMt) proteins arose by direct transfer from the mitochondrion. Acrasis kona mtDNA also employs C-to-U type RNA editing, and 12 homologs of DYW-type pentatricopeptide repeat (PPR) proteins implicated in plant organellar RNA editing are found in A. kona nuclear DNA. A mapping of mitochondrial gene content onto a consensus phylogeny reveals a sporadic pattern of relative stasis and rampant gene loss in Discoba. Rampant loss occurred independently in the unique common lineage leading to Heterolobosea + Tsukubamonadida and later in the unique lineage leading to Acrasis. Meanwhile, mtDNA gene content appears to be remarkably stable in the Acrasis sister lineage leading to Naegleria and in their distant relatives Jakobida. © The Author(s) 2014. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
NASA Astrophysics Data System (ADS)
Niederau, Jan; Wellmann, Florian; Maersch, Jannik; Urai, Janos
2017-04-01
Programming is increasingly recognised as an important skill for geoscientists; however, the hurdle to jumping into programming for students with little or no experience can be high. We present here teaching concepts based on Jupyter notebooks that combine, in an intuitive way, formatted instruction text with code cells in a single environment. This integration allows for an exposure to programming on several levels: from a complete interactive presentation of content, where students require no or very limited programming experience, to highly complex geoscientific computations. We therefore consider these notebooks an ideal medium for presenting computational content to students in the field of geosciences. We show here how we use these notebooks to develop digital documents in Python for undergraduate students, who can then learn about basic concepts in structural geology via self-assessment. Such notebooks cover concepts such as the stress tensor, the strain ellipse, or the Mohr circle. Students can interactively change parameters, e.g. by using sliders, and immediately see the results. They can further experiment and extend the notebook by writing their own code within the notebook. Jupyter Notebooks for teaching purposes can be provided ready-to-use via online services; that is, students do not need to install additional software on their devices in order to work with the notebooks. We also use Jupyter Notebooks for automatic grading of programming assignments in multiple lectures. An implemented workflow facilitates the generation and distribution of assignments, as well as the final grading. Compared to previous grading methods with a high percentage of repetitive manual grading, the implemented workflow proves to be much more time efficient.
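The kind of interactive cell described above might look like the following sketch, which assumes a notebook environment with ipywidgets and matplotlib available: a Mohr-circle plot that redraws as students move sliders for the two principal stresses.

    # Interactive Mohr-circle sketch for a Jupyter notebook (assumes ipywidgets + matplotlib).
    import numpy as np
    import matplotlib.pyplot as plt
    from ipywidgets import interact, FloatSlider

    def mohr_circle(sigma1=50.0, sigma3=10.0):
        center, radius = (sigma1 + sigma3) / 2, (sigma1 - sigma3) / 2
        theta = np.linspace(0, 2 * np.pi, 200)
        plt.plot(center + radius * np.cos(theta), radius * np.sin(theta))
        plt.axhline(0, color="k", lw=0.5)
        plt.xlabel("normal stress"); plt.ylabel("shear stress")
        plt.gca().set_aspect("equal"); plt.show()

    interact(mohr_circle,
             sigma1=FloatSlider(min=0, max=100, value=50),
             sigma3=FloatSlider(min=0, max=100, value=10))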
Yao, Jie; Yang, Hong; Dai, Renhuai
2017-10-01
Acanthoscelides obtectus is a common species of the subfamily Bruchinae and a worldwide-distributed seed-feeding beetle. The complete mitochondrial genome of A. obtectus is 16,130 bp in length with an A + T content of 76.4%. It has a positive AT skew and a negative GC skew. The mitogenome of A. obtectus contains 13 protein-coding genes (PCGs), 22 tRNA genes, two rRNA genes and a non-coding region (D-loop). All PCGs start with an ATN codon; seven of them (ND3, ATP6, COIII, ND3, ND4L, ND6, and Cytb) terminate with TAA, the remaining five (COI, COII, ND1, ND4, and ND5) terminate with a single T, and ATP8 terminates with TGA. Except for tRNA-Ser, the secondary structures of the 21 tRNAs, which can be folded into a typical clover-leaf structure, were identified. The secondary structures of lrRNA and srRNA were also predicted in this study: there are six domains with 48 helices in lrRNA and three domains with 32 helices in srRNA. The control region of A. obtectus is 1354 bp in size and has the highest A + T content (83.5%) in the mitochondrial genome. Thirteen PCGs from 19 species were used to infer their phylogenetic relationships. Our results show that A. obtectus belongs to the family Chrysomelidae (subfamily Bruchinae). This is the first study on phylogenetic analyses involving the mitochondrial genes of A. obtectus and could provide basic data for future studies of mitochondrial genome diversities and the evolution of related insect lineages.
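For reference, the skew statistics quoted above are simple ratios, AT skew = (A - T)/(A + T) and GC skew = (G - C)/(G + C); the short Python sketch below computes them for a hypothetical sequence.

    # Minimal sketch of AT skew and GC skew for a nucleotide sequence.
    def skews(seq):
        seq = seq.upper()
        a, t, g, c = (seq.count(b) for b in "ATGC")
        return (a - t) / (a + t), (g - c) / (g + c)

    at_skew, gc_skew = skews("ATTAATGCGAATTTACG")   # hypothetical example sequence
    print(f"AT skew = {at_skew:.3f}, GC skew = {gc_skew:.3f}")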
2 CFR 1.100 - Content of this title.
Code of Federal Regulations, 2012 CFR
2009-01-01
... 2 Grants and Agreements 1 2009-01-01 2009-01-01 false Content of this title. 1.100 Section 1.100 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1.100 Content of this title. This title contains— (a) Office...
2 CFR 1.100 - Content of this title.
Code of Federal Regulations, 2011 CFR
2008-01-01
... 2 Grants and Agreements 1 2008-01-01 2008-01-01 false Content of this title. 1.100 Section 1.100 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1.100 Content of this title. This title contains— (a) Office...
2 CFR 1.100 - Content of this title.
Code of Federal Regulations, 2011 CFR
2005-01-01
... 2 Grants and Agreements 1 2005-01-01 2005-01-01 false Content of this title. 1.100 Section 1.100 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1.100 Content of this title. This title contains— (a) Office...
2 CFR 1.100 - Content of this title.
Code of Federal Regulations, 2013 CFR
2006-01-01
... 2 Grants and Agreements 1 2006-01-01 2006-01-01 false Content of this title. 1.100 Section 1.100 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1.100 Content of this title. This title contains— (a) Office...
2 CFR 1.100 - Content of this title.
Code of Federal Regulations, 2013 CFR
2007-01-01
... 2 Grants and Agreements 1 2007-01-01 2007-01-01 false Content of this title. 1.100 Section 1.100 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1.100 Content of this title. This title contains— (a) Office...
VLSI single-chip (255,223) Reed-Solomon encoder with interleaver
NASA Technical Reports Server (NTRS)
Hsu, In-Shek (Inventor); Deutsch, Leslie J. (Inventor); Truong, Trieu-Kie (Inventor); Reed, Irving S. (Inventor)
1990-01-01
The invention relates to a concatenated Reed-Solomon/convolutional encoding system consisting of a Reed-Solomon outer code and a convolutional inner code for downlink telemetry in space missions, and more particularly to a Reed-Solomon encoder with programmable interleaving of the information symbols and code correction symbols to combat error bursts in the Viterbi decoder.
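A purely conceptual software sketch of the outer coding scheme (not the VLSI chip or its convolutional inner code) is shown below, assuming the open-source reedsolo package: encode several 223-symbol blocks into (255,223) Reed-Solomon codewords and interleave them symbol-by-symbol so that a channel error burst is spread across separate codewords; the interleaving depth is an assumed value.

    # Conceptual (255,223) Reed-Solomon encoding with symbol interleaving (requires reedsolo).
    from reedsolo import RSCodec

    rs = RSCodec(32)                                   # 32 parity symbols -> (255, 223)
    depth = 4                                          # interleaving depth (assumed value)
    codewords = [bytes(rs.encode(bytes([i]) * 223)) for i in range(depth)]

    # Interleave: transmit symbol 0 of every codeword, then symbol 1 of every codeword, ...
    interleaved = bytes(cw[j] for j in range(255) for cw in codewords)
    print(len(interleaved))                            # depth * 255 symbols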
TACOM LCMC IB and DMSMS Mitigation
2011-09-26
[Briefing-slide residue; recoverable content only: a table of CAGE code counts by source status (CONUS: 3, OCONUS: 0; total: 3), an AAC flag, and "Used On Reference/Part Numbers", dated 26SEP11.]
Parallel Subspace Subcodes of Reed-Solomon Codes for Magnetic Recording Channels
ERIC Educational Resources Information Center
Wang, Han
2010-01-01
Read channel architectures based on a single low-density parity-check (LDPC) code are being considered for the next generation of hard disk drives. However, LDPC-only solutions suffer from the error floor problem, which may compromise reliability, if not handled properly. Concatenated architectures using an LDPC code plus a Reed-Solomon (RS) code…
Coding Manual for Continuous Observation of Interactions by Single Subjects in an Academic Setting.
ERIC Educational Resources Information Center
Cobb, Joseph A.; Hops, Hyman
The manual, designed particularly for work with acting-out or behavior problem students, describes coding procedures used in the observation of continuous classroom interactions between the student and his peers and teacher. Peer and/or teacher behaviors antecedent and consequent to the subject's behavior are identified in the coding process,…
HEVC for high dynamic range services
NASA Astrophysics Data System (ADS)
Kim, Seung-Hwan; Zhao, Jie; Misra, Kiran; Segall, Andrew
2015-09-01
Displays capable of showing a greater range of luminance values can render content containing high dynamic range information in a way that gives viewers a more immersive experience. This paper introduces the design aspects of a high dynamic range (HDR) system and examines the performance of the HDR processing chain in terms of compression efficiency. Specifically, it examines the relation between the recently introduced Society of Motion Picture and Television Engineers (SMPTE) ST 2084 transfer function and the High Efficiency Video Coding (HEVC) standard. SMPTE ST 2084 is designed to cover the full range of an HDR signal from 0 to 10,000 nits; however, in many situations the valid signal range of actual video may be smaller than the range supported by SMPTE ST 2084. This restricted signal range results in a restricted range of code values for input video data and adversely impacts compression efficiency. In this paper, we propose a code value remapping method that extends the restricted-range code values into the full range of code values so that existing standards such as HEVC can better compress the video content. The paper also identifies related non-normative encoder-only changes that are required for the remapping method for a fair comparison with the anchor. Results are presented comparing the efficiency of the current approach versus the proposed remapping method for HM-16.2.
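The remapping idea can be sketched as a simple linear stretch of the used code-value range onto the full 10-bit range before encoding, with the inverse applied after decoding; the linear form and the example range below are assumptions for illustration and may differ from the paper's exact mapping.

    # Illustrative linear code-value remapping for a restricted-range HDR signal.
    import numpy as np

    def remap(codes, lo, hi, bitdepth=10):
        full = (1 << bitdepth) - 1
        return np.round((codes.astype(float) - lo) / (hi - lo) * full).astype(np.uint16)

    def unmap(codes, lo, hi, bitdepth=10):
        full = (1 << bitdepth) - 1
        return np.round(codes.astype(float) / full * (hi - lo) + lo).astype(np.uint16)

    frame = np.random.randint(64, 700, size=(4, 4), dtype=np.uint16)   # hypothetical restricted range
    restored = unmap(remap(frame, 64, 700), 64, 700)
    print(np.abs(restored.astype(int) - frame.astype(int)).max())      # small round-trip error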
An Idealized, Single Radial Swirler, Lean-Direct-Injection (LDI) Concept Meshing Script
NASA Technical Reports Server (NTRS)
Iannetti, Anthony C.; Thompson, Daniel
2008-01-01
To easily study combustor design parameters using computational fluid dynamics codes (CFD), a Gridgen Glyph-based macro (based on the Tcl scripting language) dubbed BladeMaker has been developed for the meshing of an idealized, single radial swirler, lean-direct-injection (LDI) combustor. BladeMaker is capable of taking in a number of parameters, such as blade width, blade tilt with respect to the perpendicular, swirler cup radius, and grid densities, and producing a three-dimensional meshed radial swirler with a can-annular (canned) combustor. This complex script produces a data format suitable for but not specific to the National Combustion Code (NCC), a state-of-the-art CFD code developed for reacting flow processes.
Comparison of computer codes for calculating dynamic loads in wind turbines
NASA Technical Reports Server (NTRS)
Spera, D. A.
1977-01-01
Seven computer codes for analyzing performance and loads in large, horizontal axis wind turbines were used to calculate blade bending moment loads for two operational conditions of the 100 kW Mod-0 wind turbine. Results were compared with test data on the basis of cyclic loads, peak loads, and harmonic contents. Four of the seven codes include rotor-tower interaction and three were limited to rotor analysis. With a few exceptions, all calculated loads were within 25 percent of nominal test data.
Codon influence on protein expression in E. coli correlates with mRNA levels
Boël, Grégory; Wong, Kam-Ho; Su, Min; Luff, Jon; Valecha, Mayank; Everett, John K.; Acton, Thomas B.; Xiao, Rong; Montelione, Gaetano T.; Aalberts, Daniel P.; Hunt, John F.
2016-01-01
Degeneracy in the genetic code, which enables a single protein to be encoded by a multitude of synonymous gene sequences, has an important role in regulating protein expression, but substantial uncertainty exists concerning the details of this phenomenon. Here we analyze the sequence features influencing protein expression levels in 6,348 experiments using bacteriophage T7 polymerase to synthesize messenger RNA in Escherichia coli. Logistic regression yields a new codon-influence metric that correlates only weakly with genomic codon-usage frequency, but strongly with global physiological protein concentrations and also mRNA concentrations and lifetimes in vivo. Overall, the codon content influences protein expression more strongly than mRNA-folding parameters, although the latter dominate in the initial ~16 codons. Genes redesigned based on our analyses are transcribed with unaltered efficiency but translated with higher efficiency in vitro. The less efficiently translated native sequences show greatly reduced mRNA levels in vivo. Our results suggest that codon content modulates a kinetic competition between protein elongation and mRNA degradation that is a central feature of the physiology and also possibly the regulation of translation in E. coli. PMID:26760206
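A small sketch of the kind of codon-content feature such analyses start from is shown below (codon counts per coding sequence); the paper's codon-influence metric itself comes from logistic regression over many expression experiments, and the sequence here is hypothetical.

    # Minimal sketch: tabulate codon usage for a coding sequence.
    from collections import Counter

    def codon_counts(cds):
        cds = cds.upper()
        return Counter(cds[i:i + 3] for i in range(0, len(cds) - len(cds) % 3, 3))

    print(codon_counts("ATGGCTGCAAAGAAATAA"))   # hypothetical coding sequence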
Taki, M; Signorini, A; Oton, C J; Nannipieri, T; Di Pasquale, F
2013-10-15
We experimentally demonstrate the use of cyclic pulse coding for distributed strain and temperature measurements in hybrid Raman/Brillouin optical time-domain analysis (BOTDA) optical fiber sensors. The highly integrated proposed solution effectively addresses the strain/temperature cross-sensitivity issue affecting standard BOTDA sensors, allowing for simultaneous meter-scale strain and temperature measurements over 10 km of standard single mode fiber using a single narrowband laser source only.
Nonlinear wave vacillation in the atmosphere
NASA Technical Reports Server (NTRS)
Antar, Basil N.
1987-01-01
The problem of vacillation in a baroclinically unstable flow field is studied through the time evolution of a single nonlinearly unstable wave. To this end, a computer code is being developed to solve numerically for the time evolution of the amplitude of such a wave. The final working code will be the end product resulting from the development of a hierarchy of codes with increasing complexity. The first code in this series was completed and is undergoing several diagnostic analyses to verify its validity. The development of this code is detailed.
Autosophy information theory provides lossless data and video compression based on the data content
NASA Astrophysics Data System (ADS)
Holtz, Klaus E.; Holtz, Eric S.; Holtz, Diana
1996-09-01
A new autosophy information theory provides an alternative to the classical Shannon information theory. Using the new theory in communication networks provides both a high degree of lossless compression and virtually unbreakable encryption codes for network security. The bandwidth in a conventional Shannon communication is determined only by the data volume and the hardware parameters, such as image size; resolution; or frame rates in television. The data content, or what is shown on the screen, is irrelevant. In contrast, the bandwidth in autosophy communication is determined only by data content, such as novelty and movement in television images. It is the data volume and hardware parameters that become irrelevant. Basically, the new communication methods use prior 'knowledge' of the data, stored in a library, to encode subsequent transmissions. The more 'knowledge' stored in the libraries, the higher the potential compression ratio. 'Information' is redefined as that which is not already known by the receiver. Everything already known is redundant and need not be re-transmitted. In a perfect communication each transmission code, called a 'tip,' creates a new 'engram' of knowledge in the library in which each tip transmission can represent any amount of data. Autosophy theories provide six separate learning modes, or omni dimensional networks, all of which can be used for data compression. The new information theory reveals the theoretical flaws of other data compression methods, including: the Huffman; Ziv Lempel; LZW codes and commercial compression codes such as V.42bis and MPEG-2.
Evolution of plastic anisotropy for high-strain-rate computations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schiferl, S.K.; Maudlin, P.J.
1994-12-01
A model for anisotropic material strength, and for changes in the anisotropy due to plastic strain, is described. This model has been developed for use in high-rate, explicit, Lagrangian multidimensional continuum-mechanics codes. The model handles anisotropies in single-phase materials, in particular the anisotropies due to crystallographic texture--preferred orientations of the single-crystal grains. Textural anisotropies, and the changes in these anisotropies, depend overwhelmingly on the crystal structure of the material and on the deformation history. The changes, particularly for complex deformations, are not amenable to simple analytical forms. To handle this problem, the material model described here includes a texture code, or micromechanical calculation, coupled to a continuum code. The texture code updates grain orientations as a function of tensor plastic strain, and calculates the yield strength in different directions. A yield function is fitted to these yield points. For each computational cell in the continuum simulation, the texture code tracks a particular set of grain orientations. The orientations will change due to the tensor strain history, and the yield function will change accordingly. Hence, the continuum code supplies a tensor strain to the texture code, and the texture code supplies an updated yield function to the continuum code. Since significant texture changes require relatively large strains--typically, a few percent or more--the texture code is not called very often, and the increase in computer time is not excessive. The model was implemented using a finite-element continuum code and a texture code specialized for hexagonal-close-packed crystal structures. The results for several uniaxial stress problems and an explosive-forming problem are shown.
What Information is Stored in DNA: Does it Contain Digital Error Correcting Codes?
NASA Astrophysics Data System (ADS)
Liebovitch, Larry
1998-03-01
The longest term correlations in living systems are the information stored in DNA which reflects the evolutionary history of an organism. The 4 bases (A,T,G,C) encode sequences of amino acids as well as locations of binding sites for proteins that regulate DNA. The fidelity of this important information is maintained by ANALOG error check mechanisms. When a single strand of DNA is replicated the complementary base is inserted in the new strand. Sometimes the wrong base is inserted that sticks out disrupting the phosphate backbone. The new base is not yet methylated, so repair enzymes, that slide along the DNA, can tear out the wrong base and replace it with the right one. The bases in DNA form a sequence of 4 different symbols and so the information is encoded in a DIGITAL form. All the digital codes in our society (ISBN book numbers, UPC product codes, bank account numbers, airline ticket numbers) use error checking code, where some digits are functions of other digits to maintain the fidelity of transmitted informaiton. Does DNA also utitlize a DIGITAL error chekcing code to maintain the fidelity of its information and increase the accuracy of replication? That is, are some bases in DNA functions of other bases upstream or downstream? This raises the interesting mathematical problem: How does one determine whether some symbols in a sequence of symbols are a function of other symbols. It also bears on the issue of determining algorithmic complexity: What is the function that generates the shortest algorithm for reproducing the symbol sequence. The error checking codes most used in our technology are linear block codes. We developed an efficient method to test for the presence of such codes in DNA. We coded the 4 bases as (0,1,2,3) and used Gaussian elimination, modified for modulus 4, to test if some bases are linear combinations of other bases. We used this method to analyze the base sequence in the genes from the lac operon and cytochrome C. We did not find evidence for such error correcting codes in these genes. However, we analyzed only a small amount of DNA and if digitial error correcting schemes are present in DNA, they may be more subtle than such simple linear block codes. The basic issue we raise here, is how information is stored in DNA and an appreciation that digital symbol sequences, such as DNA, admit of interesting schemes to store and protect the fidelity of their information content. Liebovitch, Tao, Todorov, Levine. 1996. Biophys. J. 71:1539-1544. Supported by NIH grant EY6234.
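The question posed above, whether some bases are linear functions of others, can be illustrated with a toy test (not the authors' modified Gaussian elimination over modulus 4): map bases to integers 0-3 and check whether a chosen position in fixed-length blocks equals a given linear combination of other positions modulo 4 across all blocks; the mapping, block length, and coefficients below are arbitrary.

    # Toy test for a candidate linear parity relation among base positions, modulo 4.
    import numpy as np

    BASE = {"A": 0, "T": 1, "G": 2, "C": 3}

    def holds(blocks, target, positions, coeffs):
        """Check target base == sum(coeff * base at position) mod 4 for every block."""
        M = np.array([[BASE[b] for b in blk] for blk in blocks])
        lhs = M[:, target] % 4
        rhs = (M[:, positions] @ np.array(coeffs)) % 4
        return bool(np.all(lhs == rhs))

    blocks = ["ATGC", "TTAG", "GACA"]                 # hypothetical 4-base "codewords"
    print(holds(blocks, target=3, positions=[0, 1], coeffs=[1, 2]))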
Social Work Science and Knowledge Utilization
ERIC Educational Resources Information Center
Marsh, Jeanne C.; Reed, Martena
2016-01-01
Objective: This article advances understanding of social work science by examining the content and methods of highly utilized or cited journal articles in social work. Methods: A data base of the 100 most frequently cited articles from 79 social work journals was coded and categorized into three primary domains: content, research versus…
Minnesota Academic Standards: Kindergarten
ERIC Educational Resources Information Center
Minnesota Department of Education, 2017
2017-01-01
This document contains all of the Minnesota kindergarten academic standards in the content areas of Arts, English Language Arts, Mathematics, Science and Social Studies. For each content area there is a short overview followed by a coding diagram of how the standards are organized and displayed. This document is adapted from the official versions…
Map Feature Content and Text Recall of Good and Poor Readers.
ERIC Educational Resources Information Center
Amlund, Jeanne T.; And Others
1985-01-01
Reports two experiments evaluating the effect of map feature content on text recall by subjects of varying reading skill levels. Finds that both experiments support the conjoint retention hypothesis, in which dual-coding of spatial and verbal information and their interaction in memory enhance recall. (MM)
7 CFR 1485.13 - Application process and strategic plan.
Code of Federal Regulations, 2010 CFR
2010-01-01
... affiliated organizations; (D) A description of management and administrative capability; (E) A description of... code and the percentage of U.S. origin content by weight, exclusive of added water; (B) A description... and the percentage of U.S. origin content by weight, exclusive of added water; (C) A description of...
Race Socialization Messages across Historical Time
ERIC Educational Resources Information Center
Brown, Tony N.; Lesane-Brown, Chase L.
2006-01-01
In this study we investigated whether the content of race socialization messages varied by birth cohort, using data from a national probability sample. Most respondents recalled receiving messages about what it means to be black from their parents or guardians; these messages were coded into five mutually exclusive content categories: individual…
Vinje, Kristine Hansen; Phan, Linh Thi Hong; Nguyen, Tuan Thanh; Henjum, Sigrun; Ribe, Lovise Omoijuanfo; Mathisen, Roger
2017-06-01
To review regulations and to perform a media audit of promotion of products under the scope of the International Code of Marketing of Breast-milk Substitutes ('the Code') in South-East Asia. We reviewed national regulations relating to the Code and 800 clips of editorial content, 387 advertisements and 217 Facebook posts from January 2015 to January 2016. We explored the ecological association between regulations and market size, and between the number of advertisements and market size and growth of milk formula. Cambodia, Indonesia, Myanmar, Thailand and Vietnam. Regulations on the child's age for inappropriate marketing of products are all below the Code's updated recommendation of 36 months (i.e. 12 months in Thailand and Indonesia; 24 months in the other three countries) and are voluntary in Thailand. Although the advertisements complied with the national regulations on the age limit, they had content (e.g. stages of milk formula; messages about the benefit; pictures of a child) that confused audiences. Market size and growth of milk formula were positively associated with the number of newborns and the number of advertisements, and were not affected by the current level of implementation of breast-milk substitute laws and regulations. The present media audit reveals inappropriate promotion and insufficient national regulation of products under the scope of the Code in South-East Asia. Strengthened implementation of regulations aligned with the Code's updated recommendation should be part of comprehensive strategies to minimize the harmful effects of advertisements of breast-milk substitutes on maternal and child nutrition and health.
User's manual for the FLORA equilibrium and stability code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freis, R.P.; Cohen, B.I.
1985-04-01
This document provides a user's guide to the content and use of the two-dimensional axisymmetric equilibrium and stability code FLORA. FLORA addresses the low-frequency MHD stability of long-thin axisymmetric tandem mirror systems with finite pressure and finite-Larmor-radius effects. FLORA solves an initial-value problem for interchange, rotational, and ballooning stability.
Square One TV: Coding of Segments.
ERIC Educational Resources Information Center
McNeal, Betsy; Singer, Karen
This report describes the system used to code each segment of Square One TV for content analysis of all four seasons of production. The analysis is intended to aid in the assessment of how well Square One is meeting its three goals: (1) to promote positive attitudes toward, and enthusiasm for, mathematics; (2) to encourage the use and application…
Codes, Costs, and Critiques: The Organization of Information in "Library Quarterly", 1931-2004
ERIC Educational Resources Information Center
Olson, Hope A.
2006-01-01
This article reports the results of a quantitative and thematic content analysis of the organization of information literature in the "Library Quarterly" ("LQ") between its inception in 1931 and 2004. The majority of articles in this category were published in the first half of "LQ's" run. Prominent themes have included cataloging codes and the…
Dual Coding Theory, Word Abstractness, and Emotion: A Critical Review of Kousta et al. (2011)
ERIC Educational Resources Information Center
Paivio, Allan
2013-01-01
Kousta, Vigliocco, Del Campo, Vinson, and Andrews (2011) questioned the adequacy of dual coding theory and the context availability model as explanations of representational and processing differences between concrete and abstract words. They proposed an alternative approach that focuses on the role of emotional content in the processing of…
ERIC Educational Resources Information Center
Subba Rao, G. M.; Vijayapushapm, T.; Venkaiah, K.; Pavarala, V.
2012-01-01
Objective: To assess quantity and quality of nutrition and food safety information in science textbooks prescribed by the Central Board of Secondary Education (CBSE), India for grades I through X. Design: Content analysis. Methods: A coding scheme was developed for quantitative and qualitative analyses. Two investigators independently coded the…
Development of battering ram vibrator system
NASA Astrophysics Data System (ADS)
Sun, F.; Chen, Z.; Lin, J.; Tong, X.
2012-12-01
This paper investigates a battering ram vibrator system in which the hydraulic system of the battering ram is controlled by electric machinery, allowing precise control of the ram. After analysing pseudorandom coding, the code symbols "0" and "1" are mapped to the rest and strike states of the battering ram, so that a pseudorandom code sequence matching the battering ram vibrator can be generated. In tests using the reference trace and single-shot records, the ratio of the seismic wavelet to the correlation interference was about 68 dB in the pseudorandom coding mode, compared with only 27.9 dB in the general mode. The battering ram vibrator system therefore reduces the correlation interference that arises from the single shaking frequency of the battering ram and improves the signal-to-noise ratio of the seismic data, which provides guidance for applying battering ram vibrators in metal mine exploration and high-resolution seismic exploration.
Sandia Simple Particle Tracking (Sandia SPT) v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anthony, Stephen M.
2015-06-15
Sandia SPT is designed as software to accompany a book chapter being published, a methods chapter that provides an introduction on how to label and track individual proteins. The Sandia Simple Particle Tracking code uses techniques common to the image processing community; its value is that it facilitates implementing the methods described in the book chapter by providing the necessary open-source code. The code performs single particle spot detection (or segmentation and localization) followed by tracking (or connecting the detected particles into trajectories). The book chapter, which along with the headers in each file constitutes the documentation for the code, is: Anthony, S.M.; Carroll-Portillo, A.; Timlon, J.A., Dynamics and Interactions of Individual Proteins in the Membrane of Living Cells. In Anup K. Singh (Ed.) Single Cell Protein Analysis Methods in Molecular Biology. Springer
Death of a dogma: eukaryotic mRNAs can code for more than one protein
Mouilleron, Hélène; Delcourt, Vivian; Roucou, Xavier
2016-01-01
mRNAs carry the genetic information that is translated by ribosomes. The traditional view of a mature eukaryotic mRNA is a molecule with three main regions, the 5′ UTR, the protein coding open reading frame (ORF) or coding sequence (CDS), and the 3′ UTR. This concept assumes that ribosomes translate one ORF only, generally the longest one, and produce one protein. As a result, in the early days of genomics and bioinformatics, one CDS was associated with each protein-coding gene. This fundamental concept of a single CDS is being challenged by increasing experimental evidence indicating that annotated proteins are not the only proteins translated from mRNAs. In particular, mass spectrometry (MS)-based proteomics and ribosome profiling have detected productive translation of alternative open reading frames. In several cases, the alternative and annotated proteins interact. Thus, the expression of two or more proteins translated from the same mRNA may offer a mechanism to ensure the co-expression of proteins which have functional interactions. Translational mechanisms already described in eukaryotic cells indicate that the cellular machinery is able to translate different CDSs from a single viral or cellular mRNA. In addition to summarizing data showing that the protein coding potential of eukaryotic mRNAs has been underestimated, this review aims to challenge the single translated CDS dogma. PMID:26578573
Porting plasma physics simulation codes to modern computing architectures using the
NASA Astrophysics Data System (ADS)
Germaschewski, Kai; Abbott, Stephen
2015-11-01
Available computing power has continued to grow exponentially even after single-core performance saturated in the last decade. The increase has since been driven by more parallelism, both using more cores and having more parallelism in each core, e.g. in GPUs and Intel Xeon Phi. Adapting existing plasma physics codes is challenging, in particular as there is no single programming model that covers current and future architectures. We will introduce the open-source
Feasibility of coded vibration in a vibro-ultrasound system for tissue elasticity measurement.
Zhao, Jinxin; Wang, Yuanyuan; Yu, Jinhua; Li, Tianjie; Zheng, Yong-Ping
2016-07-01
The ability of various methods for elasticity measurement and imaging is hampered by the vibration amplitude on biological tissues. Based on the inference that coded excitation will improve the performance of the cross-correlation function of the tissue displacement waves, the idea of exerting encoded external vibration on tested samples for measuring their elasticity is proposed. It was implemented by integrating a programmable vibration generation function into a customized vibro-ultrasound system to generate Barker coded vibration for elasticity measurement. Experiments were conducted on silicone phantoms and porcine muscles. The results showed that coded excitation of the vibration enhanced the accuracy and robustness of the elasticity measurement, especially in low signal-to-noise ratio scenarios. In the phantom study, the measured shear modulus values with coded vibration had an R^2 = 0.993 linear correlation to that of referenced indentation, while for the single-cycle pulse the R^2 decreased to 0.987. In the porcine muscle study, the coded vibration also obtained a shear modulus value that was more accurate than the single-cycle pulse by 0.16 kPa and 0.33 kPa at two different depths. These results demonstrate the feasibility and potential of coded vibration for enhancing the quality of elasticity measurement and imaging.
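The benefit the abstract attributes to coded excitation rests on the sharp autocorrelation of sequences such as the 13-bit Barker code. The sketch below is a generic illustration of that property, not the authors' vibro-ultrasound implementation: it compares the peak-to-sidelobe ratio of the Barker-13 autocorrelation with that of a single pulse.

```python
import numpy as np

# 13-bit Barker code and a single-cycle pulse for comparison.
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)
single = np.array([1.0])

def peak_to_sidelobe(code):
    """Ratio of the autocorrelation main lobe to its largest sidelobe."""
    ac = np.correlate(code, code, mode="full")
    sidelobes = np.delete(ac, ac.argmax())
    return ac.max() / np.abs(sidelobes).max() if sidelobes.size else np.inf

print(peak_to_sidelobe(barker13), peak_to_sidelobe(single))  # 13.0 vs. unbounded (no sidelobes)
```

A tall, narrow correlation peak is what makes the displacement estimate robust at low signal-to-noise ratios.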
2 CFR § 1.100 - Content of this title.
Code of Federal Regulations, 2012 CFR
2017-01-01
... 2 Grants and Agreements 1 2017-01-01 2017-01-01 false Content of this title. § 1.100 Section § 1.100 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1.100 Content of this title. This title contains— (a)...
2 CFR § 1.100 - Content of this title.
Code of Federal Regulations, 2013 CFR
2016-01-01
... 2 Grants and Agreements 1 2016-01-01 2016-01-01 false Content of this title. § 1.100 Section § 1.100 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1.100 Content of this title. This title contains— (a)...
2 CFR § 1.100 - Content of this title.
Code of Federal Regulations, 2010 CFR
2018-01-01
... 2 Grants and Agreements 1 2018-01-01 2018-01-01 false Content of this title. § 1.100 Section § 1.100 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1.100 Content of this title. This title contains— (a)...
2 CFR § 1.100 - Content of this title.
Code of Federal Regulations, 2011 CFR
2015-01-01
... 2 Grants and Agreements 1 2015-01-01 2015-01-01 false Content of this title. § 1.100 Section § 1.100 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1.100 Content of this title. This title contains— (a)...
Batshon, Hussam G; Djordjevic, Ivan; Schmidt, Ted
2010-09-13
We propose a subcarrier-multiplexed four-dimensional LDPC bit-interleaved coded modulation scheme that is capable of achieving beyond 480 Gb/s single-channel transmission rate over optical channels. Subcarrier-multiplexed four-dimensional LDPC coded modulation scheme outperforms the corresponding dual polarization schemes by up to 4.6 dB in OSNR at a BER of 10^(-8).
NASA Technical Reports Server (NTRS)
Lou, John; Ferraro, Robert; Farrara, John; Mechoso, Carlos
1996-01-01
An analysis is presented of several factors influencing the performance of a parallel implementation of the UCLA atmospheric general circulation model (AGCM) on massively parallel computer systems. Several modifications to the original parallel AGCM code aimed at improving its numerical efficiency, interprocessor communication cost, load balance and issues affecting single-node code performance are discussed.
Two-fluid 2.5D code for simulations of small scale magnetic fields in the lower solar atmosphere
NASA Astrophysics Data System (ADS)
Piantschitsch, Isabell; Amerstorfer, Ute; Thalmann, Julia Katharina; Hanslmeier, Arnold; Lemmerer, Birgit
2015-08-01
Our aim is to investigate magnetic reconnection as a result of the time evolution of magnetic flux tubes in the solar chromosphere. A new numerical two-fluid code was developed, which will perform a 2.5D simulation of the dynamics from the upper convection zone up to the transition region. The code is based on the Total Variation Diminishing Lax-Friedrichs method and includes the effects of ion-neutral collisions, ionisation/recombination, thermal/resistive diffusivity as well as collisional/resistive heating. What is innovative about our newly developed code is the inclusion of a two-fluid model in combination with the use of analytically constructed vertically open magnetic flux tubes, which are used as initial conditions for our simulation. First magnetohydrodynamic (MHD) tests have already shown good agreement with known results of numerical MHD test problems such as the Orszag-Tang vortex test, the Current Sheet test and the Spherical Blast Wave test. Furthermore, the single-fluid approach will also be applied to the initial conditions, in order to compare the different rates of magnetic reconnection in both codes, the two-fluid code and the single-fluid one.
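The Total Variation Diminishing Lax-Friedrichs method mentioned above builds on the classical Lax-Friedrichs update. As a generic illustration of that building block (not code from the two-fluid solver itself), the sketch below advances a 1D scalar conservation law one step at a time on a periodic grid.

```python
import numpy as np

def lax_friedrichs_step(u, flux, dt, dx):
    """One Lax-Friedrichs update for u_t + f(u)_x = 0 on a periodic 1D grid."""
    up = np.roll(u, -1)   # u_{i+1}
    um = np.roll(u, 1)    # u_{i-1}
    return 0.5 * (up + um) - 0.5 * (dt / dx) * (flux(up) - flux(um))

# Linear advection, f(u) = u, with a CFL number of 0.4.
x = np.linspace(0.0, 1.0, 200, endpoint=False)
dx = x[1] - x[0]
u = np.exp(-200.0 * (x - 0.5) ** 2)
for _ in range(100):
    u = lax_friedrichs_step(u, lambda v: v, dt=0.4 * dx, dx=dx)
```

TVD variants add flux limiting on top of this update to suppress spurious oscillations near steep gradients.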
Fault-tolerant conversion between adjacent Reed-Muller quantum codes based on gauge fixing
NASA Astrophysics Data System (ADS)
Quan, Dong-Xiao; Zhu, Li-Li; Pei, Chang-Xing; Sanders, Barry C.
2018-03-01
We design forward and backward fault-tolerant conversion circuits, which convert between the Steane code and the 15-qubit Reed-Muller quantum code so as to provide a universal transversal gate set. In our method, only seven out of a total 14 code stabilizers need to be measured, and we further enhance the circuit by simplifying some stabilizers; thus, we need only to measure eight weight-4 stabilizers for one round of forward conversion and seven weight-4 stabilizers for one round of backward conversion. For conversion, we treat random single-qubit errors and their influence on syndromes of gauge operators, and our novel single-step process enables more efficient fault-tolerant conversion between these two codes. We make our method quite general by showing how to convert between any two adjacent Reed-Muller quantum codes $\overline{\textsf{RM}}(1,m)$ and $\overline{\textsf{RM}}(1,m+1)$, for which we need only measure stabilizers whose number scales linearly with m, rather than exponentially with m as obtained in previous work. We provide the explicit mathematical expression for the necessary stabilizers and the concomitant resources required.
Cranwell, Jo; Britton, John; Bains, Manpreet
2017-02-01
The purpose of the present study is to describe the portrayal of alcohol content in popular YouTube music videos. We used inductive thematic analysis to explore the lyrics and visual imagery in 49 UK Top 40 songs and music videos previously found to contain alcohol content and watched by many British adolescents aged between 11 and 18 years and to examine if branded content contravened alcohol industry advertising codes of practice. The analysis generated three themes. First, alcohol content was associated with sexualised imagery or lyrics and the objectification of women. Second, alcohol was associated with image, lifestyle and sociability. Finally, some videos showed alcohol overtly encouraging excessive drinking and drunkenness, including those containing branding, with no negative consequences to the drinker. Our results suggest that YouTube music videos promote positive associations with alcohol use. Further, several alcohol companies adopt marketing strategies in the video medium that are entirely inconsistent with their own or others agreed advertising codes of practice. We conclude that, as a harm reduction measure, policies should change to prevent adolescent exposure to the positive promotion of alcohol and alcohol branding in music videos.
Document image retrieval through word shape coding.
Lu, Shijian; Li, Linlin; Tan, Chew Lim
2008-11-01
This paper presents a document retrieval technique that is capable of searching document images without OCR (optical character recognition). The proposed technique retrieves document images by a new word shape coding scheme, which captures the document content through annotating each word image by a word shape code. In particular, we annotate word images by using a set of topological shape features including character ascenders/descenders, character holes, and character water reservoirs. With the annotated word shape codes, document images can be retrieved by either query keywords or a query document image. Experimental results show that the proposed document image retrieval technique is fast, efficient, and tolerant to various types of document degradation.
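As a rough illustration of the word shape coding idea, the toy sketch below assigns each character a 3-bit code (ascender, descender, hole) and ranks documents by exact shape-code matches against a query word. It works on character strings rather than word images, and the feature tables and scoring are simplified assumptions, not the annotation scheme of the cited technique.

```python
# Assumed per-character shape features for illustration only.
ASCENDERS = set("bdfhklt")
DESCENDERS = set("gjpqy")
HOLES = set("abdegopq")

def word_shape_code(word):
    """Annotate each character with an (ascender, descender, hole) bit triple."""
    return tuple(
        (int(ch in ASCENDERS), int(ch in DESCENDERS), int(ch in HOLES))
        for ch in word.lower()
    )

def retrieve(query, documents):
    """Rank documents by how many words share the query word's shape code."""
    q = word_shape_code(query)
    scores = [(sum(word_shape_code(w) == q for w in words), doc_id)
              for doc_id, words in documents.items()]
    return sorted(scores, reverse=True)

docs = {"doc1": ["coding", "shape", "paper"], "doc2": ["image", "query", "code"]}
print(retrieve("paper", docs))   # doc1 scores 1, doc2 scores 0
```

A real system would extract these features from the word images themselves and tolerate partial matches, but the lookup-and-match structure is the same.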
Ortwein, Heiderose; Benz, Alexander; Carl, Petra; Huwendiek, Sören; Pander, Tanja; Kiessling, Claudia
2017-02-01
To investigate whether the Verona Coding Definitions of Emotional Sequences to code health providers' responses (VR-CoDES-P) can be used for assessment of medical students' responses to patients' cues and concerns provided in written case vignettes. Student responses in direct speech to patient cues and concerns were analysed in 21 different case scenarios using VR-CoDES-P. A total of 977 student responses were available for coding, and 857 responses were codable with the VR-CoDES-P. In 74.6% of responses, the students used either a "reducing space" statement only or a "providing space" statement immediately followed by a "reducing space" statement. Overall, the most frequent response was explicit information advice (ERIa) followed by content exploring (EPCEx) and content acknowledgement (EPCAc). VR-CoDES-P were applicable to written responses of medical students when they were phrased in direct speech. The application of VR-CoDES-P is reliable and feasible when using the differentiation of "providing" and "reducing space" responses. Communication strategies described by students in non-direct speech were difficult to code and produced many missing values. VR-CoDES-P are useful for analysis of medical students' written responses when focusing on emotional issues. Students need precise instructions for their response in the given test format. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Draft genome sequence of ramie, Boehmeria nivea (L.) Gaudich.
Luan, Ming-Bao; Jian, Jian-Bo; Chen, Ping; Chen, Jun-Hui; Chen, Jian-Hua; Gao, Qiang; Gao, Gang; Zhou, Ju-Hong; Chen, Kun-Mei; Guang, Xuan-Min; Chen, Ji-Kang; Zhang, Qian-Qian; Wang, Xiao-Fei; Fang, Long; Sun, Zhi-Min; Bai, Ming-Zhou; Fang, Xiao-Dong; Zhao, Shan-Cen; Xiong, He-Ping; Yu, Chun-Ming; Zhu, Ai-Guo
2018-05-01
Ramie, Boehmeria nivea (L.) Gaudich, family Urticaceae, is a plant native to eastern Asia, and one of the world's oldest fibre crops. It is also used as animal feed and for the phytoremediation of heavy metal-contaminated farmlands. Thus, the genome sequence of ramie was determined to explore the molecular basis of its fibre quality, protein content and phytoremediation. To further understand the ramie genome, different paired-end and mate-pair libraries were combined to generate 134.31 Gb of raw DNA sequences using the Illumina whole-genome shotgun sequencing approach. The highly heterozygous B. nivea genome was assembled using the Platanus Genome Assembler, which is an effective tool for the assembly of highly heterozygous genome sequences. The final length of the draft genome of this species was approximately 341.9 Mb (contig N50 = 22.62 kb, scaffold N50 = 1,126.36 kb). Based on the ramie genome annotations, 30,237 protein-coding genes were predicted, and the repetitive element content was 46.3%. The completeness of the final assembly was evaluated by benchmarking universal single-copy orthologous genes (BUSCO); 90.5% of the 1,440 expected embryophytic genes were identified as complete, and 4.9% were identified as fragmented. Phylogenetic analysis based on single-copy gene families and one-to-one orthologous genes placed ramie with mulberry and cannabis, within the clade of urticalean rosids. Genome information of ramie will be a valuable resource for the conservation of endangered Boehmeria species and for future studies on the biogeography and characteristic evolution of members of Urticaceae. © 2018 John Wiley & Sons Ltd.
Laxton, Adrian W; Neimat, Joseph S; Davis, Karen D; Womelsdorf, Thilo; Hutchison, William D; Dostrovsky, Jonathan O; Hamani, Clement; Mayberg, Helen S; Lozano, Andres M
2013-11-15
The subcallosal cingulate and adjacent ventromedial prefrontal cortex (collectively referred to here as the subcallosal cortex or SCC) have been identified as key brain areas in emotional processing. The SCC's role in affective valuation as well as severe mood and motivational disturbances, such as major depression, has been largely inferred from measures of neuronal population activity using functional neuroimaging. On the basis of imaging studies, it is unclear whether the SCC predominantly processes 1) negatively valenced affective content, 2) affective arousal, or 3) category-specific affective information. To clarify these putative functional roles of the SCC, we measured single neuron activity in the SCC of 15 human subjects undergoing deep brain stimulation for depression while they viewed emotionally evocative images grouped into categories that varied in emotional valence (pleasantness) and arousal. We found that the majority of responsive neurons were modulated by specific emotion categories, rather than by valence or arousal alone. Moreover, although these emotion-category-specific neurons responded to both positive and negative emotion categories, a significant majority were selective for negatively valenced emotional content. These findings reveal that single SCC neuron activity reflects the automatic valuational processing and implicit emotion categorization of visual stimuli. Furthermore, because of the predominance of neuronal signals in SCC conveying negative affective valuations and the increased activity in this region among depressed people, the effectiveness of depression therapies that alter SCC neuronal activity may relate to the down-regulation of a previously negative emotional processing bias. © 2013 Society of Biological Psychiatry.
Coupling between a multi-physics workflow engine and an optimization framework
NASA Astrophysics Data System (ADS)
Di Gallo, L.; Reux, C.; Imbeaux, F.; Artaud, J.-F.; Owsiak, M.; Saoutic, B.; Aiello, G.; Bernardi, P.; Ciraolo, G.; Bucalossi, J.; Duchateau, J.-L.; Fausser, C.; Galassi, D.; Hertout, P.; Jaboulay, J.-C.; Li-Puma, A.; Zani, L.
2016-03-01
A generic coupling method between a multi-physics workflow engine and an optimization framework is presented in this paper. The coupling architecture has been developed in order to preserve the integrity of the two frameworks. The objective is to provide the possibility to replace a framework, a workflow or an optimizer by another one without changing the whole coupling procedure or modifying the main content in each framework. The coupling is achieved by using a socket-based communication library for exchanging data between the two frameworks. Among a number of algorithms provided by optimization frameworks, Genetic Algorithms (GAs) have demonstrated their efficiency on single- and multiple-criteria optimization. In addition to their robustness, GAs can handle non-valid data which may appear during the optimization. Consequently, GAs work in the most general cases. A parallelized framework has been developed to reduce the time spent on optimizations and the evaluation of large samples. A test has shown a good scaling efficiency of this parallelized framework. This coupling method has been applied to the case of SYCOMORE (SYstem COde for MOdeling tokamak REactor), which is a system code developed in the form of a modular workflow for designing magnetic fusion reactors. The coupling of SYCOMORE with the optimization platform URANIE enables design optimization along various figures of merit and constraints.
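A minimal sketch of the socket-based coupling idea follows, written from the optimizer side of the link. The host, port and JSON message layout are assumptions for illustration; this is not the actual SYCOMORE/URANIE interface. Each call ships one candidate design to the workflow engine and reads back its figures of merit.

```python
import json
import socket

HOST, PORT = "localhost", 5555  # assumed endpoint where the workflow engine listens

def evaluate_remotely(candidate):
    """Send one candidate design (a dict of parameters) over a socket and
    return the workflow engine's reply, e.g. {"objective": ..., "valid": ...}."""
    with socket.create_connection((HOST, PORT)) as sock:
        sock.sendall((json.dumps({"parameters": candidate}) + "\n").encode())
        reply = sock.makefile().readline()
    return json.loads(reply)

# A genetic algorithm would call evaluate_remotely() once per individual in each
# generation, discarding replies flagged as non-valid.
```

Keeping the exchange behind a small message protocol is what lets either side, the workflow engine or the optimizer, be swapped out without touching the other.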
Superdense Coding over Optical Fiber Links with Complete Bell-State Measurements
NASA Astrophysics Data System (ADS)
Williams, Brian P.; Sadlier, Ronald J.; Humble, Travis S.
2017-02-01
Adopting quantum communication to modern networking requires transmitting quantum information through a fiber-based infrastructure. We report the first demonstration of superdense coding over optical fiber links, taking advantage of a complete Bell-state measurement enabled by time-polarization hyperentanglement, linear optics, and common single-photon detectors. We demonstrate the highest single-qubit channel capacity to date utilizing linear optics, 1.665 ± 0.018, and we provide a full experimental implementation of a hybrid, quantum-classical communication protocol for image transfer.
Social media use by community-based organizations conducting health promotion: a content analysis
2013-01-01
Background Community-based organizations (CBOs) are critical channels for the delivery of health promotion programs. Much of their influence comes from the relationships they have with community members and other key stakeholders and they may be able to harness the power of social media tools to develop and maintain these relationships. There are limited data describing if and how CBOs are using social media. This study assesses the extent to which CBOs engaged in health promotion use popular social media channels, the types of content typically shared, and the extent to which the interactive aspects of social media tools are utilized. Methods We assessed the social media presence and patterns of usage of CBOs engaged in health promotion in Boston, Lawrence, and Worcester, Massachusetts. We coded content on three popular channels: Facebook, Twitter, and YouTube. We used content analysis techniques to quantitatively summarize posts, tweets, and videos on these channels, respectively. For each organization, we coded all content put forth by the CBO on the three channels in a 30-day window. Two coders were trained and conducted the coding. Data were collected between November 2011 and January 2012. Results A total of 166 organizations were included in our census. We found that 42% of organizations used at least one of the channels of interest. Across the three channels, organization promotion was the most common theme for content (66% of posts, 63% of tweets, and 93% of videos included this content). Most organizations updated Facebook and Twitter content at rates close to recommended frequencies. We found limited interaction/engagement with audience members. Conclusions Much of the use of social media tools appeared to be uni-directional, a flow of information from the organization to the audience. By better leveraging opportunities for interaction and user engagement, these organizations can reap greater benefits from the non-trivial investment required to use social media well. Future research should assess links between use patterns and organizational characteristics, staff perspectives, and audience engagement. PMID:24313999
The locus of evolution: evo devo and the genetics of adaptation.
Hoekstra, Hopi E; Coyne, Jerry A
2007-05-01
An important tenet of evolutionary developmental biology ("evo devo") is that adaptive mutations affecting morphology are more likely to occur in the cis-regulatory regions than in the protein-coding regions of genes. This argument rests on two claims: (1) the modular nature of cis-regulatory elements largely frees them from deleterious pleiotropic effects, and (2) a growing body of empirical evidence appears to support the predominant role of gene regulatory change in adaptation, especially morphological adaptation. Here we discuss and critique these assertions. We first show that there is no theoretical or empirical basis for the evo devo contention that adaptations involving morphology evolve by genetic mechanisms different from those involving physiology and other traits. In addition, some forms of protein evolution can avoid the negative consequences of pleiotropy, most notably via gene duplication. In light of evo devo claims, we then examine the substantial data on the genetic basis of adaptation from both genome-wide surveys and single-locus studies. Genomic studies lend little support to the cis-regulatory theory: many of these have detected adaptation in protein-coding regions, including transcription factors, whereas few have examined regulatory regions. Turning to single-locus studies, we note that the most widely cited examples of adaptive cis-regulatory mutations focus on trait loss rather than gain, and none have yet pinpointed an evolved regulatory site. In contrast, there are many studies that have both identified structural mutations and functionally verified their contribution to adaptation and speciation. Neither the theoretical arguments nor the data from nature, then, support the claim for a predominance of cis-regulatory mutations in evolution. Although this claim may be true, it is at best premature. Adaptation and speciation probably proceed through a combination of cis-regulatory and structural mutations, with a substantial contribution of the latter.
Coordinated within-trial dynamics of low-frequency neural rhythms controls evidence accumulation.
Werkle-Bergner, Markus; Grandy, Thomas H; Chicherio, Christian; Schmiedek, Florian; Lövdén, Martin; Lindenberger, Ulman
2014-06-18
Higher cognitive functions, such as human perceptual decision making, require information processing and transmission across wide-spread cortical networks. Temporally synchronized neural firing patterns are advantageous for efficiently representing and transmitting information within and between assemblies. Computational, empirical, and conceptual considerations all lead to the expectation that the informational redundancy of neural firing rates is positively related to their synchronization. Recent theorizing and initial evidence also suggest that the coding of stimulus characteristics and their integration with behavioral goal states require neural interactions across a hierarchy of timescales. However, most studies thus have focused on neural activity in a single frequency range or on a restricted set of brain regions. Here we provide evidence for cooperative spatiotemporal dynamics of slow and fast EEG signals during perceptual decision making at the single-trial level. Participants performed three masked two-choice decision tasks, one each with numerical, verbal, or figural content. Decrements in posterior α power (8-14 Hz) were paralleled by increments in high-frequency (>30 Hz) signal entropy in trials demanding active sensory processing. Simultaneously, frontocentral θ power (4-7 Hz) increased, indicating evidence integration. The coordinated α/θ dynamics were tightly linked to decision speed and remarkably similar across tasks, suggesting a domain-general mechanism. In sum, we demonstrate an inverse association between decision-related changes in widespread low-frequency power and local high-frequency entropy. The cooperation among mechanisms captured by these changes enhances the informational density of neural response patterns and qualifies as a neural coding system in the service of perceptual decision making. Copyright © 2014 the authors 0270-6474/14/348519-10$15.00/0.
Green, Nancy
2005-04-01
We developed a Bayesian network coding scheme for annotating biomedical content in layperson-oriented clinical genetics documents. The coding scheme supports the representation of probabilistic and causal relationships among concepts in this domain, at a high enough level of abstraction to capture commonalities among genetic processes and their relationship to health. We are using the coding scheme to annotate a corpus of genetic counseling patient letters as part of the requirements analysis and knowledge acquisition phase of a natural language generation project. This paper describes the coding scheme and presents an evaluation of intercoder reliability for its tag set. In addition to giving examples of use of the coding scheme for analysis of discourse and linguistic features in this genre, we suggest other uses for it in analysis of layperson-oriented text and dialogue in medical communication.
High-Speed Digital Interferometry
NASA Technical Reports Server (NTRS)
De Vine, Glenn; Shaddock, Daniel A.; Ware, Brent; Spero, Robert E.; Wuchenich, Danielle M.; Klipstein, William M.; McKenzie, Kirk
2012-01-01
Digitally enhanced heterodyne interferometry (DI) is a laser metrology technique employing pseudo-random noise (PRN) codes phase-modulated onto an optical carrier. Combined with heterodyne interferometry, the PRN code is used to select individual signals, returning the inherent interferometric sensitivity determined by the optical wavelength. The signal isolation arises from the autocorrelation properties of the PRN code, enabling both rejection of spurious signals (e.g., from scattered light) and multiplexing capability using a single metrology system. The minimum separation of optical components is determined by the wavelength of the PRN code.
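The signal selection described above follows from the near-delta autocorrelation of the PRN code. The sketch below is a generic illustration (not the flight metrology code): it builds a composite signal from two delayed copies of a random ±1 code and recovers the dominant delay by correlation, which is how one return is isolated from scattered-light contributions.

```python
import numpy as np

rng = np.random.default_rng(0)
prn = rng.choice([-1.0, 1.0], size=1024)          # pseudo-random +/-1 code

# Composite: target return at delay 37 plus a weaker scattered-light copy at delay 205.
composite = 1.0 * np.roll(prn, 37) + 0.3 * np.roll(prn, 205)

# Correlating against delayed replicas of the code isolates each contribution.
corr = np.array([np.dot(composite, np.roll(prn, d)) for d in range(prn.size)]) / prn.size
print(corr.argmax(), round(corr.max(), 2))        # 37 and roughly 1.0
```

Because the cross-correlation at the wrong delay averages toward zero, the unwanted return is suppressed rather than aliased into the measurement.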
Towards Realistic Implementations of a Majorana Surface Code.
Landau, L A; Plugge, S; Sela, E; Altland, A; Albrecht, S M; Egger, R
2016-02-05
Surface codes have emerged as promising candidates for quantum information processing. Building on the previous idea to realize the physical qubits of such systems in terms of Majorana bound states supported by topological semiconductor nanowires, we show that the basic code operations, namely projective stabilizer measurements and qubit manipulations, can be implemented by conventional tunnel conductance probes and charge pumping via single-electron transistors, respectively. The simplicity of the access scheme suggests that a functional code might be in close experimental reach.
Shital Kiran, D P; Bargale, Seema; Pandya, Parth; Bhatt, Kuntal; Barad, Nirav; Shah, Nilay; Venkataraghavan, Karthik; Ramesh, K
2015-08-01
The aim of this study was to evaluate the reliability of websites on the thumb sucking habit using the DISCERN instrument and the Health on the Net (HON) seal code at a single moment in time. An Internet search engine (www.google.com) was used to identify websites comprising information on "thumb sucking habit." Of over 204,000 links for the thumb sucking habit, the first 100 were analyzed in detail. After excluding discussion groups, news and video feeds, and removing carbon-copy sites, only 36 relevant websites remained, which were then assessed using the DISCERN instrument and the HON seal code. Using the 16 questions of DISCERN for assessing the reliability and quality of the consumer information, which were scored from 1 to 5, an appropriate index of the quality of the information was generated. All the assessed websites were also checked for the presence or absence of the HON seal code. The maximum score attainable for an outstanding website is 80. Of the 36 websites that were scored, the highest score obtained by one of the websites according to the DISCERN tool was 55 of 80, and the lowest score achieved was 16 of 80. The websites achieving the maximum and minimum scores were children.webmd.com and thebehaviorsolution.com, respectively. The HON seal was displayed on only three websites: medicinenet.com, righthealth.com, and children.webmd.com. By directing patients to validated websites on the thumb sucking habit, clinicians can ensure patients find appropriate information.
Chery, Joyce G; Sass, Chodon; Specht, Chelsea D
2017-09-01
We developed a bioinformatic pipeline that leverages a publicly available genome and published transcriptomes to design primers in conserved coding sequences flanking targeted introns of single-copy nuclear loci. Paullinieae (Sapindaceae) is used to demonstrate the pipeline. Transcriptome reads phylogenetically closer to the lineage of interest are aligned to the closest genome. Single-nucleotide polymorphisms are called, generating a "pseudoreference" closer to the lineage of interest. Several filters are applied to meet the criteria of single-copy nuclear loci with introns of a desired size. Primers are designed in conserved coding sequences flanking introns. Using this pipeline, we developed nine single-copy nuclear intron markers for Paullinieae. This pipeline is highly flexible and can be used for any group with available genomic and transcriptomic resources. This pipeline led to the development of nine variable markers for phylogenetic study without generating sequence data de novo.
Codon usage patterns in Nematoda: analysis based on over 25 million codons in thirty-two species
2006-01-01
Background Codon usage has direct utility in molecular characterization of species and is also a marker for molecular evolution. To understand codon usage within the diverse phylum Nematoda, we analyzed a total of 265,494 expressed sequence tags (ESTs) from 30 nematode species. The full genomes of Caenorhabditis elegans and C. briggsae were also examined. A total of 25,871,325 codons were analyzed and a comprehensive codon usage table for all species was generated. This is the first codon usage table available for 24 of these organisms. Results Codon usage similarity in Nematoda usually persists over the breadth of a genus but then rapidly diminishes even within each clade. Globodera, Meloidogyne, Pristionchus, and Strongyloides have the most highly derived patterns of codon usage. The major factor affecting differences in codon usage between species is the coding sequence GC content, which varies in nematodes from 32% to 51%. Coding GC content (measured as GC3) also explains much of the observed variation in the effective number of codons (R = 0.70), which is a measure of codon bias, and it even accounts for differences in amino acid frequency. Codon usage is also affected by neighboring nucleotides (N1 context). Coding GC content correlates strongly with estimated noncoding genomic GC content (R = 0.92). On examining abundant clusters in five species, candidate optimal codons were identified that may be preferred in highly expressed transcripts. Conclusion Evolutionary models indicate that total genomic GC content, probably the product of directional mutation pressure, drives codon usage rather than the converse, a conclusion that is supported by examination of nematode genomes. PMID:26271136
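Two of the quantities analyzed above, the codon usage table and the coding GC content at third codon positions (GC3), can be computed directly from a coding sequence. The sketch below is a minimal illustration with an assumed toy sequence, not the pipeline used in the study.

```python
from collections import Counter

def codon_usage_and_gc3(cds):
    """Count codons in a coding sequence and compute GC3, the G+C fraction
    at third codon positions."""
    cds = cds.upper()
    codons = [cds[i:i + 3] for i in range(0, len(cds) - len(cds) % 3, 3)]
    usage = Counter(codons)
    third_positions = [codon[2] for codon in codons]
    gc3 = sum(base in "GC" for base in third_positions) / len(third_positions)
    return usage, gc3

usage, gc3 = codon_usage_and_gc3("ATGGCCAAGGGCTTTTAA")
print(usage.most_common(3), round(gc3, 2))   # codon counts and GC3 = 0.67
```

Aggregating such per-gene tables over many ESTs, and relating GC3 to measures such as the effective number of codons, is the comparative step performed in the study.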
Extension of the XGC code for global gyrokinetic simulations in stellarator geometry
NASA Astrophysics Data System (ADS)
Cole, Michael; Moritaka, Toseo; White, Roscoe; Hager, Robert; Ku, Seung-Hoe; Chang, Choong-Seock
2017-10-01
In this work, the total-f, gyrokinetic particle-in-cell code XGC is extended to treat stellarator geometries. Improvements to meshing tools and the code itself have enabled the first physics studies, including single particle tracing and flux surface mapping in the magnetic geometry of the heliotron LHD and quasi-isodynamic stellarator Wendelstein 7-X. These have provided the first successful test cases for our approach. XGC is uniquely placed to model the complex edge physics of stellarators. A roadmap to such a global confinement modeling capability will be presented. Single particle studies will include the physics of energetic particles' global stochastic motions and their effect on confinement. Good confinement of energetic particles is vital for a successful stellarator reactor design. These results can be compared in the core region with those of other codes, such as ORBIT3d. In subsequent work, neoclassical transport and turbulence can then be considered and compared to results from codes such as EUTERPE and GENE. After sufficient verification in the core region, XGC will move into the stellarator edge region including the material wall and neutral particle recycling.
A User's Guide to the Zwikker-Kosten Transmission Line Code (ZKTL)
NASA Technical Reports Server (NTRS)
Kelly, J. J.; Abu-Khajeel, H.
1997-01-01
This user's guide documents updates to the Zwikker-Kosten Transmission Line Code (ZKTL). This code was developed for analyzing new liner concepts developed to provide increased sound absorption. Contiguous arrays of multi-degree-of-freedom (MDOF) liner elements serve as the model for these liner configurations, and Zwikker and Kosten's theory of sound propagation in channels is used to predict the surface impedance. Transmission matrices for the various liner elements incorporate both analytical and semi-empirical methods. This allows standard matrix techniques to be employed in the code to systematically calculate the composite impedance due to the individual liner elements. The ZKTL code consists of four independent subroutines: 1. Single channel impedance calculation - linear version (SCIC) 2. Single channel impedance calculation - nonlinear version (SCICNL) 3. Multi-channel, multi-segment, multi-layer impedance calculation - linear version (MCMSML) 4. Multi-channel, multi-segment, multi-layer impedance calculation - nonlinear version (MCMSMLNL) Detailed examples, comments, and explanations for each liner impedance computation module are included. Also contained in the guide are depictions of the interactive execution, input files and output files.
A novel approach of an absolute coding pattern based on Hamiltonian graph
NASA Astrophysics Data System (ADS)
Wang, Ya'nan; Wang, Huawei; Hao, Fusheng; Liu, Liqiang
2017-02-01
In this paper, a novel approach to an optical absolute rotary encoder coding pattern is presented. The concept is based on the principle of the absolute encoder: finding a unique sequence that ensures an unambiguous shaft position at any angle. We design a single-ring and an n-by-2 matrix absolute encoder coding pattern by using variations of the Hamiltonian graph principle. Twelve encoding bits are used in the single ring, read by a linear-array CCD, to achieve a 1080-position cyclic encoding. In addition, a 2-by-2 matrix is used as a unit in the 2-track disk to achieve a 16-bit encoding pattern using an area-array CCD sensor (as a sample). Finally, a higher resolution can be gained by electronic subdivision of the signals. Compared with the conventional Gray or binary code pattern (for a 2n resolution), this new pattern has a higher resolution (2n*n) with fewer coding tracks, which means the new pattern can lead to a smaller encoder, which is essential in industrial production.
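The unambiguous-position property that the Hamiltonian-graph construction targets amounts to requiring that every fixed-length window read around the ring be distinct. The sketch below checks that property on a small de Bruijn-style binary ring; the example ring and 4-bit window are illustrative assumptions, not the 12-bit pattern described in the paper.

```python
def windows_are_unique(track, n):
    """Check that every length-n window read around a cyclic code track is
    distinct, so a single reading gives an unambiguous shaft position."""
    extended = track + track[:n - 1]                # wrap around the ring
    windows = [extended[i:i + n] for i in range(len(track))]
    return len(windows) == len(set(windows))

# A 16-position binary ring read through a 4-bit window.
ring = "0000100110101111"
print(windows_are_unique(ring, 4))   # True: all 16 windows differ
```

The same check, applied to longer codes and to the n-by-2 matrix layout, is what guarantees that a single CCD reading resolves the absolute angle.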
A forward error correction technique using a high-speed, high-rate single chip codec
NASA Astrophysics Data System (ADS)
Boyd, R. W.; Hartman, W. F.; Jones, Robert E.
The authors describe an error-correction coding approach that allows operation in either burst or continuous modes at data rates of multiple hundreds of megabits per second. Bandspreading is low since the code rate is 7/8 or greater, which is consistent with high-rate link operation. The encoder, along with a hard-decision decoder, fits on a single application-specific integrated circuit (ASIC) chip. Soft-decision decoding is possible utilizing applique hardware in conjunction with the hard-decision decoder. Expected coding gain is a function of the application and is approximately 2.5 dB for hard-decision decoding at 10-5 bit-error rate with phase-shift-keying modulation and additive Gaussian white noise interference. The principal use envisioned for this technique is to achieve a modest amount of coding gain on high-data-rate, bandwidth-constrained channels. Data rates of up to 300 Mb/s can be accommodated by the codec chip. The major objective is burst-mode communications, where code words are composed of 32 n data bits followed by 32 overhead bits.
Huang, Ya-Yi; Matzke, Antonius J. M.; Matzke, Marjori
2013-01-01
Coconut, a member of the palm family (Arecaceae), is one of the most economically important trees used by mankind. Despite its diverse morphology, coconut is recognized taxonomically as only a single species (Cocos nucifera L.). There are two major coconut varieties, tall and dwarf, the latter of which displays traits resulting from selection by humans. We report here the complete chloroplast (cp) genome of a dwarf coconut plant, and describe the gene content and organization, inverted repeat fluctuations, repeated sequence structure, and occurrence of RNA editing. Phylogenetic relationships of monocots were inferred based on 47 chloroplast protein-coding genes. Potential nodes for events of gene duplication and pseudogenization related to inverted repeat fluctuation were mapped onto the tree using parsimony criteria. We compare our findings with those from other palm species for which complete cp genome sequences are available. PMID:24023703
Hydrodynamic approach to the centrality dependence of di-hadron correlations
NASA Astrophysics Data System (ADS)
Castilho, Wagner M.; Qian, Wei-Liang; Gardim, Fernando G.; Hama, Yogiro; Kodama, Takeshi
2017-06-01
Measurements of di-hadron azimuthal correlations at different centralities for Au+Au collisions at 200 A GeV were reported by the PHENIX Collaboration. The data were presented for different ranges of transverse momentum. In particular, it was observed that the away-side correlation evolves from a double- to a single-peak structure when the centrality decreases. In this work, we show that these features naturally appear due to an interplay between the centrality-dependent smooth background elliptic flow and the one produced by event-by-event fluctuating peripheral tubes. To compare with the PHENIX data, we also carry out numerical simulations by using a hydrodynamical code nexspherio, and calculate the correlations by both cumulant and the ZYAM method employed by the PHENIX Collaboration. Our results are in reasonable agreement with the data. A brief discussion on the physical content of the present model and its difference from other viewpoints is also presented.
ERIC Educational Resources Information Center
Horwitz, Amy Beth
2010-01-01
The purpose of this investigation is to describe the outcomes of a multi-state study of written discipline policies in a high school setting. This study examines discipline codes of conduct and analyzes the content for behaviors ranging in severity (mild, moderate, and severe) while specifically examining the use of suspension as a punitive…
"You Can Speak German, Sir": On the Complexity of Teachers' L1 Use in CLIL
ERIC Educational Resources Information Center
Gierlinger, Erwin
2015-01-01
Classroom code switching in foreign language teaching is still a controversial issue whose status as a tool of both despair and desire continues to be hotly debated. As the teaching of content and language integrated learning (CLIL) is, by definition, concerned with the learning of a foreign language, one would expect the value of code switching…
ERIC Educational Resources Information Center
Martin, Peter W.; Espiritu, Clemencia C
1996-01-01
Examines how the teacher incorporates elements of both "Bahasa Melayu" and Brunei Malay into content lessons and views code switching in the primary classroom within the wider framework of community language norms and the linguistic pressures on students and teachers. Espiritu shares Martin's concern regarding the quantity and quality of…
2 CFR 1.105 - Organization and subtitle content.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 2 Grants and Agreements 1 2013-01-01 2013-01-01 false Organization and subtitle content. 1.105 Section 1.105 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1...
2 CFR 1.105 - Organization and subtitle content.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 2 Grants and Agreements 1 2014-01-01 2014-01-01 false Organization and subtitle content. 1.105 Section 1.105 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1...
2 CFR 1.105 - Organization and subtitle content.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 2 Grants and Agreements 1 2012-01-01 2012-01-01 false Organization and subtitle content. 1.105 Section 1.105 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1...
A Content Analysis of 10 Years of Clinical Supervision Articles in Counseling
ERIC Educational Resources Information Center
Bernard, Janine M.; Luke, Melissa
2015-01-01
This content analysis follows Borders's (2005) review of counseling supervision literature and includes 184 counselor supervision articles published over the past 10 years. Articles were coded as representing 1 of 3 research types or 1 of 3 conceptual types. Articles were then analyzed for main topics producing 11 topic categories.
Discovering Genres of Online Discussion Threads via Text Mining
ERIC Educational Resources Information Center
Lin, Fu-Ren; Hsieh, Lu-Shih; Chuang, Fu-Tai
2009-01-01
As course management systems (CMS) gain popularity in facilitating teaching, a forum is a key component to facilitate the interactions among students and teachers. Content analysis is the most popular way to study a discussion forum, but content analysis is a human labor-intensive process; for example, the coding process relies heavily on manual…
78 FR 69793 - Voluntary Remedial Actions and Guidelines for Voluntary Recall Notices
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-21
... (Commission, CPSC, or we) proposes an interpretive rule to set forth principles and guidelines for the content... setting forth the Commission's principles and guidelines regarding the content of voluntary recall notices..., ``Principles and Guidelines for Voluntary Recall Notices,'' in part 1115 of title 16 of the Code of Federal...
Looking Back and Moving Ahead: A Content Analysis of Two Teacher Education Journals
ERIC Educational Resources Information Center
Rock, Marcia L.; Cheek, Aftynne E.; Sullivan, Melissa E.; Jones, Jennie L.; Holden, Kara B.; Kang, Jeongae
2016-01-01
We conducted a content analysis to examine trends in articles published between 1996 and 2014 in two journals--"Teacher Education and Special Education" ("TESE") and the "Journal of Teacher Education" ("JTE"). Across both journals, we coded 1,062 articles categorically based on multiple attributes (e.g.,…
Do Special Education Interventions Improve Learning of Secondary Content? A Meta-Analysis
ERIC Educational Resources Information Center
Scruggs, Thomas E.; Mastropieri, Margo A.; Berkeley, Sheri; Graetz, Janet E.
2010-01-01
The authors describe findings from a research synthesis on content area instruction for students with disabilities. Seventy studies were identified from a comprehensive literature search, examined, and coded for a number of variables, including weighted standardized mean-difference effect sizes. More than 2,400 students were participants in these…
Comprehending News Videotexts: The Influence of the Visual Content
ERIC Educational Resources Information Center
Cross, Jeremy
2011-01-01
Informed by dual coding theory, this study explores the role of the visual content in L2 listeners' comprehension of news videotexts. L1 research into the visual characteristics and comprehension of news videotexts is outlined, subsequently informing the quantitative analysis of audiovisual correspondence in the news videotexts used. In each of…
Princess Picture Books: Content and Messages
ERIC Educational Resources Information Center
Dale, Lourdes P.; Higgins, Brittany E.; Pinkerton, Nick; Couto, Michelle; Mansolillo, Victoria; Weisinger, Nica; Flores, Marci
2016-01-01
Because many girls develop their understanding of what it means to be a girl from books about princesses, the researchers coded the messages and content in 58 princess books (picture, fairy tales, and fractured fairy tales). Results indicate that gender stereotypes are present in the books--the princesses were more likely to be nurturing, in…
NASA Technical Reports Server (NTRS)
Capo, M. A.; Disney, R. K.
1971-01-01
The work performed in the following areas is summarized: (1) A realistic nuclear-propelled vehicle was analyzed using the Marshall Space Flight Center computer code package. This code package includes one- and two-dimensional discrete ordinate transport, point kernel, and single scatter techniques, as well as cross section preparation and data processing codes. (2) Techniques were developed to improve the automated data transfer in the coupled computation method of the computer code package and to improve the utilization of this code package on the Univac-1108 computer system. (3) The MSFC master data libraries were updated.
Improved inter-layer prediction for light field content coding with display scalability
NASA Astrophysics Data System (ADS)
Conti, Caroline; Ducla Soares, Luís; Nunes, Paulo
2016-09-01
Light field imaging based on microlens arrays - also known as plenoptic, holoscopic and integral imaging - has recently emerged as a feasible and promising technology due to its ability to support functionalities not straightforwardly available in conventional imaging systems, such as post-production refocusing and depth-of-field adjustment. However, to gradually reach the consumer market and to provide interoperability with current 2D and 3D representations, a display scalable coding solution is essential. In this context, this paper proposes an improved display scalable light field codec comprising a three-layer hierarchical coding architecture (previously proposed by the authors) that provides interoperability with 2D (Base Layer) and 3D stereo and multiview (First Layer) representations, while the Second Layer supports the complete light field content. To further improve the compression performance, novel exemplar-based inter-layer coding tools are proposed here for the Second Layer, namely: (i) an inter-layer reference picture construction relying on an exemplar-based optimization algorithm for texture synthesis, and (ii) a direct prediction mode based on exemplar texture samples from lower layers. Experimental results show that the proposed solution performs better than the tested benchmark solutions, including the authors' previous scalable codec.
Kang, Jong-Soo; Lee, Byoung Yoon; Kwak, Myounghai
2017-01-01
The complete chloroplast genomes of Lychnis wilfordii and Silene capitata were determined and compared with ten previously reported Caryophyllaceae chloroplast genomes. The chloroplast genome sequences of L. wilfordii and S. capitata contain 152,320 bp and 150,224 bp, respectively. The gene contents and orders among 12 Caryophyllaceae species are consistent, but several microstructural changes have occurred. Expansion of the inverted repeat (IR) regions at the large single copy (LSC)/IRb and small single copy (SSC)/IR boundaries led to partial or entire gene duplications. Additionally, rearrangements of the LSC region were caused by gene inversions and/or transpositions. The 18 kb inversions, which occurred three times in different lineages of tribe Sileneae, were thought to be facilitated by the intermolecular duplicated sequences. Sequence analyses of the L. wilfordii and S. capitata genomes revealed 39 and 43 repeats, respectively, including forward, palindromic, and reverse repeats. In addition, a total of 67 and 56 simple sequence repeats were discovered in the L. wilfordii and S. capitata chloroplast genomes, respectively. Finally, we constructed phylogenetic trees of the 12 Caryophyllaceae species and two Amaranthaceae species based on 73 protein-coding genes using both maximum parsimony and likelihood methods.
Manganello, Jennifer A; Henderson, Vani R; Jordan, Amy; Trentacoste, Nicole; Martin, Suzanne; Hennessy, Michael; Fishbein, Martin
2010-07-01
Many studies of sexual messages in media utilize content analysis methods. At times, this research assumes that researchers and trained coders using content analysis methods and the intended audience view and interpret media content similarly. This article compares adolescents' perceptions of the presence or absence of sexual content on television to those of researchers using three different coding schemes. Results from this formative research study suggest that participants and researchers are most likely to agree with content categories assessing manifest content, and that differences exist among adolescents who view sexual messages on television. Researchers using content analysis methods to examine sexual content in media and media effects on sexual behavior should consider identifying how audience characteristics may affect interpretation of content and account for audience perspectives in content analysis study protocols when appropriate for study goals.
Guide to AERO2S and WINGDES Computer Codes for Prediction and Minimization of Drag Due to Lift
NASA Technical Reports Server (NTRS)
Carlson, Harry W.; Chu, Julio; Ozoroski, Lori P.; McCullers, L. Arnold
1997-01-01
The computer codes AERO2S and WINGDES are now widely used for the analysis and design of airplane lifting surfaces under conditions that tend to induce flow separation. These codes have undergone continued development to provide additional capabilities since the introduction of the original versions over a decade ago. This code development has been reported in a variety of publications (NASA technical papers, NASA contractor reports, and society journals); some modifications have not been publicized at all. Users of these codes have suggested the desirability of combining in a single document the descriptions of the code development, an outline of the features of each code, and suggestions for effective code usage. This report is intended to meet that need.
Tailored Codes for Small Quantum Memories
NASA Astrophysics Data System (ADS)
Robertson, Alan; Granade, Christopher; Bartlett, Stephen D.; Flammia, Steven T.
2017-12-01
We demonstrate that small quantum memories, realized via quantum error correction in multiqubit devices, can benefit substantially by choosing a quantum code that is tailored to the relevant error model of the system. For a biased noise model, with independent bit and phase flips occurring at different rates, we show that a single code greatly outperforms the well-studied Steane code across the full range of parameters of the noise model, including for unbiased noise. In fact, this tailored code performs almost optimally when compared with 10 000 randomly selected stabilizer codes of comparable experimental complexity. Tailored codes can even outperform the Steane code with realistic experimental noise, and without any increase in the experimental complexity, as we demonstrate by comparison with the observed error model of a recent seven-qubit trapped-ion experiment.
Optimized scalar promotion with load and splat SIMD instructions
Eichenberger, Alexander E; Gschwind, Michael K; Gunnels, John A
2013-10-29
Mechanisms for optimizing scalar code executed on a single instruction multiple data (SIMD) engine are provided. Placement of vector operation-splat operations may be determined based on an identification of scalar and SIMD operations in an original code representation. The original code representation may be modified to insert the vector operation-splat operations based on the determined placement of vector operation-splat operations to generate a first modified code representation. Placement of separate splat operations may be determined based on identification of scalar and SIMD operations in the first modified code representation. The first modified code representation may be modified to insert or delete separate splat operations based on the determined placement of the separate splat operations to generate a second modified code representation. SIMD code may be output based on the second modified code representation for execution by the SIMD engine.
Optimized scalar promotion with load and splat SIMD instructions
Eichenberger, Alexandre E [Chappaqua, NY; Gschwind, Michael K [Chappaqua, NY; Gunnels, John A [Yorktown Heights, NY
2012-08-28
Mechanisms for optimizing scalar code executed on a single instruction multiple data (SIMD) engine are provided. Placement of vector operation-splat operations may be determined based on an identification of scalar and SIMD operations in an original code representation. The original code representation may be modified to insert the vector operation-splat operations based on the determined placement of vector operation-splat operations to generate a first modified code representation. Placement of separate splat operations may be determined based on identification of scalar and SIMD operations in the first modified code representation. The first modified code representation may be modified to insert or delete separate splat operations based on the determined placement of the separate splat operations to generate a second modified code representation. SIMD code may be output based on the second modified code representation for execution by the SIMD engine.
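The mechanism described above, promoting a scalar to a vector by replicating ("splatting") it so the surrounding operation can proceed element-wise on a SIMD engine, can be illustrated at a high level. The following is a minimal, conceptual Python/NumPy sketch of that idea, not the patented compiler transformation itself; the array names and the scale-and-accumulate loop are hypothetical.

```python
import numpy as np

def scale_accumulate_scalar(acc, data, s):
    """Scalar reference loop: one multiply-add per element."""
    for i in range(len(data)):
        acc[i] += s * data[i]
    return acc

def scale_accumulate_splat(acc, data, s):
    """Vectorized form: the scalar s is 'splat' (replicated) across a
    vector once, then the whole operation proceeds element-wise,
    mirroring what a load-and-splat SIMD instruction enables."""
    s_vec = np.full(data.shape, s, dtype=data.dtype)  # the splat
    acc += s_vec * data                               # element-wise SIMD-style op
    return acc

data = np.arange(8, dtype=np.float32)
a1 = scale_accumulate_scalar(np.zeros(8, dtype=np.float32), data, 2.5)
a2 = scale_accumulate_splat(np.zeros(8, dtype=np.float32), data, 2.5)
assert np.allclose(a1, a2)
```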
Lee, Ying-Li; Cui, Yan-Yan; Tu, Ming-Hsiang; Chen, Yu-Chi; Chang, Polun
2018-04-20
Chronic kidney disease (CKD) is a global health problem with a high economic burden, which is particularly prevalent in Taiwan. Mobile health apps have been widely used to maintain continuity of patient care for various chronic diseases. To slow the progression of CKD, continuity of care is vital for patients' self-management and cooperation with health care professionals. However, the literature provides a limited understanding of the use of mobile health apps to maintain continuity of patient-centered care for CKD. This study identified apps related to the continuity of patient-centered care for CKD on the App Store, Google Play, and 360 Mobile Assistant, and explored the information and frequency of changes in these apps available to the public on different platforms. App functionalities, like patient self-management and patient management support for health care professionals, were also examined. We used the CKD-related keywords "kidney," "renal," "nephro," "chronic kidney disease," "CKD," and "kidney disease" in traditional Chinese, simplified Chinese, and English to search 3 app platforms: App Store, Google Play, and 360 Mobile Assistant. A total of 2 reviewers reached consensus on coding guidelines and coded the contents and functionalities of the apps through content analysis. After coding, Microsoft Office Excel 2016 was used to calculate Cohen kappa coefficients and analyze the contents and functionalities of the apps. A total of 177 apps related to patient-centered care for CKD in any language were included. On the basis of their functionality and content, 67 apps were recommended for patients. Among them, the most common functionalities were CKD information and CKD self-management (38/67, 57%), e-consultation (17/67, 25%), CKD nutrition education (16/67, 24%), and estimated glomerular filtration rate (eGFR) calculators (13/67, 19%). In addition, 67 apps were recommended for health care professionals. The most common functionalities of these apps were comprehensive clinical calculators (including eGFR; 30/67; 45%), CKD medical professional information (16/67, 24%), stand-alone eGFR calculators (14/67, 21%), and CKD clinical decision support (14/67, 21%). A total of 43 apps with single- or multiple-indicator calculators were found to be suitable for health care professionals and patients. The aspects of patient care apps intended to support self-management of CKD patients were encouraging patients to actively participate in health care (92/110, 83.6%), recognizing and effectively responding to symptoms (56/110, 50.9%), and disease-specific knowledge (53/110, 48.2%). Only 13 apps contained consulting management functions, patient management functions or teleconsultation functions designed to support health care professionals in CKD patient management. This study revealed that the continuity of patient-centered care for CKD provided by mobile health apps is inadequate for both CKD self-management by patients and patient care support for health care professionals. More comprehensive solutions are required to enhance the continuity of patient-centered care for CKD. ©Ying-Li Lee, Yan-Yan Cui, Ming-Hsiang Tu, Yu-Chi Chen, Polun Chang. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 20.04.2018.
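Several of the apps surveyed above include an estimated glomerular filtration rate (eGFR) calculator. As a rough illustration of what such a calculator computes, here is a minimal Python sketch using the 2009 CKD-EPI creatinine equation; the study does not specify which formula each app uses, so this particular equation is an assumption.

```python
def egfr_ckd_epi_2009(scr_mg_dl, age_years, female, black=False):
    """Estimated GFR (mL/min/1.73 m^2) via the 2009 CKD-EPI creatinine
    equation -- one common formula; individual apps may use others."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141.0
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age_years)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# Example: 60-year-old woman with serum creatinine 1.1 mg/dL
print(round(egfr_ckd_epi_2009(1.1, 60, female=True), 1))
```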
Lee, Ying-Li; Cui, Yan-Yan; Tu, Ming-Hsiang; Chen, Yu-Chi
2018-01-01
Background Chronic kidney disease (CKD) is a global health problem with a high economic burden, which is particularly prevalent in Taiwan. Mobile health apps have been widely used to maintain continuity of patient care for various chronic diseases. To slow the progression of CKD, continuity of care is vital for patients’ self-management and cooperation with health care professionals. However, the literature provides a limited understanding of the use of mobile health apps to maintain continuity of patient-centered care for CKD. Objective This study identified apps related to the continuity of patient-centered care for CKD on the App Store, Google Play, and 360 Mobile Assistant, and explored the information and frequency of changes in these apps available to the public on different platforms. App functionalities, like patient self-management and patient management support for health care professionals, were also examined. Methods We used the CKD-related keywords “kidney,” “renal,” “nephro,” “chronic kidney disease,” “CKD,” and “kidney disease” in traditional Chinese, simplified Chinese, and English to search 3 app platforms: App Store, Google Play, and 360 Mobile Assistant. A total of 2 reviewers reached consensus on coding guidelines and coded the contents and functionalities of the apps through content analysis. After coding, Microsoft Office Excel 2016 was used to calculate Cohen kappa coefficients and analyze the contents and functionalities of the apps. Results A total of 177 apps related to patient-centered care for CKD in any language were included. On the basis of their functionality and content, 67 apps were recommended for patients. Among them, the most common functionalities were CKD information and CKD self-management (38/67, 57%), e-consultation (17/67, 25%), CKD nutrition education (16/67, 24%), and estimated glomerular filtration rate (eGFR) calculators (13/67, 19%). In addition, 67 apps were recommended for health care professionals. The most common functionalities of these apps were comprehensive clinical calculators (including eGFR; 30/67; 45%), CKD medical professional information (16/67, 24%), stand-alone eGFR calculators (14/67, 21%), and CKD clinical decision support (14/67, 21%). A total of 43 apps with single- or multiple-indicator calculators were found to be suitable for health care professionals and patients. The aspects of patient care apps intended to support self-management of CKD patients were encouraging patients to actively participate in health care (92/110, 83.6%), recognizing and effectively responding to symptoms (56/110, 50.9%), and disease-specific knowledge (53/110, 48.2%). Only 13 apps contained consulting management functions, patient management functions or teleconsultation functions designed to support health care professionals in CKD patient management. Conclusions This study revealed that the continuity of patient-centered care for CKD provided by mobile health apps is inadequate for both CKD self-management by patients and patient care support for health care professionals. More comprehensive solutions are required to enhance the continuity of patient-centered care for CKD. PMID:29678805
Baurens, Franc-Christophe; Bocs, Stéphanie; Rouard, Mathieu; Matsumoto, Takashi; Miller, Robert N G; Rodier-Goud, Marguerite; MBéguié-A-MBéguié, Didier; Yahiaoui, Nabila
2010-07-16
Comparative sequence analysis of complex loci such as resistance gene analog clusters allows estimating the degree of sequence conservation and mechanisms of divergence at the intraspecies level. In banana (Musa sp.), two diploid wild species Musa acuminata (A genome) and Musa balbisiana (B genome) contribute to the polyploid genome of many cultivars. The M. balbisiana species is associated with vigour and tolerance to pests and disease and little is known on the genome structure and haplotype diversity within this species. Here, we compare two genomic sequences of 253 and 223 kb corresponding to two haplotypes of the RGA08 resistance gene analog locus in M. balbisiana "Pisang Klutuk Wulung" (PKW). Sequence comparison revealed two regions of contrasting features. The first is a highly colinear gene-rich region where the two haplotypes diverge only by single nucleotide polymorphisms and two repetitive element insertions. The second corresponds to a large cluster of RGA08 genes, with 13 and 18 predicted RGA genes and pseudogenes spread over 131 and 152 kb respectively on each haplotype. The RGA08 cluster is enriched in repetitive element insertions, in duplicated non-coding intergenic sequences including low complexity regions and shows structural variations between haplotypes. Although some allelic relationships are retained, a large diversity of RGA08 genes occurs in this single M. balbisiana genotype, with several RGA08 paralogs specific to each haplotype. The RGA08 gene family has evolved by mechanisms of unequal recombination, intragenic sequence exchange and diversifying selection. An unequal recombination event taking place between duplicated non-coding intergenic sequences resulted in a different RGA08 gene content between haplotypes pointing out the role of such duplicated regions in the evolution of RGA clusters. Based on the synonymous substitution rate in coding sequences, we estimated a 1 million year divergence time for these M. balbisiana haplotypes. A large RGA08 gene cluster identified in wild banana corresponds to a highly variable genomic region between haplotypes surrounded by conserved flanking regions. High level of sequence identity (70 to 99%) of the genic and intergenic regions suggests a recent and rapid evolution of this cluster in M. balbisiana.
Nang, Roberto N; Monahan, Felicia; Diehl, Glendon B; French, Daniel
2015-04-01
Many institutions collect reports in databases to make important lessons-learned available to their members. The Uniformed Services University of the Health Sciences collaborated with the Peacekeeping and Stability Operations Institute to conduct a descriptive and qualitative analysis of global health engagements (GHEs) contained in the Stability Operations Lessons Learned and Information Management System (SOLLIMS). This study used a summative qualitative content analysis approach involving six steps: (1) a comprehensive search; (2) two-stage reading and screening process to identify first-hand, health-related records; (3) qualitative and quantitative data analysis using MAXQDA, a software program; (4) a word cloud to illustrate word frequencies and interrelationships; (5) coding of individual themes and validation of the coding scheme; and (6) identification of relationships in the data and overarching lessons-learned. The individual codes with the most number of text segments coded included: planning, personnel, interorganizational coordination, communication/information sharing, and resources/supplies. When compared to the Department of Defense's (DoD's) evolving GHE principles and capabilities, the SOLLIMS coding scheme appeared to align well with the list of GHE capabilities developed by the Department of Defense Global Health Working Group. The results of this study will inform practitioners of global health and encourage additional qualitative analysis of other lessons-learned databases. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.
Nonlinear dynamic simulation of single- and multi-spool core engines
NASA Technical Reports Server (NTRS)
Schobeiri, T.; Lippke, C.; Abouelkheir, M.
1993-01-01
In this paper a new computational method for accurate simulation of the nonlinear dynamic behavior of single- and multi-spool core engines, turbofan engines, and power generation gas turbine engines is presented. In order to perform the simulation, a modularly structured computer code has been developed which includes individual mathematical modules representing various engine components. The generic structure of the code enables the dynamic simulation of arbitrary engine configurations ranging from single-spool thrust generation to multi-spool thrust/power generation engines under adverse dynamic operating conditions. For precise simulation of turbine and compressor components, row-by-row calculation procedures were implemented that account for the specific turbine and compressor cascade and blade geometry and characteristics. The dynamic behavior of the subject engine is calculated by solving a number of systems of partial differential equations, which describe the unsteady behavior of the individual components. In order to ensure the capability, accuracy, robustness, and reliability of the code, comprehensive critical performance assessment and validation tests were performed. As representatives, three different transient cases with single- and multi-spool thrust and power generation engines were simulated. The transient cases range from operating with a prescribed fuel schedule, to extreme load changes, to generator and turbine shut down.
NASA Astrophysics Data System (ADS)
Samsing, Johan; Askar, Abbas; Giersz, Mirek
2018-03-01
We estimate the population of eccentric gravitational wave (GW) binary black hole (BBH) mergers forming during binary–single interactions in globular clusters (GCs), using ∼800 GC models that were evolved using the MOCCA code for star cluster simulations as part of the MOCCA-Survey Database I project. By re-simulating BH binary–single interactions extracted from this set of GC models using an N-body code that includes GW emission at the 2.5 post-Newtonian level, we find that ∼10% of all the BBHs assembled in our GC models that merge at present time form during chaotic binary–single interactions, and that about half of this sample have an eccentricity >0.1 at 10 Hz. We explicitly show that this derived rate of eccentric mergers is ∼100 times higher than one would find with a purely Newtonian N-body code. Furthermore, we demonstrate that the eccentric fraction can be accurately estimated using a simple analytical formalism when the interacting BHs are of similar mass, a result that serves as the first successful analytical description of eccentric GW mergers forming during three-body interactions in realistic GCs.
NASA Astrophysics Data System (ADS)
Chan, V. S.; Wong, C. P. C.; McLean, A. G.; Luo, G. N.; Wirth, B. D.
2013-10-01
The Xolotl code under development by PSI-SciDAC will enhance predictive modeling capability of plasma-facing materials under burning plasma conditions. The availability and application of experimental data to compare to code-calculated observables are key requirements to validate the breadth and content of physics included in the model and ultimately gain confidence in its results. A dedicated effort has been in progress to collect and organize a) a database of relevant experiments and their publications as previously carried out at sample exposure facilities in US and Asian tokamaks (e.g., DIII-D DiMES, and EAST MAPES), b) diagnostic and surface analysis capabilities available at each device, and c) requirements for future experiments with code validation in mind. The content of this evolving database will serve as a significant resource for the plasma-material interaction (PMI) community. Work supported in part by the US Department of Energy under GA-DE-SC0008698, DE-AC52-07NA27344 and DE-AC05-00OR22725.
Low-complexity transcoding algorithm from H.264/AVC to SVC using data mining
NASA Astrophysics Data System (ADS)
Garrido-Cantos, Rosario; De Cock, Jan; Martínez, Jose Luis; Van Leuven, Sebastian; Cuenca, Pedro; Garrido, Antonio
2013-12-01
Nowadays, networks and terminals with diverse characteristics of bandwidth and capabilities coexist. To ensure a good quality of experience, this diverse environment demands adaptability of the video stream. In general, video contents are compressed to save storage capacity and to reduce the bandwidth required for its transmission. Therefore, if these compressed video streams were compressed using scalable video coding schemes, they would be able to adapt to those heterogeneous networks and a wide range of terminals. Since the majority of the multimedia contents are compressed using H.264/AVC, they cannot benefit from that scalability. This paper proposes a low-complexity algorithm to convert an H.264/AVC bitstream without scalability to scalable bitstreams with temporal scalability in baseline and main profiles by accelerating the mode decision task of the scalable video coding encoding stage using machine learning tools. The results show that when our technique is applied, the complexity is reduced by 87% while maintaining coding efficiency.
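The paper accelerates the scalable-coding mode decision by applying machine-learning tools to information already available from the decoded H.264/AVC stream. As a rough, hypothetical illustration of that idea (the actual features, labels, and model used in the paper may differ), the sketch below trains a small decision tree to predict a macroblock coding mode from a few decoder-side features, replacing an exhaustive mode search.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical decoder-side features per macroblock:
# [residual energy, motion-vector magnitude, fraction of non-zero coefficients]
rng = np.random.default_rng(0)
X = rng.random((500, 3))
# Hypothetical ground-truth modes from a full-search SVC encoder:
# 0 = SKIP, 1 = Inter 16x16, 2 = Inter 8x8
y = (X[:, 0] * 2 + X[:, 1] > 1.5).astype(int) + (X[:, 2] > 0.8).astype(int)

tree = DecisionTreeClassifier(max_depth=4).fit(X, y)

# At transcoding time the tree replaces the exhaustive mode search:
new_mb_features = np.array([[0.3, 0.9, 0.2]])
print("predicted mode:", tree.predict(new_mb_features)[0])
```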
CREME96 and Related Error Rate Prediction Methods
NASA Technical Reports Server (NTRS)
Adams, James H., Jr.
2012-01-01
Predicting the rate of occurrence of single event effects (SEEs) in space requires knowledge of the radiation environment and the response of electronic devices to that environment. Several analytical models have been developed over the past 36 years to predict SEE rates. The first error rate calculations were performed by Binder, Smith and Holman. Bradford and Pickel and Blandford, in their CRIER (Cosmic-Ray-Induced-Error-Rate) analysis code, introduced the basic rectangular parallelepiped (RPP) method for error rate calculations. For the radiation environment at the part, both made use of the cosmic ray LET (linear energy transfer) spectra calculated by Heinrich for various absorber depths. A more detailed model for the space radiation environment within spacecraft was developed by Adams and co-workers. This model, together with a reformulation of the RPP method published by Pickel and Blandford, was used to create the CREME (Cosmic Ray Effects on Micro-Electronics) code. About the same time Shapiro wrote the CRUP (Cosmic Ray Upset Program) based on the RPP method published by Bradford. It was the first code to specifically take into account charge collection from outside the depletion region due to deformation of the electric field caused by the incident cosmic ray. Other early rate prediction methods and codes include the Single Event Figure of Merit, NOVICE, the Space Radiation code and the effective flux method of Binder, which is the basis of the SEFA (Scott Effective Flux Approximation) model. By the early 1990s it was becoming clear that CREME and the other early models needed revision. This revision, CREME96, was completed and released as a WWW-based tool, one of the first of its kind. The revisions in CREME96 included improved environmental models and improved models for calculating single event effects. The need for a revision of CREME also stimulated the development of CHIME (CRRES/SPACERAD Heavy Ion Model of the Environment) and MACREE (Modeling and Analysis of Cosmic Ray Effects in Electronics). The Single Event Figure of Merit method was also revised to use the solar minimum galactic cosmic ray spectrum and extended to circular orbits down to 200 km at any inclination. More recently a series of commercial codes was developed by TRAD (Test & Radiations), which includes the OMERE code for calculating single event effects. There are other error rate prediction methods which use Monte Carlo techniques. In this chapter the analytic methods for estimating the environment within spacecraft will be discussed.
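Analytic SEE rate predictions of the kind surveyed above ultimately fold a device response with the LET spectrum of the environment. Below is a minimal Python sketch of one common simplified estimate, integrating a Weibull-fit SEU cross section against a differential LET flux; it is illustrative only, omits the rectangular-parallelepiped path-length treatment that CREME-style codes actually perform, and all numbers are made up.

```python
import numpy as np

def weibull_cross_section(let, sigma_sat, let_onset, width, shape):
    """Weibull fit of SEU cross section (cm^2/device) versus LET."""
    let = np.asarray(let, dtype=float)
    return np.where(
        let > let_onset,
        sigma_sat * (1.0 - np.exp(-((let - let_onset) / width) ** shape)),
        0.0)

# Hypothetical differential LET flux, particles/(cm^2 s) per (MeV cm^2/mg)
let_grid = np.linspace(1.0, 100.0, 500)      # MeV cm^2/mg
diff_flux = 1e-4 * let_grid ** -2.5          # made-up power law

sigma = weibull_cross_section(let_grid, sigma_sat=1e-6,
                              let_onset=2.0, width=20.0, shape=1.5)

# Upset rate ~ integral of sigma(L) * dPhi/dL over LET  (upsets/s)
rate = np.trapz(sigma * diff_flux, let_grid)
print(f"estimated upset rate: {rate:.3e} upsets/s "
      f"({rate * 86400:.3e} per device-day)")
```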
Fast and Flexible Successive-Cancellation List Decoders for Polar Codes
NASA Astrophysics Data System (ADS)
Hashemi, Seyyed Ali; Condo, Carlo; Gross, Warren J.
2017-11-01
Polar codes have gained a significant amount of attention during the past few years and have been selected as a coding scheme for the next-generation mobile broadband standard. Among decoding schemes, successive-cancellation list (SCL) decoding provides a reasonable trade-off between the error-correction performance and hardware implementation complexity when used to decode polar codes, at the cost of limited throughput. The simplified SCL (SSCL) and its extension SSCL-SPC increase the speed of decoding by removing redundant calculations when encountering particular information and frozen bit patterns (rate one and single parity check codes), while keeping the error-correction performance unaltered. In this paper, we improve SSCL and SSCL-SPC by proving that the list size imposes a specific number of bit estimations required to decode rate one and single parity check codes. Thus, the number of estimations can be limited while guaranteeing exactly the same error-correction performance as if all bits of the code were estimated. We call the new decoding algorithms Fast-SSCL and Fast-SSCL-SPC. Moreover, we show that the number of bit estimations in a practical application can be tuned to achieve desirable speed, while keeping the error-correction performance almost unchanged. Hardware architectures implementing both algorithms are then described and implemented: it is shown that our design can achieve 1.86 Gb/s throughput, higher than the best state-of-the-art decoders.
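Fast simplified decoders exploit special constituent codes such as single parity-check (SPC) nodes, which can be decoded directly from the channel LLRs instead of bit by bit. A minimal Python sketch of that SPC (Wagner-style) step is shown below, without the list handling that Fast-SSCL-SPC adds on top of it; it illustrates the idea only and is not the authors' hardware algorithm.

```python
import numpy as np

def decode_spc_node(llrs):
    """Wagner decoding of a single parity-check node.

    Hard-decide each bit from its LLR; if the overall (even) parity fails,
    flip the least reliable bit, i.e. the one with the smallest |LLR|."""
    llrs = np.asarray(llrs, dtype=float)
    bits = (llrs < 0).astype(int)           # hard decisions
    if bits.sum() % 2 == 1:                 # parity violated
        bits[np.argmin(np.abs(llrs))] ^= 1  # flip the least reliable bit
    return bits

# Parity of the hard decisions fails here, so the weakest bit is flipped.
print(decode_spc_node([+2.1, -0.3, +1.7, +4.0]))  # -> [0 0 0 0]
```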
Sakurai, Y
2002-01-01
This study reports how hippocampal individual cells and cell assemblies cooperate for neural coding of pitch and temporal information in memory processes for auditory stimuli. Each rat performed two tasks, one requiring discrimination of auditory pitch (high or low) and the other requiring discrimination of their duration (long or short). Some CA1 and CA3 complex-spike neurons showed task-related differential activity between the high and low tones in only the pitch-discrimination task. However, without exception, neurons which showed task-related differential activity between the long and short tones in the duration-discrimination task were always task-related neurons in the pitch-discrimination task. These results suggest that temporal information (long or short), in contrast to pitch information (high or low), cannot be coded independently by specific neurons. The results also indicate that the two different behavioral tasks cannot be fully differentiated by the task-related single neurons alone and suggest a model of cell-assembly coding of the tasks. Cross-correlation analysis among activities of simultaneously recorded multiple neurons supported the suggested cell-assembly model. Considering those results, this study concludes that dual coding by hippocampal single neurons and cell assemblies is working in memory processing of pitch and temporal information of auditory stimuli. The single neurons encode both auditory pitches and their temporal lengths and the cell assemblies encode types of tasks (contexts or situations) in which the pitch and the temporal information are processed.
Death of a dogma: eukaryotic mRNAs can code for more than one protein.
Mouilleron, Hélène; Delcourt, Vivian; Roucou, Xavier
2016-01-08
mRNAs carry the genetic information that is translated by ribosomes. The traditional view of a mature eukaryotic mRNA is a molecule with three main regions, the 5' UTR, the protein coding open reading frame (ORF) or coding sequence (CDS), and the 3' UTR. This concept assumes that ribosomes translate one ORF only, generally the longest one, and produce one protein. As a result, in the early days of genomics and bioinformatics, one CDS was associated with each protein-coding gene. This fundamental concept of a single CDS is being challenged by increasing experimental evidence indicating that annotated proteins are not the only proteins translated from mRNAs. In particular, mass spectrometry (MS)-based proteomics and ribosome profiling have detected productive translation of alternative open reading frames. In several cases, the alternative and annotated proteins interact. Thus, the expression of two or more proteins translated from the same mRNA may offer a mechanism to ensure the co-expression of proteins which have functional interactions. Translational mechanisms already described in eukaryotic cells indicate that the cellular machinery is able to translate different CDSs from a single viral or cellular mRNA. In addition to summarizing data showing that the protein coding potential of eukaryotic mRNAs has been underestimated, this review aims to challenge the single translated CDS dogma. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Dual coding theory, word abstractness, and emotion: a critical review of Kousta et al. (2011).
Paivio, Allan
2013-02-01
Kousta, Vigliocco, Del Campo, Vinson, and Andrews (2011) questioned the adequacy of dual coding theory and the context availability model as explanations of representational and processing differences between concrete and abstract words. They proposed an alternative approach that focuses on the role of emotional content in the processing of abstract concepts. Their dual coding critique is, however, based on impoverished and, in some respects, incorrect interpretations of the theory and its implications. This response corrects those gaps and misinterpretations and summarizes research findings that show predicted variations in the effects of dual coding variables in different tasks and contexts. Especially emphasized is an empirically supported dual coding theory of emotion that goes beyond the Kousta et al. emphasis on emotion in abstract semantics. 2013 APA, all rights reserved
VizieR Online Data Catalog: Radiative forces for stellar envelopes (Seaton, 1997)
NASA Astrophysics Data System (ADS)
Seaton, M. J.; Yan, Y.; Mihalas, D.; Pradhan, A. K.
2000-02-01
(1) Primary data files, stages.zz. These files give data for the calculation of radiative accelerations, GRAD, for elements with nuclear charge zz. Data are available for zz=06, 07, 08, 10, 11, 12, 13, 14, 16, 18, 20, 24, 25, 26 and 28. Calculations are made using data from the Opacity Project (see papers SYMP and IXZ). The data are given for each ionisation stage, j. They are tabulated on a mesh of (T, Ne, CHI), where T is temperature, Ne is electron density and CHI is an abundance multiplier. The files include data for ionisation fractions, for each (T, Ne). The file contents are described in the paper ACC and as comments in the code add.f.
(2) Code add.f. This reads a file stages.zz and creates a file acc.zz giving radiative accelerations averaged over ionisation stages. The code prompts for names of input and output files. The code, as provided, gives equal weights (as defined in the paper ACC) to all stages. The weights are set in SUBROUTINE WEIGHTS, which could be changed to give any weights preferred by the user. The dependence of diffusion coefficients on ionisation stage is given by a function ZET, which is defined in SUBROUTINE ZETA. The expressions used for ZET are as given in the paper. The user can change that subroutine if other expressions are preferred. The output file contains values, ZETBAR, of ZET, averaged over ionisation stages.
(3) Files acc.zz. Radiative accelerations computed using add.f as provided. The user will need to run the code add.f only if it is required to change the subroutines WEIGHTS or ZETA. The contents of the files acc.zz are described in the paper ACC and in comments contained in the code add.f.
(4) Code accfit.f. This code gives radiative accelerations, and some related data, for a stellar model. Methods used to interpolate data to the values of (T, RHO) for the stellar model are based on those used in the code opfit.for (see the paper OPF). The executable file accfit.com runs accfit.f. It uses a list of files given in accfit.files (see that file for further description). The mesh used for the abundance-multiplier CHI on the output file will generally be finer than that used in the input files acc.zz. The mesh to be used is specified in a file chi.dat. For a test run, the stellar model used is given in the file 10000_4.2 (Teff=10000 K, LOG10(g)=4.2). The output file from that test run is acc100004.2. The contents of the output file are described in the paper ACC and as comments in the code accfit.f.
(5) Code diff.f. This code reads the output file (e.g. acc1000004.2) created by accfit.f. For any specified depth point in the model and value of CHI, it gives values of radiative accelerations, the quantity ZETBAR required for calculation of diffusion coefficients, and Rosseland-mean opacities. The code prompts for input data. It creates a file recording all data calculated. The code diff.f is intended for incorporation, as a set of subroutines, in codes for diffusion calculations. (1 data file).
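Since the acc.zz files tabulate accelerations on a (T, Ne, CHI) mesh and accfit.f interpolates them to the conditions of a stellar model, a reader processing these tables in another language would do something analogous to the Python sketch below. This is a generic illustration with a made-up grid and placeholder values; the real file layout is as described in the paper ACC and in the comments of add.f.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Made-up (log T, log Ne, log CHI) mesh and log g_rad table standing in
# for the contents of one acc.zz file (real files follow the ACC paper).
logT = np.linspace(4.0, 6.0, 21)
logNe = np.linspace(14.0, 20.0, 25)
logChi = np.linspace(-2.0, 2.0, 9)
log_grad = np.random.default_rng(1).random((21, 25, 9))  # placeholder values

interp = RegularGridInterpolator((logT, logNe, logChi), log_grad)

# Radiative acceleration at one depth point of a stellar model
point = np.array([[4.7, 16.3, 0.0]])   # (log T, log Ne, log CHI)
print("log g_rad =", interp(point)[0])
```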
Yang, Guohu; Dong, Yongbin; Li, Yuling; Wang, Qilei; Shi, Qingling; Zhou, Qiang
2013-01-01
Grain oil content is negatively correlated with starch content in maize in general. In this study, 282 and 263 recombinant inbred lines (RIL) developed from two crosses between one high-oil maize inbred and two normal dent maize inbreds were evaluated for grain starch content and its correlation with oil content under four environments. Single-trait QTL for starch content in single-population and joint-population analysis, and multiple-trait QTL for both starch and oil content were detected, and compared with the result obtained in the two related F2∶3 populations. Totally, 20 single-population QTL for grain starch content were detected. No QTL was simultaneously detected across all ten cases. QTL at bins 5.03 and 9.03 were all detected in both populations and in 4 and 5 cases, respectively. Only 2 of the 16 joint-population QTL had significant effects in both populations. Three single-population QTL and 8 joint-population QTL at bins 1.03, 1.04–1.05, 3.05, 8.04–8.05, 9.03, and 9.05 could be considered as fine-mapped. Common QTL across F2∶3 and RIL generations were observed at bins 5.04, 8.04 and 8.05 in population 1 (Pop.1), and at bin 5.03 in population 2 (Pop.2). QTL at bins 3.02–3.03, 3.05, 8.04–8.05 and 9.03 should be focused in high-starch maize breeding. In multiple-trait QTL analysis, 17 starch-oil QTL were detected, 10 in Pop.1 and 7 in Pop.2. And 22 single-trait QTL failed to show significance in multiple-trait analysis, 13 QTL for starch content and 9 QTL for oil content. However, QTL at bins 1.03, 6.03–6.04 and 8.03–8.04 might increase grain starch content and/or grain oil content without reduction in another trait. Further research should be conducted to validate the effect of these QTL in the simultaneous improvement of grain starch and oil content in maize. PMID:23320103
Simple scheme for encoding and decoding a qubit in unknown state for various topological codes
Łodyga, Justyna; Mazurek, Paweł; Grudka, Andrzej; Horodecki, Michał
2015-01-01
We present a scheme for encoding and decoding an unknown state for CSS codes, based on syndrome measurements. We illustrate our method by means of the Kitaev toric code, the defected-lattice code, the topological subsystem code and the 3D Haah code. The protocol is local whenever, in a given code, the crossings between the logical operators consist of next-neighbour pairs, which holds for the above codes. For the subsystem code we also present a scheme for the noisy case, where we allow for bit- and phase-flip errors on qubits as well as state preparation and syndrome measurement errors. A similar scheme can be built for the two other codes. We show that the fidelity of the protected qubit in the noisy scenario in the large code size limit is of , where p is the probability of error on a single qubit per time step. For the Haah code we provide a noiseless scheme, leaving the noisy case as an open problem. PMID:25754905
The multidimensional Self-Adaptive Grid code, SAGE, version 2
NASA Technical Reports Server (NTRS)
Davies, Carol B.; Venkatapathy, Ethiraj
1995-01-01
This new report on Version 2 of the SAGE code includes all the information in the original publication plus all upgrades and changes to the SAGE code since that time. The two most significant upgrades are the inclusion of a finite-volume option and the ability to adapt and manipulate zonal-matching multiple-grid files. In addition, the original SAGE code has been upgraded to Version 1.1 and includes all options mentioned in this report, with the exception of the multiple grid option and its associated features. Since Version 2 is a larger and more complex code, it is suggested (but not required) that Version 1.1 be used for single-grid applications. This document contains all the information required to run both versions of SAGE. The formulation of the adaption method is described in the first section of this document. The second section is presented in the form of a user guide that explains the input and execution of the code. The third section provides many examples. Successful application of the SAGE code in both two and three dimensions for the solution of various flow problems has proven the code to be robust, portable, and simple to use. Although the basic formulation follows the method of Nakahashi and Deiwert, many modifications have been made to facilitate the use of the self-adaptive grid method for complex grid structures. Modifications to the method and the simple but extensive input options make this a flexible and user-friendly code. The SAGE code can accommodate two-dimensional and three-dimensional, finite-difference and finite-volume, single grid, and zonal-matching multiple grid flow problems.
78 FR 44189 - Petition for Modification of Single Car Air Brake Test Procedures
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-23
...] Petition for Modification of Single Car Air Brake Test Procedures In accordance with Part 232 of Title 49... Administration (FRA) per 49 CFR 232.307 to modify the single car air brake test procedures located in AAR Standard S-486, Code of Air Brake System Tests for Freight Equipment-- Single Car Test, and required...
Alvarado, David M; Yang, Ping; Druley, Todd E; Lovett, Michael; Gurnett, Christina A
2014-06-01
Despite declining sequencing costs, few methods are available for cost-effective single-nucleotide polymorphism (SNP), insertion/deletion (INDEL) and copy number variation (CNV) discovery in a single assay. Commercially available methods require a high investment to a specific region and are only cost-effective for large samples. Here, we introduce a novel, flexible approach for multiplexed targeted sequencing and CNV analysis of large genomic regions called multiplexed direct genomic selection (MDiGS). MDiGS combines biotinylated bacterial artificial chromosome (BAC) capture and multiplexed pooled capture for SNP/INDEL and CNV detection of 96 multiplexed samples on a single MiSeq run. MDiGS is advantageous over other methods for CNV detection because pooled sample capture and hybridization to large contiguous BAC baits reduces sample and probe hybridization variability inherent in other methods. We performed MDiGS capture for three chromosomal regions consisting of ∼ 550 kb of coding and non-coding sequence with DNA from 253 patients with congenital lower limb disorders. PITX1 nonsense and HOXC11 S191F missense mutations were identified that segregate in clubfoot families. Using a novel pooled-capture reference strategy, we identified recurrent chromosome chr17q23.1q23.2 duplications and small HOXC 5' cluster deletions (51 kb and 12 kb). Given the current interest in coding and non-coding variants in human disease, MDiGS fulfills a niche for comprehensive and low-cost evaluation of CNVs, coding, and non-coding variants across candidate regions of interest. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
Support for Systematic Code Reviews with the SCRUB Tool
NASA Technical Reports Server (NTRS)
Holzmann, Gerald J.
2010-01-01
SCRUB is a code review tool that supports both large, team-based software development efforts (e.g., for mission software) and individual tasks. The tool was developed at JPL to support a new, streamlined code review process that combines human-generated review reports with program-generated review reports from a customizable range of state-of-the-art source code analyzers. The leading commercial tools include Codesonar, Coverity, and Klocwork, each of which can achieve a reasonably low rate of false-positives in the warnings that they generate. The time required to analyze code with these tools can vary greatly. In each case, however, the tools produce results that would be difficult to realize with human code inspections alone. There is little overlap in the results produced by the different analyzers, and each analyzer used generally increases the effectiveness of the overall effort. The SCRUB tool allows all reports to be accessed through a single, uniform interface that facilitates browsing code and reports. Improvements over existing software include significant simplification and the leveraging of a range of commercial, static source code analyzers in a single, uniform framework. The tool runs as a small stand-alone application, avoiding the security problems related to tools based on Web browsers. A developer or reviewer, for instance, must have already obtained access rights to a code base before that code can be browsed and reviewed with the SCRUB tool. The tool cannot open any files or folders to which the user does not already have access. This means that the tool does not need to enforce or administer any additional security policies. The analysis results presented through the SCRUB tool's user interface are always computed off-line, given that, especially for larger projects, this computation can take longer than appropriate for interactive tool use. The recommended code review process that is supported by the SCRUB tool consists of three phases: Code Review, Developer Response, and Closeout Resolution. In the Code Review phase, all tool-based analysis reports are generated, and specific comments from expert code reviewers are entered into the SCRUB tool. In the second phase, Developer Response, the developer is asked to respond to each comment and tool-report that was produced, either agreeing or disagreeing to provide a fix that addresses the issue that was raised. In the third phase, Closeout Resolution, all disagreements are discussed in a meeting of all parties involved, and a resolution is reached for all disagreements. The first two phases generally take one week each, and the third phase is concluded in a single closeout meeting.
Assessment of the Effects of Entrainment and Wind Shear on Nuclear Cloud Rise Modeling
NASA Astrophysics Data System (ADS)
Zalewski, Daniel; Jodoin, Vincent
2001-04-01
Accurate modeling of nuclear cloud rise is critical in hazard prediction following a nuclear detonation. This thesis recommends improvements to the model currently used by DOD. It considers a single-term versus a three-term entrainment equation, the value of the entrainment and eddy viscous drag parameters, as well as the effect of wind shear in the cloud rise following a nuclear detonation. It examines departures from the 1979 version of the Department of Defense Land Fallout Interpretive Code (DELFIC) with the current code used in the Hazard Prediction and Assessment Capability (HPAC) code version 3.2. The recommendation for a single-term entrainment equation, with constant value parameters, without wind shear corrections, and without cloud oscillations is based on both a statistical analysis using 67 U.S. nuclear atmospheric test shots and the physical representation of the modeling. The statistical analysis optimized the parameter values of interest for four cases: the three-term entrainment equation with wind shear and without wind shear as well as the single-term entrainment equation with and without wind shear. The thesis then examines the effect of cloud oscillations as a significant departure in the code. Modifications to user input atmospheric tables are identified as a potential problem in the calculation of stabilized cloud dimensions in HPAC.
Caregiving, single parents and cumulative stresses when caring for a child with cancer.
Granek, L; Rosenberg-Yunger, Z R S; Dix, D; Klaassen, R J; Sung, L; Cairney, J; Klassen, A F
2014-03-01
Single parents whose children have cancer are a marginalized group who report less family centred care, and therefore, less quality cancer care for their children. As such, the aims of this study were to explore how single parents of children with cancer describe their caregiving experiences and to understand their contextual life stressors. A constructivist grounded theory method was used. Qualitative interviews with 29 single parents of children with cancer who were at least 6 months post-diagnosis were recruited between November 2009 and April 2011 from four hospitals across Canada. Line-by-line coding was used to establish codes and themes and constant comparison was used to establish relationships among emerging codes and conceptual themes. The first set of findings report on caregiving duties including: emotional tasks, informational tasks and physical tasks. The second set of findings report on the contextual picture of parent's lives including their living conditions, their physical and mental health and their family histories of disruption, trauma and disease. Single parents caring for children with cancer were found to experience several cumulative stressors in addition to the current strain of caring for a child with cancer. The synergy of these cumulative stresses with the added strain of caregiving for a child with cancer may have long-term health and financial implications for parents. Broad-based policy interventions should focus on relieving the chronic strains associated with being a single parent of a child with cancer. © 2012 John Wiley & Sons Ltd.
Direct Sequence Spread Spectrum (DSSS) Receiver, User Manual
2008-01-01
Sampled data is clocked into correlator data registers and a comparison is made between the code and data register contents, producing a correlation ... symbol (equal to the processing gain Gp) but need not be otherwise synchronised with the spreading codes. This allows a very long and noise-like PRBS ... and Q channels are independently but synchronously sampled. (Receiver block-diagram labels: complex/real ADC, FIR filter, interpolator, acquisition correlators.)
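The despreading step described above, correlating incoming samples against the local PRBS code and reading out one decision per symbol, can be sketched very simply. The Python example below is a bare-bones, real-valued baseband illustration (a real receiver works on complex I/Q samples after the ADC, FIR filter, and interpolator stages named in the block diagram); the code length and noise level are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)
code = rng.choice([-1, 1], size=63)        # local spreading code (chips), +/-1
data_bits = np.array([1, -1, 1, 1, -1])    # symbols to transmit

tx = np.concatenate([b * code for b in data_bits])   # spread signal
rx = tx + 0.5 * rng.standard_normal(tx.size)         # noisy channel

# Correlate one code period at a time; the sign of the correlation
# recovers the symbol, and its magnitude approaches the processing gain Gp.
Gp = code.size
for k in range(data_bits.size):
    segment = rx[k * Gp:(k + 1) * Gp]
    corr = np.dot(segment, code)
    print(f"symbol {k}: correlation {corr:+.1f} -> bit {1 if corr > 0 else -1}")
```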
Superdense Coding over Optical Fiber Links with Complete Bell-State Measurements
Williams, Brian P.; Sadlier, Ronald J.; Humble, Travis S.
2017-02-01
Adopting quantum communication to modern networking requires transmitting quantum information through a fiber-based infrastructure. In this paper, we report the first demonstration of superdense coding over optical fiber links, taking advantage of a complete Bell-state measurement enabled by time-polarization hyperentanglement, linear optics, and common single-photon detectors. Finally, we demonstrate the highest single-qubit channel capacity to date utilizing linear optics, 1.665 ± 0.018, and we provide a full experimental implementation of a hybrid, quantum-classical communication protocol for image transfer.
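Superdense coding sends two classical bits per transmitted qubit by applying one of four Pauli operations to one half of a shared Bell pair and then performing a complete Bell-state measurement on both halves. A minimal NumPy sketch of that state-level bookkeeping is shown below; it is an idealized illustration (no fiber loss, detectors, or hyperentanglement), not the experimental setup reported in the paper.

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)   # shared Bell state |Phi+>

# Alice encodes two classical bits by acting on her qubit only.
encodings = {(0, 0): I2, (0, 1): X, (1, 0): Z, (1, 1): Z @ X}

# Bob's complete Bell-state measurement basis.
bell_basis = {
    (0, 0): np.array([1, 0, 0, 1]) / np.sqrt(2),
    (0, 1): np.array([0, 1, 1, 0]) / np.sqrt(2),
    (1, 0): np.array([1, 0, 0, -1]) / np.sqrt(2),
    (1, 1): np.array([0, 1, -1, 0]) / np.sqrt(2),
}

for bits, U in encodings.items():
    sent = np.kron(U, I2) @ phi_plus    # Alice's qubit is the first factor
    probs = {b: abs(np.vdot(v, sent)) ** 2 for b, v in bell_basis.items()}
    decoded = max(probs, key=probs.get)
    assert decoded == bits
    print(bits, "->", decoded)
```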
1986-09-30
Tables: I. SA32-40 Single Event Upset Test, 1140-MeV Krypton, 9/18/84; II. CRUP Simulation... The cosmic ray interaction analyses described in the remainder of this report were calculated using the CRUP computer code modified for funneling. The CRUP code requires, as inputs, the size of a depletion region specified as a rectangular parallelepiped with dimensions a × b × c, the effective funnel
Writing in the Secondary-Level Disciplines: A Systematic Review of Context, Cognition, and Content
ERIC Educational Resources Information Center
Miller, Diane M.; Scott, Chyllis E.; McTigue, Erin M.
2018-01-01
Situated within the historical and current state of writing and adolescent literacy research, this systematic literature review screened 3504 articles to determine the prevalent themes in current research on writing tasks in content-area classrooms. Each of the 3504 studies was evaluated and coded using seven methodological quality indicators. The…
2 CFR 1.105 - Organization and subtitle content.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 2 Grants and Agreements 1 Organization and subtitle content. 1.105 Section 1.105 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Title 2 of the CFR § 1...
ERIC Educational Resources Information Center
Egmir, Eray; Erdem, Cahit; Koçyigit, Mehmet
2017-01-01
The aim of this study is to analyse the studies published in "International Journal of Instruction" ["IJI"] in the last ten years. This study is a qualitative, descriptive literature review study. The data was collected through document analysis, coded using constant comparison and analysed using content analysis. Frequencies…
Forest value orientations in Australia: an application of computer content analysis
Trevor J. Webb; David N. Bengston; David P. Fan
2008-01-01
This article explores the expression of three forest value orientations that emerged from an analysis of Australian news media discourse about the management of Australian native forests from August 1, 1997 through December 31, 2004. Computer-coded content analysis was used to measure and track the relative importance of commodity, ecological and moral/spiritual/...
Content and Language Integrated Learning through an Online Game in Primary School: A Case Study
ERIC Educational Resources Information Center
Dourda, Kyriaki; Bratitsis, Tharrenos; Griva, Eleni; Papadopoulou, Penelope
2014-01-01
In this paper an educational design proposal is presented which combines two well established teaching approaches, that of Game-based Learning (GBL) and Content and Language Integrated Learning (CLIL). The context of the proposal was the design of an educational geography computer game, utilizing QR Codes and Google Earth for teaching English…
An Open Source Agenda for Research Linking Text and Image Content Features.
ERIC Educational Resources Information Center
Goodrum, Abby A.; Rorvig, Mark E.; Jeong, Ki-Tai; Suresh, Chitturi
2001-01-01
Proposes methods to utilize image primitives to support term assignment for image classification. Proposes to release code for image analysis in a common tool set for other researchers to use. Of particular focus is the expansion of work by researchers in image indexing to include image content-based feature extraction capabilities in their work.…
Pretty Risky Behavior: A Content Analysis of Youth Risk Behaviors in "Pretty Little Liars"
ERIC Educational Resources Information Center
Hall, Cougar; West, Joshua; Herbert, Patrick C.
2015-01-01
Adolescent consumption of screen media continues to increase. A variety of theoretical constructs hypothesize the impact of media content on health-related attitudes, beliefs, and behaviors. This study uses a coding instrument based on the Centers for Disease Control and Prevention Youth Risk Behavior Survey to analyze health behavior contained in…
Supports for Vocabulary Instruction in Early Language and Literacy Methods Textbooks
ERIC Educational Resources Information Center
Wright, Tanya S.; Peltier, Marliese R.
2016-01-01
The goal of this study was to examine the extent to which the content and recommendations in recently published early language and literacy methods textbooks may support early childhood teachers in learning to provide vocabulary instruction for young children. We completed a content analysis of 9 textbooks with coding at the sentence level.…
The Challenges of Planning Language Objectives in Content-Based ESL Instruction
ERIC Educational Resources Information Center
Baecher, Laura; Farnsworth, Tim; Ediger, Anne
2014-01-01
The purpose of this research was to investigate the major patterns in content-based instruction (CBI) lesson plans among practicum teachers at the final stage of an MA TESOL program. One hundred and seven lesson plans were coded according to a typology developed to evaluate clarity and identify areas of potential difficulty in the design of…
ERIC Educational Resources Information Center
Jones, Dustin; Hollas, Victoria; Klespis, Mark
2017-01-01
This article presents an overview of the ways technology is presented in textbooks written for mathematics content courses for prospective elementary teachers. Six popular textbooks comprising a total of more than 5,000 pages were examined, and 1,055 distinct references to technology were identified. These references are coded according to…
Children's Literature on Recycling: What Does It Contribute to Environmental Literacy?
ERIC Educational Resources Information Center
Christenson, Mary A.
2008-01-01
This article addresses the content and use of children's literature to teach about recycling. Twenty children's books were examined using a coding system created to compare and generalize about the content of the books. It was found that although the books can be useful for providing basic information they fall short in asking children to think…
ERIC Educational Resources Information Center
Scott, Wendy; Suh, Yonghee
2015-01-01
This content analysis explored how Civics and Government textbooks and the Virginia Standards of Learning for Civics and Government courses reflect citizenship outcomes, specifically deconstructing the unique needs of marginalized students. The coding frame was constructed by using themes and categories from previous literature, specifically…
21 CFR 172.340 - Fish protein isolate.
Code of Federal Regulations, 2014 CFR
2014-04-01
21 CFR, Food and Drugs, vol. 3 (2014-04-01): Fish protein isolate, Section 172.340. … /federal_register/code_of_federal_regulations/ibr_locations.html. (1) Protein content, as N × 6.25, shall … described in section 24.003, Air Drying (1)—Official First Action. (3) Fat content shall not be more than 0…
ERIC Educational Resources Information Center
Erdogan, Ibrahim; Campbell, Todd; Abd-Hamid, Nor Hashidah
2011-01-01
This study describes the development of an instrument to investigate the extent to which student-centered actions are occurring in science classrooms. The instrument was developed through the following five stages: (1) student action identification, (2) use of both national and international content experts to establish content validity, (3)…
ERIC Educational Resources Information Center
Sims, Judy R.; Giordano, Joseph
A research study assessed the amount of front-page newspaper coverage allotted to "character/competence/image" issues versus "platform/political" issues in the 1992 presidential campaign. Using textual analysis, a methodology of content analysis, researchers coded the front pages of the following 5 newspapers between August 1 and…
An evaluation of four single element airfoil analytic methods
NASA Technical Reports Server (NTRS)
Freuler, R. J.; Gregorek, G. M.
1979-01-01
A comparison of four computer codes for the analysis of two-dimensional single element airfoil sections is presented for three classes of section geometries. Two of the computer codes utilize vortex singularities methods to obtain the potential flow solution. The other two codes solve the full inviscid potential flow equation using finite differencing techniques, allowing results to be obtained for transonic flow about an airfoil including weak shocks. Each program incorporates boundary layer routines for computing the boundary layer displacement thickness and boundary layer effects on aerodynamic coefficients. Computational results are given for a symmetrical section represented by an NACA 0012 profile, a conventional section illustrated by an NACA 65A413 profile, and a supercritical type section for general aviation applications typified by a NASA LS(1)-0413 section. The four codes are compared and contrasted in the areas of method of approach, range of applicability, agreement among each other and with experiment, individual advantages and disadvantages, computer run times and memory requirements, and operational idiosyncrasies.
NASA Technical Reports Server (NTRS)
Yeh, Pen-Shu (Inventor)
1997-01-01
A pre-coding method and device for improving data compression performance by removing correlation between a first original data set and a second original data set, each having M members, respectively. The pre-coding method produces a compression-efficiency-enhancing double-difference data set. The method and device produce a double-difference data set, i.e., an adjacent-delta calculation performed on a cross-delta data set or a cross-delta calculation performed on two adjacent-delta data sets, from either one of (1) two adjacent spectral bands coming from two discrete sources, respectively, or (2) two time-shifted data sets coming from a single source. The resulting double-difference data set is then coded using either a distortionless data encoding scheme (entropy encoding) or a lossy data compression scheme. Also, a post-decoding method and device for recovering a second original data set having been represented by such a double-difference data set.
NASA Technical Reports Server (NTRS)
Yeh, Pen-Shu (Inventor)
1998-01-01
A pre-coding method and device for improving data compression performance by removing correlation between a first original data set and a second original data set, each having M members, respectively. The pre-coding method produces a compression-efficiency-enhancing double-difference data set. The method and device produce a double-difference data set, i.e., an adjacent-delta calculation performed on a cross-delta data set or a cross-delta calculation performed on two adjacent-delta data sets, from either one of (1) two adjacent spectral bands coming from two discrete sources, respectively, or (2) two time-shifted data sets coming from a single source. The resulting double-difference data set is then coded using either a distortionless data encoding scheme (entropy encoding) or a lossy data compression scheme. Also, a post-decoding method and device for recovering a second original data set having been represented by such a double-difference data set.
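The two patent records above describe the same pre-coding idea in words; a minimal sketch of that idea follows, assuming two equal-length NumPy arrays standing in for two adjacent spectral bands (or two time-shifted data sets). The function names and the example data are illustrative, not taken from the patent.

```python
import numpy as np

def cross_delta(a, b):
    # Element-wise difference between two correlated data sets
    # (e.g., adjacent spectral bands or two time-shifted frames).
    return b - a

def adjacent_delta(x):
    # Difference between consecutive members; the first member is kept
    # so the set can be recovered by a cumulative sum.
    return np.concatenate(([x[0]], np.diff(x)))

def double_difference(a, b):
    # One of the two orderings described above: adjacent-delta of the cross-delta.
    return adjacent_delta(cross_delta(a, b))

rng = np.random.default_rng(0)
band_a = np.cumsum(rng.integers(-2, 3, size=16))
band_b = band_a + 5 + rng.integers(-1, 2, size=16)   # strongly correlated with band_a
dd = double_difference(band_a, band_b)
print(dd)                                             # values cluster near zero, which entropy coding exploits
assert np.array_equal(band_a + np.cumsum(dd), band_b) # the second set is recoverable
```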
Employing multi-GPU power for molecular dynamics simulation: an extension of GALAMOST
NASA Astrophysics Data System (ADS)
Zhu, You-Liang; Pan, Deng; Li, Zhan-Wei; Liu, Hong; Qian, Hu-Jun; Zhao, Yang; Lu, Zhong-Yuan; Sun, Zhao-Yan
2018-04-01
We describe an algorithm for employing multi-GPU power, on the basis of Message Passing Interface (MPI) domain decomposition, in the molecular dynamics code GALAMOST, which is designed for the coarse-grained simulation of soft matter. The multi-GPU version of the code is developed from our previous single-GPU version. In multi-GPU runs, one GPU takes charge of one domain and runs the single-GPU code path. The communication between neighbouring domains follows an algorithm similar to that of the CPU-based code LAMMPS, but is optimised specifically for GPUs. We employ a memory-saving design that enlarges the maximum system size achievable on the same device. An optimisation algorithm is employed to prolong the update period of the neighbour list. We demonstrate good performance of multi-GPU runs on a workstation for the simulation of Lennard-Jones liquid, dissipative particle dynamics liquid, a polymer-nanoparticle composite, and two-patch particles. Good scaling across many cluster nodes is presented for two-patch particles.
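A minimal mpi4py sketch of the communication pattern described above (one rank per domain, boundary particles exchanged with neighbours); it illustrates only the MPI domain-decomposition idea, not GALAMOST's GPU kernels, and the 1D periodic decomposition, cutoff, and particle counts are assumptions.

```python
# Run as: mpiexec -n 4 python halo_exchange.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# Each rank owns one slab of a periodic 1D box and the particles inside it.
box = 10.0
slab = box / size
lo, hi = rank * slab, (rank + 1) * slab
rng = np.random.default_rng(rank)
particles = np.sort(rng.uniform(lo, hi, 50))

# Particles within a cutoff of a slab edge must be visible to the neighbouring domain.
cutoff = 0.5
send_right = particles[particles > hi - cutoff]
send_left = particles[particles < lo + cutoff]

right = (rank + 1) % size
left = (rank - 1) % size

# Exchange halo particles with both neighbours (pickle-based sendrecv keeps the sketch short).
from_left = comm.sendrecv(send_right, dest=right, source=left)
from_right = comm.sendrecv(send_left, dest=left, source=right)

print(f"rank {rank}: received {len(from_left) + len(from_right)} ghost particles")
```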
Regoui, Chaouki; Durand, Guillaume; Belliveau, Luc; Léger, Serge
2013-01-01
This paper presents a novel hybrid DNA encryption (HyDEn) approach that uses randomized assignments of unique error-correcting DNA Hamming code words for single characters in the extended ASCII set. HyDEn relies on custom-built quaternary codes and a private key used in the randomized assignment of code words and the cyclic permutations applied on the encoded message. Along with its ability to detect and correct errors, HyDEn equals or outperforms existing cryptographic methods and represents a promising in silico DNA steganographic approach. PMID:23984392
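HyDEn's quaternary codes, private key, and cyclic permutations are not spelled out in the abstract, so the sketch below only illustrates the general ingredient of mapping a single extended-ASCII character to DNA through an error-correcting Hamming code: here the ordinary binary Hamming(7,4) is applied to each nibble and the resulting bits are packed two per base. This is an assumption for illustration, not the published HyDEn scheme.

```python
def hamming74(nibble):
    # Encode 4 data bits as the 7-bit Hamming(7,4) codeword [p1, p2, d1, p3, d2, d3, d4].
    d1, d2, d3, d4 = nibble
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

BASES = "ACGT"  # two bits per base

def char_to_dna(ch):
    # One extended-ASCII character -> 14 code bits -> 7 DNA bases (illustrative only).
    bits = [(ord(ch) >> i) & 1 for i in range(7, -1, -1)]   # 8 bits, MSB first
    code = hamming74(bits[:4]) + hamming74(bits[4:])        # 14 bits total
    pairs = zip(code[0::2], code[1::2])
    return "".join(BASES[2 * hi + lo] for hi, lo in pairs)

print(char_to_dna("A"))  # a 7-base word standing in for the character 'A'
```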
Alcohol Content in the 'Hyper-Reality' MTV Show 'Geordie Shore'.
Lowe, Eden; Britton, John; Cranwell, Jo
2018-05-01
To quantify the occurrence of alcohol content, including alcohol branding, in the popular primetime UK reality TV show 'Geordie Shore' Series 11. A 1-min interval coding content analysis of alcohol content in the entire DVD Series 11 of 'Geordie Shore' (10 episodes). Occurrence of alcohol use, implied use, other alcohol references/paraphernalia or branding was recorded. All categories of alcohol were present in all episodes. 'Any alcohol' content occurred in 78%, 'actual alcohol use' in 30%, 'inferred alcohol use' in 72%, and all 'other' alcohol references in 59% of all coding intervals (ACIs). Brand appearances occurred in 23% of ACIs. The most frequently observed alcohol brand was Smirnoff, which appeared in 43% of all brand appearances. Episodes categorized as suitable for viewing by adolescents below the legal drinking age of 18 years accounted for 61% of all brand appearances. Alcohol content, including branding, is highly prevalent in the UK reality TV show 'Geordie Shore' Series 11. Two-thirds of all alcohol branding occurred in episodes age-rated by the British Board of Film Classification (BBFC) as suitable for viewers aged 15 years. The organizations OfCom, the Advertising Standards Authority (ASA) and the Portman Group should implement more effective policies to reduce adolescent exposure to on-screen drinking. The drinks industry should consider demanding the withdrawal of their brands from the show. Alcohol content, including branding, is highly prevalent in the MTV reality TV show 'Geordie Shore' Series 11. Current alcohol regulation is failing to protect young viewers from exposure to such content.
Current Research on Non-Coding Ribonucleic Acid (RNA).
Wang, Jing; Samuels, David C; Zhao, Shilin; Xiang, Yu; Zhao, Ying-Yong; Guo, Yan
2017-12-05
Non-coding ribonucleic acid (RNA) has without a doubt captured the interest of biomedical researchers. The ability to screen the entire human genome with high-throughput sequencing technology has greatly enhanced the identification, annotation and prediction of the functionality of non-coding RNAs. In this review, we discuss the current landscape of non-coding RNA research and quantitative analysis. Non-coding RNA will be categorized into two major groups by size: long non-coding RNAs and small RNAs. In long non-coding RNA, we discuss regular long non-coding RNA, pseudogenes and circular RNA. In small RNA, we discuss miRNA, transfer RNA, piwi-interacting RNA, small nucleolar RNA, small nuclear RNA, Y RNA, signal recognition particle RNA, and 7SK RNA. We elaborate on the origin, detection method, and potential association with disease, putative functional mechanisms, and public resources for these non-coding RNAs. We aim to provide readers with a complete overview of non-coding RNAs and incite additional interest in non-coding RNA research.
2006-01-01
… collected, code both. Type-of-analysis codes include: A, physical properties; B, common ions; I, common ions/trace elements; J, sanitary analysis; … (1) A ground-water site is coded as if it is a single point, not a geographic area or property. (2) Latitude and longitude should be determined at a … terrace from an adjacent upland on one side, and a lowland coast or valley on the other. Due to the effects of erosion, the terrace surface may not be as …
With or without you: predictive coding and Bayesian inference in the brain
Aitchison, Laurence; Lengyel, Máté
2018-01-01
Two theoretical ideas have emerged recently with the ambition to provide a unifying functional explanation of neural population coding and dynamics: predictive coding and Bayesian inference. Here, we describe the two theories and their combination into a single framework: Bayesian predictive coding. We clarify how the two theories can be distinguished, despite sharing core computational concepts and addressing an overlapping set of empirical phenomena. We argue that predictive coding is an algorithmic / representational motif that can serve several different computational goals of which Bayesian inference is but one. Conversely, while Bayesian inference can utilize predictive coding, it can also be realized by a variety of other representations. We critically evaluate the experimental evidence supporting Bayesian predictive coding and discuss how to test it more directly. PMID:28942084
NASA Astrophysics Data System (ADS)
Juhler, Martin Vogt
2016-08-01
Recent research, both internationally and in Norway, has clearly expressed concerns about missing connections between subject-matter knowledge, pedagogical competence and real-life practice in schools. This study addresses this problem within the domain of field practice in teacher education, studying pre-service teachers' planning of a Physics lesson. Two means of intervention were introduced. The first was lesson study, which is a method for planning, carrying out and reflecting on a research lesson in detail with a learner and content-centered focus. This was used in combination with a second means, content representations, which is a systematic tool that connects overall teaching aims with pedagogical prompts. Changes in teaching were assessed through the construct of pedagogical content knowledge (PCK). A deductive coding analysis was carried out for this purpose. Transcripts of pre-service teachers' planning of a Physics lesson were coded into four main PCK categories, which were thereafter divided into 16 PCK sub-categories. The results showed that the intervention affected the pre-service teachers' potential to start developing PCK. First, they focused much more on categories concerning the learners. Second, they focused far more uniformly in all of the four main categories comprising PCK. Consequently, these differences could affect their potential to start developing PCK.
Nadimi, Maryam; Daubois, Laurence; Hijri, Mohamed
2016-05-01
Mitochondrial (mt) genes, such as cytochrome C oxidase genes (cox), have been widely used for barcoding in many groups of organisms, although this approach has been less powerful in the fungal kingdom due to the rapid evolution of their mt genomes. The use of mt genes in phylogenetic studies of Dikarya has been met with success, while early diverging fungal lineages remain less studied, particularly the arbuscular mycorrhizal fungi (AMF). Advances in next-generation sequencing have substantially increased the number of publicly available mtDNA sequences for the Glomeromycota. As a result, comparison of mtDNA across key AMF taxa can now be applied to assess the phylogenetic signal of individual mt coding genes, as well as concatenated subsets of coding genes. Here we show comparative analyses of publicly available mt genomes of Glomeromycota, augmented with two mtDNA genomes that were newly sequenced for this study (Rhizophagus irregularis DAOM240159 and Glomus aggregatum DAOM240163), resulting in 16 complete mtDNA datasets. R. irregularis isolate DAOM240159 and G. aggregatum isolate DAOM240163 showed mt genomes measuring 72,293 bp and 69,505 bp with G+C contents of 37.1% and 37.3%, respectively. We assessed the phylogenies inferred from single mt genes and complete sets of coding genes, which are referred to as "supergenes" (16 concatenated coding genes), using Shimodaira-Hasegawa tests, in order to identify genes that best described AMF phylogeny. We found that rnl, nad5, cox1, and nad2 genes, as well as a concatenated subset of these genes, provided phylogenies that were similar to the supergene set. This mitochondrial genomic analysis was also combined with principal coordinate and partitioning analyses, which helped to unravel certain evolutionary relationships in the Rhizophagus genus and for G. aggregatum within the Glomeromycota. We showed evidence to support the position of G. aggregatum within the R. irregularis 'species complex'. Copyright © 2016 Elsevier Inc. All rights reserved.
McEvoy, Matthew D.; Smalley, Jeremy C.; Nietert, Paul J.; Field, Larry C.; Furse, Cory M.; Blenko, John W.; Cobb, Benjamin G.; Walters, Jenna L.; Pendarvis, Allen; Dalal, Nishita S.; Schaefer, John J.
2012-01-01
Introduction Defining valid, reliable, defensible, and generalizable standards for the evaluation of learner performance is a key issue in assessing both baseline competence and mastery in medical education. However, prior to setting these standards of performance, the reliability of the scores yielded by a grading tool must be assessed. Accordingly, the purpose of this study was to assess the reliability of scores generated from a set of grading checklists used by non-expert raters during simulations of American Heart Association (AHA) MegaCodes. Methods The reliability of scores generated from a detailed set of checklists, when used by four non-expert raters, was tested by grading team leader performance in eight MegaCode scenarios. Videos of the scenarios were reviewed and rated by trained faculty facilitators and by a group of non-expert raters. The videos were reviewed "continuously" and "with pauses." Two content experts served as the reference standard for grading, and four non-expert raters were used to test the reliability of the checklists. Results Our results demonstrate that non-expert raters are able to produce reliable grades when using the checklists under consideration, demonstrating excellent intra-rater reliability and agreement with a reference standard. The results also demonstrate that non-expert raters can be trained in the proper use of the checklist in a short amount of time, with no discernible learning curve thereafter. Finally, our results show that a single trained rater can achieve reliable scores of team leader performance during AHA MegaCodes when using our checklist in continuous mode, as measures of agreement in total scoring were very strong (Lin's Concordance Correlation Coefficient = 0.96; Intraclass Correlation Coefficient = 0.97). Discussion We have shown that our checklists can yield reliable scores, are appropriate for use by non-expert raters, and are able to be employed during continuous assessment of team leader performance during the review of a simulated MegaCode. This checklist may be more appropriate for use by Advanced Cardiac Life Support (ACLS) instructors during MegaCode assessments than current tools provided by the AHA. PMID:22863996
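The concordance statistic quoted above can be reproduced from paired rater scores; a minimal sketch of Lin's concordance correlation coefficient follows, with made-up checklist totals standing in for the study's data.

```python
import numpy as np

def lins_ccc(x, y):
    # Lin's concordance correlation coefficient between two raters' paired scores.
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()                 # population (biased) variances, per Lin (1989)
    cov = ((x - mx) * (y - my)).mean()
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Hypothetical checklist totals from a non-expert rater and the reference standard.
rater = [78, 85, 90, 72, 88, 95, 80, 84]
reference = [80, 84, 91, 70, 89, 94, 82, 85]
print(round(lins_ccc(rater, reference), 3))
```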
Impact of packet losses in scalable 3D holoscopic video coding
NASA Astrophysics Data System (ADS)
Conti, Caroline; Nunes, Paulo; Ducla Soares, Luís.
2014-05-01
Holoscopic imaging has become a promising glasses-free 3D technology for providing more natural 3D viewing experiences to the end user. Additionally, holoscopic systems also allow new post-production degrees of freedom, such as controlling the plane of focus or the viewing angle presented to the user. However, to successfully introduce this technology into the consumer market, a display-scalable coding approach is essential to achieve backward compatibility with legacy 2D and 3D displays. Moreover, to effectively transmit 3D holoscopic content over error-prone networks, e.g., wireless networks or the Internet, error resilience techniques are required to mitigate the impact of data impairments on the quality perceived by the user. Therefore, it is essential to deeply understand the impact of packet losses in terms of decoded video quality for the specific case of 3D holoscopic content, notably when a scalable approach is used. In this context, this paper studies the impact of packet losses when using a three-layer display-scalable 3D holoscopic video coding architecture previously proposed, where each layer represents a different level of display scalability (i.e., L0 - 2D, L1 - stereo or multiview, and L2 - full 3D holoscopic). For this, a simple error concealment algorithm is used, which makes use of inter-layer redundancy between multiview and 3D holoscopic content and the inherent correlation of the 3D holoscopic content to estimate lost data. Furthermore, a study of the influence of the 2D view generation parameters used in lower layers on the performance of the error concealment algorithm is also presented.
High-Content Optical Codes for Protecting Rapid Diagnostic Tests from Counterfeiting.
Gökçe, Onur; Mercandetti, Cristina; Delamarche, Emmanuel
2018-06-19
Warnings and reports on counterfeit diagnostic devices are released several times a year by regulators and public health agencies. Unfortunately, mishandling, altering, and counterfeiting point-of-care diagnostics (POCDs) and rapid diagnostic tests (RDTs) is lucrative, relatively simple and can lead to devastating consequences. Here, we demonstrate how to implement optical security codes in silicon- and nitrocellulose-based flow paths for device authentication using a smartphone. The codes are created by inkjet spotting inks directly on nitrocellulose or on micropillars. Codes containing up to 32 elements per mm^2 and 8 colors can encode as many as 10^45 combinations. Codes on silicon micropillars can be erased by setting a continuous flow path across the entire array of code elements or for nitrocellulose by simply wicking a liquid across the code. Static or labile code elements can further be formed on nitrocellulose to create a hidden code using poly(ethylene glycol) (PEG) or glycerol additives to the inks. More advanced codes having a specific deletion sequence can also be created in silicon microfluidic devices using an array of passive routing nodes, which activate in a particular, programmable sequence. Such codes are simple to fabricate, easy to view, and efficient in coding information; they can be ideally used in combination with information on a package to protect diagnostic devices from counterfeiting.
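The coding capacity quoted above follows from simple counting: with c ink colors available at each of n independent code elements there are c^n distinct codes. The element count used below (about 50) is an assumption chosen only to reproduce the stated order of magnitude.

```python
# c colors at each of n independent code elements -> c**n distinct codes.
c, n = 8, 50
print(f"{c ** n:.2e}")   # ~1.43e+45, i.e. on the order of 10^45 as stated above
```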
Meher, J K; Meher, P K; Dash, G N; Raval, M K
2012-01-01
The first step in gene identification problem based on genomic signal processing is to convert character strings into numerical sequences. These numerical sequences are then analysed spectrally or using digital filtering techniques for the period-3 peaks, which are present in exons (coding areas) and absent in introns (non-coding areas). In this paper, we have shown that single-indicator sequences can be generated by encoding schemes based on physico-chemical properties. Two new methods are proposed for generating single-indicator sequences based on hydration energy and dipole moments. The proposed methods produce high peak at exon locations and effectively suppress false exons (intron regions having greater peak than exon regions) resulting in high discriminating factor, sensitivity and specificity.
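A minimal sketch of the period-3 analysis described above, assuming a plain binary single-indicator sequence (1 where a chosen nucleotide occurs, 0 otherwise) rather than the hydration-energy or dipole-moment encodings proposed in the paper; the test sequences are made up.

```python
import numpy as np

def period3_power(seq, base="G"):
    # Relative spectral power at the period-3 frequency for a single-indicator sequence.
    x = np.array([1.0 if b == base else 0.0 for b in seq])
    X = np.fft.fft(x - x.mean())
    k = len(x) // 3                                  # DFT bin corresponding to period 3
    total = np.sum(np.abs(X) ** 2) + 1e-12
    return np.abs(X[k]) ** 2 / total

exon_like = "ATG" * 13                               # G occurs at every third position
control = "".join(np.random.default_rng(1).permutation(list(exon_like)))
print(round(period3_power(exon_like), 3), round(period3_power(control), 3))
# The periodic "exon-like" stretch shows a strong period-3 peak; the shuffled control does not.
```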
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chakraborty, S.; Kroposki, B.; Kramer, W.
Integrating renewable energy and distributed generation into the Smart Grid architecture requires power electronics (PE) for energy conversion. The key to reaching successful Smart Grid implementation is to develop interoperable, intelligent, and advanced PE technology that improves and accelerates the use of distributed energy resource systems. This report describes the simulation, design, and testing of a single-phase DC-to-AC inverter developed to operate in both islanded and utility-connected mode. It provides results on both the simulations and the experiments conducted, demonstrating the ability of the inverter to provide advanced control functions such as power flow and VAR/voltage regulation. This report also analyzes two different techniques used for digital signal processor (DSP) code generation. Initially, the DSP code was written in the C programming language using Texas Instruments' Code Composer Studio. In a later stage of the research, the Simulink DSP toolbox was used to self-generate code for the DSP. The successful tests using Simulink self-generated DSP codes show promise for fast prototyping of PE controls.
NASA Astrophysics Data System (ADS)
Elgaud, M. M.; Zan, M. S. D.; Abushagur, A. G.; Bakar, A. Ashrif A.
2017-07-01
This paper reports the employment of autocorrelation properties of Golay complementary codes (GCC) to enhance the performance of the time domain multiplexing fiber Bragg grating (TDM-FBG) sensing network. By encoding the light from laser with a stream of non-return-to-zero (NRZ) form of GCC and launching it into the sensing area that consists of the FBG sensors, we have found that the FBG signals can be decoded correctly with the autocorrelation calculations, confirming the successful demonstration of coded TDM-FBG sensor network. OptiGrating and OptiSystem simulators were used to design customized FBG sensors and perform the coded TDM-FBG sensor simulations, respectively. Results have substantiated the theoretical dependence of SNR enhancement on the code length of GCC, where the maximum SNR improvement of about 9 dB is achievable with the use of 256 bits of GCC compared to that of 4 bits case. Furthermore, the GCC has also extended the strain exposure up to 30% higher compared to the maximum of the conventional single pulse case. The employment of GCC in the TDM-FBG sensor system provides overall performance enhancement over the conventional single pulse case, under the same conditions.
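The noise-averaging property relied upon above comes from the defining identity of Golay complementary pairs: the sum of their aperiodic autocorrelations vanishes at every lag except zero. A short sketch follows, with the pair generated by the standard recursive construction (an assumption for illustration; the paper's NRZ pulse shaping and FBG simulation are not reproduced here).

```python
import numpy as np

def golay_pair(n_doublings):
    # Binary (+1/-1) Golay complementary pair of length 2**n_doublings, built recursively.
    a, b = np.array([1]), np.array([1])
    for _ in range(n_doublings):
        a, b = np.concatenate([a, b]), np.concatenate([a, -b])
    return a, b

def acorr(x):
    # Aperiodic autocorrelation of a real sequence.
    return np.correlate(x, x, mode="full")

a, b = golay_pair(8)                     # 256-bit pair, as in the longest case reported above
combined = acorr(a) + acorr(b)
print(combined[len(a) - 1])              # 2N = 512 at zero lag
print(np.abs(np.delete(combined, len(a) - 1)).max())   # 0 at every other lag
```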
Large-Signal Code TESLA: Current Status and Recent Development
2008-04-01
K. Eppley and J. J. Petillo, "High-power four-cavity S-band multiple-beam klystron design," IEEE Trans. Plasma Sci., vol. 32, pp. 1119-1135, June 2004. … advances in the development of the large-signal code TESLA, mainly used for the modeling of high-power single-beam and multiple-beam klystron amplifiers. Keywords: large-signal code; multiple-beam klystrons; serial and parallel versions. Introduction: The optimization and design of new high power …
Asymmetric Memory Circuit Would Resist Soft Errors
NASA Technical Reports Server (NTRS)
Buehler, Martin G.; Perlman, Marvin
1990-01-01
Some nonlinear error-correcting codes are more efficient in the presence of asymmetry. A combination of circuit-design and coding concepts is expected to make integrated-circuit random-access memories more resistant to "soft" errors (temporary bit errors, also called "single-event upsets," due to ionizing radiation). An integrated circuit of the new type is made deliberately more susceptible to one kind of bit error than to the other, and the associated error-correcting code is adapted to exploit this asymmetry in error probabilities.
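The abstract does not name the code family used, so the sketch below only illustrates the general idea with a classic asymmetric-error construction, the Berger code, which appends a count of zero bits and therefore detects any number of errors that all flip bits in the same direction. Treating it as the circuit's actual code would be an assumption.

```python
def berger_encode(data_bits):
    # Append the count of 0-bits, written as a fixed-width binary check field.
    check_len = max(1, len(bin(len(data_bits))) - 2)
    zeros = data_bits.count(0)
    check = [(zeros >> i) & 1 for i in range(check_len - 1, -1, -1)]
    return data_bits + check

def berger_ok(word, data_len):
    # Recompute the zero count of the data and compare it to the stored check field.
    data, check = word[:data_len], word[data_len:]
    return data.count(0) == int("".join(map(str, check)), 2)

word = berger_encode([1, 0, 1, 1, 0, 0, 1, 0])   # data has four 0-bits
print(berger_ok(word, 8))                        # True
corrupted = list(word)
corrupted[0] = corrupted[2] = 0                  # two unidirectional 1->0 upsets in the data
print(berger_ok(corrupted, 8))                   # False: the zero count no longer matches
```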
GPS receiver CODE bias estimation: A comparison of two methods
NASA Astrophysics Data System (ADS)
McCaffrey, Anthony M.; Jayachandran, P. T.; Themens, D. R.; Langley, R. B.
2017-04-01
The Global Positioning System (GPS) is a valuable tool in the measurement and monitoring of ionospheric total electron content (TEC). To obtain accurate GPS-derived TEC, satellite and receiver hardware biases, known as differential code biases (DCBs), must be estimated and removed. The Center for Orbit Determination in Europe (CODE) provides monthly averages of receiver DCBs for a significant number of stations in the International Global Navigation Satellite Systems Service (IGS) network. A comparison of the monthly receiver DCBs provided by CODE with DCBs estimated using the minimization of standard deviations (MSD) method, on both daily and monthly time intervals, is presented. Calibrated TEC obtained using CODE-derived DCBs is accurate to within 0.74 TEC units (TECU) in differenced slant TEC (sTEC), while calibrated sTEC using MSD-derived DCBs results in an accuracy of 1.48 TECU.
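A toy sketch of the minimization-of-standard-deviations idea: a candidate receiver bias is subtracted from the measured slant TEC, the result is mapped to vertical TEC, and the bias that makes the vertical TEC values most self-consistent (smallest spread) is kept. The mapping function, simulated data, and search grid are illustrative assumptions, not the authors' processing chain.

```python
import numpy as np

rng = np.random.default_rng(2)
true_vtec, true_bias = 20.0, 7.0                 # TECU (made-up values)
elev = np.deg2rad(rng.uniform(20, 80, 200))      # satellite elevations
mapping = 1.0 / np.sin(elev)                     # simple thin-shell-style mapping factor
stec = true_vtec * mapping + true_bias + rng.normal(0, 0.3, elev.size)

def spread(bias):
    # Standard deviation of vertical TEC after removing a candidate receiver bias.
    return np.std((stec - bias) / mapping)

candidates = np.arange(-20, 20, 0.05)
best = candidates[np.argmin([spread(b) for b in candidates])]
print(round(best, 2))                            # close to the simulated 7.0 TECU bias
```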
NASA Electronic Library System (NELS) optimization
NASA Technical Reports Server (NTRS)
Pribyl, William L.
1993-01-01
This is a compilation of NELS (NASA Electronic Library System) Optimization progress/problem, interim, and final reports for all phases. The NELS database was examined, particularly its memory use, disk contention, and CPU load, to discover bottlenecks. Methods to increase the speed of the NELS code were investigated. The tasks included restructuring the existing code to interact with other components more effectively. Error-reporting code was added to help detect and remove bugs in NELS. Report-writing tools were recommended for integration with the ASV3 system. The Oracle database management system and tools were to be installed on a Sun workstation, intended for demonstration purposes.
Cheng, Hui; Li, Jinfeng; Zhang, Hong; Cai, Binhua; Gao, Zhihong
2017-01-01
Compared with other members of the family Rosaceae, the chloroplast genomes of Fragaria species exhibit low variation, and this situation has limited phylogenetic analyses; thus, complete chloroplast genome sequencing of Fragaria species is needed. In this study, we sequenced the complete chloroplast genome of F. × ananassa ‘Benihoppe’ using the Illumina HiSeq 2500-PE150 platform and then performed a combination of de novo assembly and reference-guided mapping of contigs to generate complete chloroplast genome sequences. The chloroplast genome exhibits a typical quadripartite structure with a pair of inverted repeats (IRs, 25,936 bp) separated by large (LSC, 85,531 bp) and small (SSC, 18,146 bp) single-copy (SC) regions. The length of the F. × ananassa ‘Benihoppe’ chloroplast genome is 155,549 bp, representing the smallest Fragaria chloroplast genome observed to date. The genome encodes 112 unique genes, comprising 78 protein-coding genes, 30 tRNA genes and four rRNA genes. Comparative analysis of the overall nucleotide sequence identity among ten complete chloroplast genomes confirmed that for both coding and non-coding regions in Rosaceae, SC regions exhibit higher sequence variation than IRs. The Ka/Ks ratio of most genes was less than 1, suggesting that most genes are under purifying selection. Moreover, the mVISTA results also showed a high degree of conservation in genome structure, gene order and gene content in Fragaria, particularly among three octoploid strawberries which were F. × ananassa ‘Benihoppe’, F. chiloensis (GP33) and F. virginiana (O477). However, when the sequences of the coding and non-coding regions of F. × ananassa ‘Benihoppe’ were compared in detail with those of F. chiloensis (GP33) and F. virginiana (O477), a number of SNPs and InDels were revealed by MEGA 7. Six non-coding regions (trnK-matK, trnS-trnG, atpF-atpH, trnC-petN, trnT-psbD and trnP-psaJ) with a percentage of variable sites greater than 1% and no less than five parsimony-informative sites were identified and may be useful for phylogenetic analysis of the genus Fragaria. PMID:29038765
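The region lengths quoted above are internally consistent with the quadripartite layout (LSC + SSC + two IR copies); a one-line check:

```python
lsc, ssc, ir = 85_531, 18_146, 25_936
print(lsc + ssc + 2 * ir)   # 155549 bp, matching the reported genome length
```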
Kress, W John; Erickson, David L
2007-06-06
A useful DNA barcode requires sufficient sequence variation to distinguish between species and ease of application across a broad range of taxa. Discovery of a DNA barcode for land plants has been limited by intrinsically lower rates of sequence evolution in plant genomes than that observed in animals. This low rate has complicated the trade-off in finding a locus that is universal and readily sequenced and has sufficiently high sequence divergence at the species-level. Here, a global plant DNA barcode system is evaluated by comparing universal application and degree of sequence divergence for nine putative barcode loci, including coding and non-coding regions, singly and in pairs across a phylogenetically diverse set of 48 genera (two species per genus). No single locus could discriminate among species in a pair in more than 79% of genera, whereas discrimination increased to nearly 88% when the non-coding trnH-psbA spacer was paired with one of three coding loci, including rbcL. In silico trials were conducted in which DNA sequences from GenBank were used to further evaluate the discriminatory power of a subset of these loci. These trials supported the earlier observation that trnH-psbA coupled with rbcL can correctly identify and discriminate among related species. A combination of the non-coding trnH-psbA spacer region and a portion of the coding rbcL gene is recommended as a two-locus global land plant barcode that provides the necessary universality and species discrimination.
Ancient DNA sequence revealed by error-correcting codes.
Brandão, Marcelo M; Spoladore, Larissa; Faria, Luzinete C B; Rocha, Andréa S L; Silva-Filho, Marcio C; Palazzo, Reginaldo
2015-07-10
A previously described DNA sequence generator algorithm (DNA-SGA) using error-correcting codes has been employed as a computational tool to address the evolutionary pathway of the genetic code. The code-generated sequence alignment demonstrated that a residue mutation revealed by the code can be found in the same position in sequences of distantly related taxa. Furthermore, the code-generated sequences do not promote amino acid changes in the deviant genomes through codon reassignment. A Bayesian evolutionary analysis of both code-generated and homologous sequences of the Arabidopsis thaliana malate dehydrogenase gene indicates an approximately 1 MYA divergence time from the MDH code-generated sequence node to its paralogous sequences. The DNA-SGA helps to determine the plesiomorphic state of DNA sequences because a single nucleotide alteration often occurs in distantly related taxa and can be found in the alternative codon patterns of noncanonical genetic codes. As a consequence, the algorithm may reveal an earlier stage of the evolution of the standard code.
Ancient DNA sequence revealed by error-correcting codes
Brandão, Marcelo M.; Spoladore, Larissa; Faria, Luzinete C. B.; Rocha, Andréa S. L.; Silva-Filho, Marcio C.; Palazzo, Reginaldo
2015-01-01
A previously described DNA sequence generator algorithm (DNA-SGA) using error-correcting codes has been employed as a computational tool to address the evolutionary pathway of the genetic code. The code-generated sequence alignment demonstrated that a residue mutation revealed by the code can be found in the same position in sequences of distantly related taxa. Furthermore, the code-generated sequences do not promote amino acid changes in the deviant genomes through codon reassignment. A Bayesian evolutionary analysis of both code-generated and homologous sequences of the Arabidopsis thaliana malate dehydrogenase gene indicates an approximately 1 MYA divergence time from the MDH code-generated sequence node to its paralogous sequences. The DNA-SGA helps to determine the plesiomorphic state of DNA sequences because a single nucleotide alteration often occurs in distantly related taxa and can be found in the alternative codon patterns of noncanonical genetic codes. As a consequence, the algorithm may reveal an earlier stage of the evolution of the standard code. PMID:26159228
Nonuniform code concatenation for universal fault-tolerant quantum computing
NASA Astrophysics Data System (ADS)
Nikahd, Eesa; Sedighi, Mehdi; Saheb Zamani, Morteza
2017-09-01
Using transversal gates is a straightforward and efficient technique for fault-tolerant quantum computing. Since transversal gates alone cannot be computationally universal, they must be combined with other approaches such as magic state distillation, code switching, or code concatenation to achieve universality. In this paper we propose an alternative approach for universal fault-tolerant quantum computing, mainly based on the code concatenation approach proposed in [T. Jochym-O'Connor and R. Laflamme, Phys. Rev. Lett. 112, 010505 (2014), 10.1103/PhysRevLett.112.010505], but in a nonuniform fashion. The proposed approach is described based on nonuniform concatenation of the 7-qubit Steane code with the 15-qubit Reed-Muller code, as well as the 5-qubit code with the 15-qubit Reed-Muller code, which lead to two 49-qubit and 47-qubit codes, respectively. These codes can correct any arbitrary single physical error with the ability to perform a universal set of fault-tolerant gates, without using magic state distillation.
Gene duplication and divergence affecting drug content in Cannabis sativa.
Weiblen, George D; Wenger, Jonathan P; Craft, Kathleen J; ElSohly, Mahmoud A; Mehmedic, Zlatko; Treiber, Erin L; Marks, M David
2015-12-01
Cannabis sativa is an economically important source of durable fibers, nutritious seeds, and psychoactive drugs but few economic plants are so poorly understood genetically. Marijuana and hemp were crossed to evaluate competing models of cannabinoid inheritance and to explain the predominance of tetrahydrocannabinolic acid (THCA) in marijuana compared with cannabidiolic acid (CBDA) in hemp. Individuals in the resulting F2 population were assessed for differential expression of cannabinoid synthase genes and were used in linkage mapping. Genetic markers associated with divergent cannabinoid phenotypes were identified. Although phenotypic segregation and a major quantitative trait locus (QTL) for the THCA/CBDA ratio were consistent with a simple model of codominant alleles at a single locus, the diversity of THCA and CBDA synthase sequences observed in the mapping population, the position of enzyme coding loci on the map, and patterns of expression suggest multiple linked loci. Phylogenetic analysis further suggests a history of duplication and divergence affecting drug content. Marijuana is distinguished from hemp by a nonfunctional CBDA synthase that appears to have been positively selected to enhance psychoactivity. An unlinked QTL for cannabinoid quantity may also have played a role in the recent escalation of drug potency. © 2015 The Authors. New Phytologist © 2015 New Phytologist Trust.
Improving P2P live-content delivery using SVC
NASA Astrophysics Data System (ADS)
Schierl, T.; Sánchez, Y.; Hellge, C.; Wiegand, T.
2010-07-01
P2P content delivery techniques for video transmission have attracted great interest in recent years. By involving clients in the delivery process, P2P approaches can significantly reduce the load and cost on servers, especially for popular services. However, previous studies have already pointed out the unreliability of P2P-based live streaming approaches due to peer churn, where peers may ungracefully leave the P2P infrastructure, typically an overlay network. Peers ungracefully leaving the system cause connection losses in the overlay, which require repair operations. During such repair operations, which typically take a few round-trip times, no data is received over the lost connection. Because low delay for fast channel tune-in is a key feature for broadcast-like streaming applications, the P2P live streaming approach can only rely on a limited media pre-buffer during such repair operations. In this paper, multi-tree-based Application Layer Multicast is considered as a P2P overlay technique for live streaming. The use of Flow Forwarding (FF), a.k.a. retransmission, or Forward Error Correction (FEC) in combination with Scalable Video Coding (SVC) for concealment during overlay repair operations is shown. Furthermore, the benefits of using SVC over AVC single-layer transmission are presented.
An Analysis of Defense Information and Information Technology Articles: A Sixteen-Year Perspective
2009-03-01
… "exploratory," or "subjective" (Denzin & Lincoln, 2000). Existing Research: This research is based on content analysis methodologies utilized by Carter … same codes (Denzin & Lincoln, 2000). Different analysts should code the same text in a similar manner (Weber, 1990). Typically, researchers compute … chosen. Krippendorff recommends an agreement level of at least .70 (Krippendorff, 2004). Some scholars use a cut-off rate of .80 (Denzin & Lincoln …
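The reliability thresholds mentioned above are usually checked with simple agreement statistics; a minimal sketch computing raw percent agreement and Cohen's kappa for two coders over the same set of coded articles (the categories and codes are made up).

```python
from collections import Counter

def percent_agreement(a, b):
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    # Chance-corrected agreement for two coders using the same nominal categories.
    po = percent_agreement(a, b)
    ca, cb, n = Counter(a), Counter(b), len(a)
    pe = sum(ca[c] * cb[c] for c in set(a) | set(b)) / n ** 2
    return (po - pe) / (1 - pe)

coder1 = ["policy", "tech", "tech", "policy", "ops", "tech", "ops", "policy", "tech", "ops"]
coder2 = ["policy", "tech", "ops",  "policy", "ops", "tech", "ops", "tech",   "tech", "ops"]
print(percent_agreement(coder1, coder2), round(cohens_kappa(coder1, coder2), 2))
```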
A VLSI chip set for real time vector quantization of image sequences
NASA Technical Reports Server (NTRS)
Baker, Richard L.
1989-01-01
The architecture and implementation of a VLSI chip set that vector quantizes (VQ) image sequences in real time are described. The chip set forms a programmable Single-Instruction, Multiple-Data (SIMD) machine which can implement various vector quantization encoding structures. Its VQ codebook may contain an unlimited number of codevectors, N, having dimension up to K = 64. Under a weighted least-squared-error criterion, the engine locates, at video rates, the best code vector in full-searched or large tree-searched VQ codebooks. The ability to manipulate tree-structured codebooks, coupled with parallelism and pipelining, permits searches in as few as O(log N) cycles. A full codebook search results in O(N) performance, compared to O(KN) for a Single-Instruction, Single-Data (SISD) machine. With this VLSI chip set, an entire video coder can be built on a single board that permits real-time experimentation with very large codebooks.
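A minimal sketch contrasting the O(N) full search with the O(log N) tree search mentioned above. The codebook here is a toy one-dimensional array and the tree is built by simple index splitting, which is an illustrative assumption rather than the chip's codebook design; as is typical of tree-structured VQ, the fast search may occasionally return a near-best codevector instead of the true best.

```python
import numpy as np

def full_search(x, codebook):
    # O(N) search: distance to every codevector.
    return int(np.argmin(np.sum((codebook - x) ** 2, axis=1)))

def build_tree(indices, codebook):
    # Balanced binary tree; each node keeps the centroid of the codevectors beneath it.
    node = {"centroid": codebook[indices].mean(axis=0), "indices": indices}
    if len(indices) > 1:
        mid = len(indices) // 2
        node["children"] = (build_tree(indices[:mid], codebook),
                            build_tree(indices[mid:], codebook))
    return node

def tree_search(x, node):
    # O(log N) search: descend toward the closer child centroid at each level.
    while "children" in node:
        l, r = node["children"]
        node = l if np.sum((l["centroid"] - x) ** 2) <= np.sum((r["centroid"] - x) ** 2) else r
    return int(node["indices"][0])

rng = np.random.default_rng(0)
codebook = np.sort(rng.normal(size=(256, 1)), axis=0)   # N = 256 codevectors, K = 1
x = np.array([0.3])
print(full_search(x, codebook), tree_search(x, build_tree(np.arange(256), codebook)))
```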
NASA Technical Reports Server (NTRS)
Rost, Martin C.; Sayood, Khalid
1991-01-01
A method for efficiently coding natural images using a vector-quantized, variable-block-size transform source coder is presented. The method, mixture block coding (MBC), incorporates variable-rate coding by using a mixture of discrete cosine transform (DCT) source coders. The selection of which coder codes any given image region is made through a threshold-driven distortion criterion. In this paper, MBC is used in two different applications. The base method is concerned with single-pass, low-rate image data compression. The second is a natural extension of the base method which allows for low-rate progressive transmission (PT). Since the base method adapts easily to progressive coding, it offers the aesthetic advantage of progressive coding without incurring extensive channel overhead. Image compression rates of approximately 0.5 bit/pel are demonstrated for both monochrome and color images.
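A minimal sketch of the threshold-driven selection idea, assuming a 2D DCT from SciPy and a toy rule: a block is coded at a low rate (few retained coefficients) unless its reconstruction distortion exceeds a threshold, in which case a higher-rate coder is tried. The rates and threshold are illustrative assumptions, not those of MBC.

```python
import numpy as np
from scipy.fft import dctn, idctn

def code_block(block, keep):
    # Keep only the lowest-frequency keep x keep DCT coefficients of an 8x8 block.
    c = dctn(block, norm="ortho")
    mask = np.zeros_like(c)
    mask[:keep, :keep] = 1
    return idctn(c * mask, norm="ortho")

def mixture_code(block, threshold=25.0):
    # Try the cheapest coder first; fall back to higher-rate coders if distortion is too large.
    for keep in (2, 4, 8):                       # increasing rate
        rec = code_block(block, keep)
        if np.mean((block - rec) ** 2) <= threshold:
            return keep, rec
    return keep, rec

rng = np.random.default_rng(3)
flat = np.full((8, 8), 100.0) + rng.normal(0, 2, (8, 8))    # smooth region
busy = rng.uniform(0, 255, (8, 8))                          # textured region
print(mixture_code(flat)[0], mixture_code(busy)[0])          # low rate vs. high rate selected
```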
Complete Chloroplast Genome Sequences of Important Oilseed Crop Sesamum indicum L
Yi, Dong-Keun; Kim, Ki-Joong
2012-01-01
Sesamum indicum is an important crop plant species grown for its oil. The complete chloroplast (cp) genome of S. indicum (GenBank acc no. JN637766) is 153,324 bp in length, and has a pair of inverted repeat (IR) regions consisting of 25,141 bp each. The lengths of the large single copy (LSC) and the small single copy (SSC) regions are 85,170 bp and 17,872 bp, respectively. Comparative cp DNA sequence analyses of S. indicum with other cp genomes reveal that the genome structure, gene order, gene and intron contents, AT contents, codon usage, and transcription units are similar to those of typical angiosperm cp genomes. Nucleotide diversity of the IR region between Sesamum and three other cp genomes is much lower than that of the LSC and SSC regions in both coding and noncoding regions. In summary, regional constraints strongly affect the sequence evolution of the cp genomes, while functional constraints affect it only weakly. Five short inversions associated with short palindromic sequences that form stem-loop structures were observed in the chloroplast genome of S. indicum. Twenty-eight different simple sequence repeat loci have been detected in the chloroplast genome of S. indicum. Almost all of the SSR loci were composed of A or T, so this may also contribute to the A-T richness of the cp genome of S. indicum. Seven large repeated loci in the chloroplast genome of S. indicum were also identified, and these loci are useful for developing S. indicum-specific cp genome vectors. The complete cp DNA sequences of S. indicum reported in this paper are a prerequisite for modifying this important oilseed crop by cp genetic engineering techniques. PMID:22606240
The design plan of a VLSI single chip (255, 223) Reed-Solomon decoder
NASA Technical Reports Server (NTRS)
Hsu, I. S.; Shao, H. M.; Deutsch, L. J.
1987-01-01
The very large-scale integration (VLSI) architecture of a single-chip (255, 223) Reed-Solomon decoder for decoding both errors and erasures is described. A decoding failure detection capability is also included in this system so that the decoder will recognize a failure to decode instead of introducing additional errors. This could happen whenever the received word contains too many errors and erasures for the code to correct. The number of transistors needed to implement this decoder is estimated at about 75,000 if the delay for the received message is not included. This is in contrast to the older transform decoding algorithm, which needs about 100,000 transistors. However, the transform decoder is simpler in architecture than the time decoder. It is therefore possible to implement a single-chip (255, 223) Reed-Solomon decoder with today's VLSI technology. An implementation strategy for the decoder system is presented. This represents the first step in a plan to take advantage of advanced coding techniques to realize a 2.0 dB coding gain for future space missions.
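The headline parameters of the (255, 223) code follow from standard Reed-Solomon arithmetic over GF(2^8); a quick check of the error and erasure budget the decoder described above must satisfy:

```python
n, k = 255, 223
parity = n - k                 # 32 check symbols
t = parity // 2                # up to 16 symbol errors with no erasures
print(parity, t)
# With e errors and f erasures the decoder succeeds whenever 2*e + f <= 32,
# e.g. 10 errors plus 12 erasures (2*10 + 12 = 32) is still correctable.
print(2 * 10 + 12 <= parity)
```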
NASA Technical Reports Server (NTRS)
Goldstein, David B.; Varghese, Philip L.
1997-01-01
We proposed to create a single computational code incorporating methods that can model both rarefied and continuum flow to enable the efficient simulation of flow about space craft and high altitude hypersonic aerospace vehicles. The code was to use a single grid structure that permits a smooth transition between the continuum and rarefied portions of the flow. Developing an appropriate computational boundary between the two regions represented a major challenge. The primary approach chosen involves coupling a four-speed Lattice Boltzmann model for the continuum flow with the DSMC method in the rarefied regime. We also explored the possibility of using a standard finite difference Navier Stokes solver for the continuum flow. With the resulting code we will ultimately investigate three-dimensional plume impingement effects, a subject of critical importance to NASA and related to the work of Drs. Forrest Lumpkin, Steve Fitzgerald and Jay Le Beau at Johnson Space Center. Below is a brief background on the project and a summary of the results as of the end of the grant.
Crespo, Alejandro C.; Dominguez, Jose M.; Barreiro, Anxo; Gómez-Gesteira, Moncho; Rogers, Benedict D.
2011-01-01
Smoothed Particle Hydrodynamics (SPH) is a numerical method commonly used in Computational Fluid Dynamics (CFD) to simulate complex free-surface flows. Simulations with this mesh-free particle method far exceed the capacity of a single processor. In this paper, as part of a dual-functioning code for either central processing units (CPUs) or Graphics Processor Units (GPUs), a parallelisation using GPUs is presented. The GPU parallelisation technique uses the Compute Unified Device Architecture (CUDA) of nVidia devices. Simulations with more than one million particles on a single GPU card exhibit speedups of up to two orders of magnitude over using a single-core CPU. It is demonstrated that the code achieves different speedups with different CUDA-enabled GPUs. The numerical behaviour of the SPH code is validated with a standard benchmark test case of dam break flow impacting on an obstacle where good agreement with the experimental results is observed. Both the achieved speed-ups and the quantitative agreement with experiments suggest that CUDA-based GPU programming can be used in SPH methods with efficiency and reliability. PMID:21695185
Effect of a Significant Other on Client Change Talk in Motivational Interviewing
ERIC Educational Resources Information Center
Apodaca, Timothy R.; Magill, Molly; Longabaugh, Richard; Jackson, Kristina M.; Monti, Peter M.
2013-01-01
Objective:To examine significant-other (SO) and therapist behaviors as predictors of client change language within motivational interviewing (MI) sessions. Method: Participants from an emergency department received a single session of MI that included SO participation (N = 157). Sessions were coded using therapy process coding systems. Sessions…
Structured FORTRAN Preprocessor
NASA Technical Reports Server (NTRS)
Flynn, J. A.; Lawson, C. L.; Van Snyder, W.; Tsitsivas, H. N.
1985-01-01
SFTRAN3 supports structured programming in a FORTRAN environment. The language is intended particularly to support two aspects of structured programming -- nestable single-entry control structures, and modularization and top-down organization of code. Code designed and written using these SFTRAN3 facilities has fewer initial errors, is easier to understand, and is less expensive to maintain and modify.
Preissl, Sebastian; Fang, Rongxin; Huang, Hui; Zhao, Yuan; Raviram, Ramya; Gorkin, David U; Zhang, Yanxiao; Sos, Brandon C; Afzal, Veena; Dickel, Diane E; Kuan, Samantha; Visel, Axel; Pennacchio, Len A; Zhang, Kun; Ren, Bing
2018-03-01
In the version of this article initially published online, the accession code was given as GSE1000333. The correct code is GSE100033. The error has been corrected in the print, HTML and PDF versions of the article.
Code and codeless ionospheric measurements with NASA's Rogue GPS Receiver
NASA Technical Reports Server (NTRS)
Srinivasan, Jeff M.; Meehan, Tom K.; Young, Lawrence E.
1989-01-01
The NASA/JPL Rogue Receiver is an 8-satellite, non-multiplexed, highly digital global positioning system (GPS) receiver that can obtain dual frequency data either with or without knowledge of the P-code. In addition to its applications for high accuracy geodesy and orbit determination, the Rogue uses GPS satellite signals to measure the total electron content (TEC) of the ionosphere along the lines of sight from the receiver to the satellites. These measurements are used by JPL's Deep Space Network (DSN) for calibrating radiometric data. This paper will discuss Rogue TEC measurements, emphasizing the advantages of a receiver that can use the P-code, when available, but can also obtain reliable dual frequency data when the code is encrypted.
ERIC Educational Resources Information Center
Campbell, Rebecca P.; Saltonstall, Margot; Buford, Betsy
2013-01-01
In recognition of 25th anniversary of the "Journal of The First-Year Experience & Students in Transition" (the "Journal"), a content analysis and review of "Journal" citations in other works was conducted. A team of three researchers coded type of research, type of intervention, target population, and topics of…
Enacting FAIR Education: Approaches to Integrating LGBT Content in the K-12 Curriculum
ERIC Educational Resources Information Center
Vecellio, Shawn
2012-01-01
The FAIR Education Act (SB 48) was signed into law in California in July of 2011, amending the Education Code by requiring representation of lesbian, gay, bisexual, and transgender persons in the social sciences. In this article, the author uses James Banks' model of the Four Levels of Integration of Multicultural Content to suggest ways in which…
Disability in Physical Education Textbooks: An Analysis of Image Content
ERIC Educational Resources Information Center
Taboas-Pais, Maria Ines; Rey-Cao, Ana
2012-01-01
The aim of this paper is to show how images of disability are portrayed in physical education textbooks for secondary schools in Spain. The sample was composed of 3,316 images published in 36 textbooks by 10 publishing houses. A content analysis was carried out using a coding scheme based on categories employed in other similar studies and adapted…
ERIC Educational Resources Information Center
Woo, Hongryun; Mulit, Cynthia J.; Visalli, Kelsea M.
2016-01-01
Counselor Education (CE) program websites play a role in program fit by helping prospective students learn about the profession, search for programs and apply for admission. Using the 2014 "ACA Code of Ethics'" nine categories of orientation content as its framework, this study explored the information provided on the 63…
ERIC Educational Resources Information Center
Kang, Namjun
If content analysis is to satisfy the requirement of objectivity, measures and procedures must be reliable. Reliability is usually measured by the proportion of agreement of all categories identically coded by different coders. For such data to be empirically meaningful, a high degree of inter-coder reliability must be demonstrated. Researchers in…
Yatawara, Lalani; Wickramasinghe, Susiji; Rajapakse, R P V J; Agatsuma, Takeshi
2010-09-01
In the present study, we determined the complete mitochondrial (mt) genome sequence (13,839 bp) of the parasitic nematode Setaria digitata and compared its structure and organization with Onchocerca volvulus, Dirofilaria immitis and Brugia malayi. The mt genome of S. digitata is slightly larger than the mt genomes of other filarial nematodes. The S. digitata mt genome contains 36 genes (12 protein-coding genes, 22 transfer RNAs and 2 ribosomal RNAs) that are typically found in metazoans. This genome has a high A+T content (75.1%) and a low G+C content (24.9%). The mt gene order for S. digitata is the same as those for O. volvulus, D. immitis and B. malayi, but it is distinctly different from the other nematodes compared. The start codons inferred in the mt genome of S. digitata are TTT, ATT, TTG, ATG, GTT and ATA. Interestingly, the initiation codon TTT is unique to the S. digitata mt genome, and four protein-coding genes use this codon as a translation initiation codon. Five protein-coding genes use TAG as a stop codon, whereas three genes use TAA and four genes use T as a termination codon. Out of 64 possible codons, only 57 are used for mitochondrial protein-coding genes of S. digitata. T-rich codons such as TTT (18.9%), GTT (7.9%), TTG (7.8%), TAT (7%), ATT (5.7%), TCT (4.8%) and TTA (4.1%) are used more frequently. This pattern of codon usage reflects the strong bias for T in the mt genome of S. digitata. In conclusion, the present investigation provides new molecular data for future studies of the comparative mitochondrial genomics and systematics of parasitic nematodes of socio-economic importance. 2010 Elsevier B.V. All rights reserved.
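The codon-usage percentages quoted above come from a straightforward tally over the concatenated protein-coding genes; a minimal sketch of that tally follows, run on a made-up A+T-rich sequence (the real S. digitata genes are not reproduced here).

```python
from collections import Counter

def codon_usage(cds):
    # Relative frequency of each codon in an in-frame coding sequence.
    codons = [cds[i:i + 3] for i in range(0, len(cds) - len(cds) % 3, 3)]
    counts = Counter(codons)
    total = sum(counts.values())
    return {c: counts[c] / total for c in sorted(counts)}

toy_cds = "TTTGTTTTGTATATTTCTTTATTTGTTTTTTAG"   # toy A+T-rich sequence, not real data
for codon, freq in codon_usage(toy_cds).items():
    print(codon, f"{freq:.2f}")
```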
Ogneva, I V; Maximova, M V; Larina, I M
2014-01-01
The aim of this study was to determine the transversal stiffness of the cortical cytoskeleton and the content of the cytoskeletal protein desmin in left ventricle cardiomyocytes and in fibers of the mouse soleus and tibialis anterior muscles after a 30-day space flight on board the "BION-M1" biosatellite (Russia, 2013). Dissection was performed 13-16.5 h after landing. The transversal stiffness was measured in the relaxed and calcium-activated states by atomic force microscopy. The desmin content was estimated by western blotting, and the expression level of the desmin-coding gene was detected using real-time PCR. The results indicate that the transversal stiffness of the left ventricle cardiomyocytes and of the soleus muscle fibers in the relaxed and activated states did not differ from the control. The transversal stiffness of the tibialis muscle fibers in the relaxed and activated states was increased in the group of mice after space flight. At the same time, in all of the tissues studied, the desmin content and the expression level of the desmin-coding gene did not differ from the control level.