Reducing the genetic code induces massive rearrangement of the proteome
O’Donoghue, Patrick; Prat, Laure; Kucklick, Martin; Schäfer, Johannes G.; Riedel, Katharina; Rinehart, Jesse; Söll, Dieter; Heinemann, Ilka U.
2014-01-01
Expanding the genetic code is an important aim of synthetic biology, but some organisms developed naturally expanded genetic codes long ago over the course of evolution. Less than 1% of all sequenced genomes encode an operon that reassigns the stop codon UAG to pyrrolysine (Pyl), a genetic code variant that results from the biosynthesis of Pyl-tRNAPyl. To understand the selective advantage of genetically encoding more than 20 amino acids, we constructed a markerless tRNAPyl deletion strain of Methanosarcina acetivorans (ΔpylT) that cannot decode UAG as Pyl or grow on trimethylamine. Phenotypic defects in the ΔpylT strain were evident in minimal medium containing methanol. Proteomic analyses of wild type (WT) M. acetivorans and ΔpylT cells identified 841 proteins from >7,000 significant peptides detected by MS/MS. Protein production from UAG-containing mRNAs was verified for 19 proteins. Translation of UAG codons was verified by MS/MS for eight proteins, including identification of a Pyl residue in PylB, which catalyzes the first step of Pyl biosynthesis. Deletion of tRNAPyl globally altered the proteome, leading to >300 differentially abundant proteins. Reduction of the genetic code from 21 to 20 amino acids led to significant down-regulation in translation initiation factors, amino acid metabolism, and methanogenesis from methanol, which was offset by a compensatory (100-fold) up-regulation in dimethyl sulfide metabolic enzymes. The data show how a natural proteome adapts to genetic code reduction and indicate that the selective value of an expanded genetic code is related to carbon source range and metabolic efficiency. PMID:25404328
Abaka, Gamze; Bıyıkoğlu, Türker; Erten, Cesim
2013-07-01
Given a pair of metabolic pathways, an alignment of the pathways corresponds to a mapping between similar substructures of the pair. Successful alignments may provide useful applications in phylogenetic tree reconstruction, drug design and overall may enhance our understanding of cellular metabolism. We consider the problem of providing one-to-many alignments of reactions in a pair of metabolic pathways. We first provide a constrained alignment framework applicable to the problem. We show that the constrained alignment problem even in a primitive setting is computationally intractable, which justifies efforts for designing efficient heuristics. We present our Constrained Alignment of Metabolic Pathways (CAMPways) algorithm designed for this purpose. Through extensive experiments involving a large pathway database, we demonstrate that when compared with a state-of-the-art alternative, the CAMPways algorithm provides better alignment results on metabolic networks as far as measures based on same-pathway inclusion and biochemical significance are concerned. The execution speed of our algorithm constitutes yet another important improvement over alternative algorithms. Open source codes, executable binary, useful scripts, all the experimental data and the results are freely available as part of the Supplementary Material at http://code.google.com/p/campways/. Supplementary data are available at Bioinformatics online.
Kostal, Lubomir; Kobayashi, Ryota
2015-10-01
Information theory quantifies the ultimate limits on reliable information transfer by means of the channel capacity. However, the channel capacity is known to be an asymptotic quantity, assuming unlimited metabolic cost and computational power. We investigate a single-compartment Hodgkin-Huxley type neuronal model under the spike-rate coding scheme and address how the metabolic cost and the decoding complexity affect the optimal information transmission. We find that the sub-threshold stimulation regime, although attaining the smallest capacity, allows for the most efficient balance between the information transmission and the metabolic cost. Furthermore, we determine post-synaptic firing rate histograms that are optimal from the information-theoretic point of view, which enables the comparison of our results with experimental data. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
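For orientation, the quantity constrained in this kind of analysis is the capacity-cost function; in standard information-theoretic notation (a textbook definition, not a result from this paper):

```latex
% Capacity-cost function: maximal mutual information between stimulus X and
% response Y subject to an average metabolic cost constraint W
C(W) = \max_{p(x)\;:\;\mathbb{E}[w(X)] \le W} I(X;Y)
```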
Energy-efficient neural information processing in individual neurons and neuronal networks.
Yu, Lianchun; Yu, Yuguo
2017-11-01
Brains are composed of networks of an enormous number of neurons interconnected with synapses. Neural information is carried by the electrical signals within neurons and the chemical signals among neurons. Generating these electrical and chemical signals is metabolically expensive. The fundamental issue raised here is whether brains have evolved efficient ways of developing an energy-efficient neural code from the molecular level to the circuit level. Here, we summarize the factors and biophysical mechanisms that could contribute to the energy-efficient neural code for processing input signals. The factors include ion channel kinetics, body temperature, axonal propagation of action potentials, low-probability release of synaptic neurotransmitters, optimal input and noise, the size of neurons and neuronal clusters, excitation/inhibition balance, coding strategy, cortical wiring, and the organization of functional connectivity. Both experimental and computational evidence suggests that neural systems may use these factors to maximize the efficiency of energy consumption in processing neural signals. Studies indicate that efficient energy utilization may be universal in neuronal systems as an evolutionary consequence of the pressure of limited energy. As a result, neuronal connections may be wired in a highly economical manner to lower energy costs and space. Individual neurons within a network may encode independent stimulus components to allow a minimal number of neurons to represent whole stimulus characteristics efficiently. This basic principle may fundamentally change our view of how billions of neurons organize themselves into complex circuits to operate and generate the most powerful intelligent cognition in nature. © 2017 Wiley Periodicals, Inc.
Long non-coding RNAs in cancer metabolism.
Xiao, Zhen-Dong; Zhuang, Li; Gan, Boyi
2016-10-01
Altered cellular metabolism is an emerging hallmark of cancer. Accumulating recent evidence links long non-coding RNAs (lncRNAs), a still poorly understood class of non-coding RNAs, to cancer metabolism. Here we review the emerging findings on the functions of lncRNAs in cancer metabolism, with particular emphasis on how lncRNAs regulate glucose and glutamine metabolism in cancer cells, discuss how lncRNAs regulate various aspects of cancer metabolism through their cross-talk with other macromolecules, explore the mechanistic conceptual framework of lncRNAs in reprogramming metabolism in cancers, and highlight the challenges in this field. A more in-depth understanding of lncRNAs in cancer metabolism may enable the development of novel and effective therapeutic strategies targeting cancer metabolism. © 2016 WILEY Periodicals, Inc.
Giovannelli, Donato; Sievert, Stefan M; Hügler, Michael; Markert, Stephanie; Becher, Dörte; Schweder, Thomas; Vetriani, Costantino
2017-04-24
Anaerobic thermophiles inhabit relic environments that resemble the early Earth. However, the lineage of these modern organisms co-evolved with our planet. Hence, these organisms carry both ancestral and acquired genes and serve as models to reconstruct early metabolism. Based on comparative genomic and proteomic analyses, we identified two distinct groups of genes in Thermovibrio ammonificans: the first codes for enzymes that do not require oxygen and use substrates of geothermal origin; the second appears to be a more recent acquisition, and may reflect adaptations to cope with the rise of oxygen on Earth. We propose that the ancestor of the Aquificae was originally a hydrogen oxidizing, sulfur reducing bacterium that used a hybrid pathway for CO2 fixation. With the gradual rise of oxygen in the atmosphere, more efficient terminal electron acceptors became available and this lineage acquired genes that increased its metabolic flexibility while retaining ancestral metabolic traits.
Recon2Neo4j: applying graph database technologies for managing comprehensive genome-scale networks.
Balaur, Irina; Mazein, Alexander; Saqi, Mansoor; Lysenko, Artem; Rawlings, Christopher J; Auffray, Charles
2017-04-01
The goal of this work is to offer a computational framework for exploring data from the Recon2 human metabolic reconstruction model. Advanced user access features have been developed using the Neo4j graph database technology and this paper describes key features such as efficient management of the network data, examples of the network querying for addressing particular tasks, and how query results are converted back to the Systems Biology Markup Language (SBML) standard format. The Neo4j-based metabolic framework facilitates exploration of highly connected and comprehensive human metabolic data and identification of metabolic subnetworks of interest. A Java-based parser component has been developed to convert query results (available in the JSON format) into SBML and SIF formats in order to facilitate further results exploration, enhancement or network sharing. The Neo4j-based metabolic framework is freely available from: https://diseaseknowledgebase.etriks.org/metabolic/browser/ . The java code files developed for this work are available from the following url: https://github.com/ibalaur/MetabolicFramework . ibalaur@eisbm.org. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
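As a hedged illustration of the kind of network query such a Neo4j-backed framework supports, the Python sketch below uses the official neo4j driver; the endpoint, credentials, node labels and relationship types are illustrative assumptions, not the documented Recon2Neo4j schema.

```python
from neo4j import GraphDatabase

# Hypothetical endpoint/credentials; the labels and relationship types below
# are assumptions for illustration, not the actual Recon2Neo4j schema.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

query = """
MATCH (m:Metabolite {name: $name})<-[:CONSUMES|PRODUCES]-(r:Reaction)
RETURN r.id AS reaction, r.name AS name
"""

with driver.session() as session:
    # Find all reactions touching a metabolite of interest (name is a placeholder)
    for record in session.run(query, name="atp_c"):
        print(record["reaction"], record["name"])

driver.close()
```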
Drug metabolism and hypersensitivity reactions to drugs.
Agúndez, José A G; Mayorga, Cristobalina; García-Martin, Elena
2015-08-01
The aim of the present review was to discuss recent advances supporting a role of drug metabolism, and particularly of the generation of reactive metabolites, in hypersensitivity reactions to drugs. The development of novel mass-spectrometry procedures has allowed the identification of reactive metabolites from drugs known to be involved in hypersensitivity reactions, including amoxicillin and nonsteroidal antiinflammatory drugs such as aspirin, diclofenac or metamizole. Recent studies demonstrated that reactive metabolites may efficiently bind plasma proteins, thus suggesting that drug metabolites, rather than - or in addition to - parent drugs, may elicit an immune response. As drug metabolic profiles are often determined by variability in the genes coding for drug-metabolizing enzymes, it is conceivable that an altered drug metabolism may predispose to the generation of reactive drug metabolites and hence to hypersensitivity reactions. These findings support the potential for the use of pharmacogenomics tests in hypersensitivity (type B) adverse reactions, in addition to the well known utility of these tests in type A adverse reactions. Growing evidence supports a link between genetically determined drug metabolism, altered metabolic profiles, generation of highly reactive metabolites and haptenization. Additional research is required to develop robust biomarkers for drug-induced hypersensitivity reactions.
Liu, Baodong; Liu, Xiaoling; Lai, Weiyi; Wang, Hailin
2017-06-06
DNA N6-methyl-2'-deoxyadenosine (6mdA) is an epigenetic modification in both eukaryotes and bacteria. Here we exploited the stable isotope-labeled deoxynucleoside [15N5]-2'-deoxyadenosine ([15N5]-dA) as an initiation tracer and for the first time developed a metabolically differential tracing code for monitoring DNA 6mdA in human cells. We demonstrate that the initiation tracer [15N5]-dA undergoes a specific and efficient adenine deamination reaction leading to the loss of the exocyclic amine 15N, and further utilizes the purine salvage pathway to generate mainly both [15N4]-dA and [15N4]-2'-deoxyguanosine ([15N4]-dG) in mammalian genomes. However, [15N5]-dA is largely retained in the genomes of mycoplasmas, which are often found in cultured cells and experimental animals. Consequently, the methylation of dA generates 6mdA with a consistent coding pattern, with a predominance of [15N4]-6mdA. Therefore, mammalian DNA 6mdA can be potentially discriminated from that generated by infecting mycoplasmas. Collectively, we show a promising approach for identification of authentic DNA 6mdA in human cells and for determining whether the human cells are contaminated with mycoplasmas.
Complete genome sequence of Paenibacillus sp. strain JDR-2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chow, Virginia; Nong, Guang; St. John, Franz J.
2012-01-01
Paenibacillus sp. strain JDR-2, an aggressively xylanolytic bacterium isolated from sweetgum (Liquidambar styraciflua) wood, is able to efficiently depolymerize, assimilate and metabolize 4-O-methylglucuronoxylan, the predominant structural component of hardwood hemicelluloses. A basis for this capability was first supported by the identification of genes and characterization of encoded enzymes and has been further defined by the sequencing and annotation of the complete genome, which we describe. In addition to genes implicated in the utilization of β-1,4-xylan, genes have also been identified for the utilization of other hemicellulosic polysaccharides. The genome of Paenibacillus sp. JDR-2 contains 7,184,930 bp in a single replicon with 6,288 protein-coding and 122 RNA genes. Uniquely prominent are 874 genes encoding proteins involved in carbohydrate transport and metabolism. The prevalence and organization of these genes support a metabolic potential for bioprocessing of hemicellulose fractions derived from lignocellulosic resources.
Saa, Pedro A.; Nielsen, Lars K.
2016-01-01
Motivation: Computation of steady-state flux solutions in large metabolic models is routinely performed using flux balance analysis based on a simple LP (Linear Programming) formulation. A minimal requirement for thermodynamic feasibility of the flux solution is the absence of internal loops, which are enforced using ‘loopless constraints’. The resulting loopless flux problem is a substantially harder MILP (Mixed Integer Linear Programming) problem, which is computationally expensive for large metabolic models. Results: We developed a pre-processing algorithm that significantly reduces the size of the original loopless problem into an easier and equivalent MILP problem. The pre-processing step employs a fast matrix sparsification algorithm—Fast-SNP (Fast sparse null-space pursuit)—inspired by recent results on SNP. By finding a reduced feasible ‘loop-law’ matrix subject to known directionalities, Fast-SNP considerably improves the computational efficiency in several metabolic models running different loopless optimization problems. Furthermore, analysis of the topology encoded in the reduced loop matrix enabled identification of key directional constraints for the potential permanent elimination of infeasible loops in the underlying model. Overall, Fast-SNP is an effective and simple algorithm for efficient formulation of loop-law constraints, making loopless flux optimization feasible and numerically tractable at large scale. Availability and Implementation: Source code for MATLAB including examples is freely available for download at http://www.aibn.uq.edu.au/cssb-resources under Software. Optimization uses Gurobi, CPLEX or GLPK (the latter is included with the algorithm). Contact: lars.nielsen@uq.edu.au Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27559155
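For context, the loop-law constraints that turn the flux LP into an MILP are commonly written as below (a generic loopless-FBA formulation with big-M constants; this is not the reduced problem produced by Fast-SNP):

```latex
% Loopless FBA: a binary a_i ties the sign of flux v_i to a pseudo-energy G_i,
% and the loop law forces G to be orthogonal to the internal null space N_int.
\max_{v}\; c^{T} v \quad \text{s.t.} \quad S v = 0,\;\; v_{lb} \le v \le v_{ub}
\\
-M(1 - a_i) \le v_i \le M a_i, \qquad a_i \in \{0,1\}
\\
-K a_i + (1 - a_i) \le G_i \le -a_i + K(1 - a_i)
\\
N_{\mathrm{int}}^{T} G = 0
```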
Current knowledge of microRNA-mediated regulation of drug metabolism in humans.
Nakano, Masataka; Nakajima, Miki
2018-05-01
Understanding the factors causing inter- and intra-individual differences in drug metabolism potencies is required for the practice of personalized or precision medicine, as well as for the promotion of efficient drug development. The expression of drug-metabolizing enzymes is controlled by transcriptional regulation by nuclear receptors and transcriptional factors, epigenetic regulation, such as DNA methylation and histone acetylation, and post-translational modification. In addition to such regulation mechanisms, recent studies revealed that microRNAs (miRNAs), endogenous ~22-nucleotide non-coding RNAs that regulate gene expression through the translational repression and degradation of mRNAs, significantly contribute to post-transcriptional regulation of drug-metabolizing enzymes. Areas covered: This review summarizes the current knowledge regarding miRNA-dependent regulation of drug-metabolizing enzymes and transcriptional factors and its physiological and clinical significance. We also describe recent advances in miRNA-dependent regulation research, showing that the presence of pseudogenes, single-nucleotide polymorphisms, and RNA editing affects miRNA targeting. Expert opinion: It is an unwavering fact that miRNAs are critical factors causing inter- and intra-individual differences in the expression of drug-metabolizing enzymes. Consideration of miRNA-dependent regulation would be a helpful tool for optimizing personalized and precision medicine.
Albrechtsen, A; Grarup, N; Li, Y; Sparsø, T; Tian, G; Cao, H; Jiang, T; Kim, S Y; Korneliussen, T; Li, Q; Nie, C; Wu, R; Skotte, L; Morris, A P; Ladenvall, C; Cauchi, S; Stančáková, A; Andersen, G; Astrup, A; Banasik, K; Bennett, A J; Bolund, L; Charpentier, G; Chen, Y; Dekker, J M; Doney, A S F; Dorkhan, M; Forsen, T; Frayling, T M; Groves, C J; Gui, Y; Hallmans, G; Hattersley, A T; He, K; Hitman, G A; Holmkvist, J; Huang, S; Jiang, H; Jin, X; Justesen, J M; Kristiansen, K; Kuusisto, J; Lajer, M; Lantieri, O; Li, W; Liang, H; Liao, Q; Liu, X; Ma, T; Ma, X; Manijak, M P; Marre, M; Mokrosiński, J; Morris, A D; Mu, B; Nielsen, A A; Nijpels, G; Nilsson, P; Palmer, C N A; Rayner, N W; Renström, F; Ribel-Madsen, R; Robertson, N; Rolandsson, O; Rossing, P; Schwartz, T W; Slagboom, P E; Sterner, M; Tang, M; Tarnow, L; Tuomi, T; van't Riet, E; van Leeuwen, N; Varga, T V; Vestmar, M A; Walker, M; Wang, B; Wang, Y; Wu, H; Xi, F; Yengo, L; Yu, C; Zhang, X; Zhang, J; Zhang, Q; Zhang, W; Zheng, H; Zhou, Y; Altshuler, D; 't Hart, L M; Franks, P W; Balkau, B; Froguel, P; McCarthy, M I; Laakso, M; Groop, L; Christensen, C; Brandslund, I; Lauritzen, T; Witte, D R; Linneberg, A; Jørgensen, T; Hansen, T; Wang, J; Nielsen, R; Pedersen, O
2013-02-01
Human complex metabolic traits are in part regulated by genetic determinants. Here we applied exome sequencing to identify novel associations of coding polymorphisms at minor allele frequencies (MAFs) >1% with common metabolic phenotypes. The study comprised three stages. We performed medium-depth (8×) whole exome sequencing in 1,000 cases with type 2 diabetes, BMI >27.5 kg/m^2 and hypertension and in 1,000 controls (stage 1). We selected 16,192 polymorphisms nominally associated (p < 0.05) with case-control status, from four selected annotation categories or from loci reported to associate with metabolic traits. These variants were genotyped in 15,989 Danes to search for association with 12 metabolic phenotypes (stage 2). In stage 3, polymorphisms showing potential associations were genotyped in a further 63,896 Europeans. Exome sequencing identified 70,182 polymorphisms with MAF >1%. In stage 2 we identified 51 potential associations with one or more of eight metabolic phenotypes covered by 45 unique polymorphisms. In meta-analyses of stage 2 and stage 3 results, we demonstrated robust associations for coding polymorphisms in CD300LG (fasting HDL-cholesterol: MAF 3.5%, p = 8.5 × 10^-14), COBLL1 (type 2 diabetes: MAF 12.5%, OR 0.88, p = 1.2 × 10^-11) and MACF1 (type 2 diabetes: MAF 23.4%, OR 1.10, p = 8.2 × 10^-10). We applied exome sequencing as a basis for finding genetic determinants of metabolic traits and show the existence of low-frequency and common coding polymorphisms with impact on common metabolic traits. Based on our study, coding polymorphisms with MAF above 1% do not seem to have particularly high effect sizes on the measured metabolic traits.
Sollie, Annet; Sijmons, Rolf H; Lindhout, Dick; van der Ploeg, Ans T; Rubio Gozalbo, M Estela; Smit, G Peter A; Verheijen, Frans; Waterham, Hans R; van Weely, Sonja; Wijburg, Frits A; Wijburg, Rudolph; Visser, Gepke
2013-07-01
Data sharing is essential for a better understanding of genetic disorders. Good phenotype coding plays a key role in this process. Unfortunately, the two most widely used coding systems in medicine, ICD-10 and SNOMED-CT, lack information necessary for the detailed classification and annotation of rare and genetic disorders. This prevents the optimal registration of such patients in databases and thus data-sharing efforts. To improve care and to facilitate research for patients with metabolic disorders, we developed a new coding system for metabolic diseases with a dedicated group of clinical specialists. Next, we compared the resulting codes with those in ICD and SNOMED-CT. No matches were found in 76% of cases in ICD-10 and in 54% in SNOMED-CT. We conclude that there are sizable gaps in the SNOMED-CT and ICD coding systems for metabolic disorders. There may be similar gaps for other classes of rare and genetic disorders. We have demonstrated that expert groups can help in addressing such coding issues. Our coding system has been made available to the ICD and SNOMED-CT organizations as well as to the Orphanet and HPO organizations for further public application and updates will be published online (www.ddrmd.nl and www.cineas.org). © 2013 WILEY PERIODICALS, INC.
Silencing of the pentose phosphate pathway genes influences DNA replication in human fibroblasts.
Fornalewicz, Karolina; Wieczorek, Aneta; Węgrzyn, Grzegorz; Łyżeń, Robert
2017-11-30
Previous reports and our recently published data indicated that some enzymes of glycolysis and the tricarboxylic acid cycle can affect the genome replication process by changing either the efficiency or timing of DNA synthesis in normal human cells. Both these pathways are connected with the pentose phosphate pathway (PPP). The PPP supports cell growth by generating energy and precursors for nucleotides and amino acids. Therefore, we asked if silencing of genes coding for enzymes involved in the pentose phosphate pathway may also affect the control of DNA replication in human fibroblasts. Particular genes coding for PPP enzymes were partially silenced with specific siRNAs. Such cells remained viable. We found that silencing of the H6PD, PRPS1, and RPE genes caused less efficient entrance into the S phase and a decrease in the efficiency of DNA synthesis. On the other hand, in cells treated with siRNA against the G6PD, RBKS and TALDO genes, the fraction of cells entering the S phase was increased. However, only in the case of G6PD and TALDO, the ratio of BrdU incorporation to DNA was significantly changed. The presented results together with our previously published studies illustrate the complexity of the influence of genes coding for central carbon metabolism on the control of DNA replication in human fibroblasts, and indicate which of them are especially important in this process. Copyright © 2017 Elsevier B.V. All rights reserved.
Long Noncoding RNAs: a New Regulatory Code in Metabolic Control
Zhao, Xu-Yun; Lin, Jiandie D.
2015-01-01
Long noncoding RNAs (lncRNAs) are emerging as an integral part of the regulatory information encoded in the genome. LncRNAs possess the unique capability to interact with nucleic acids and proteins and exert discrete effects on numerous biological processes. Recent studies have delineated multiple lncRNA pathways that control metabolic tissue development and function. The expansion of the regulatory code that links nutrient and hormonal signals to tissue metabolism gives new insights into the genetic and pathogenic mechanisms underlying metabolic disease. This review discusses lncRNA biology with a focus on its role in the development, signaling, and function of key metabolic tissues. PMID:26410599
Thuan, Nguyen Huy; Dhakal, Dipesh; Pokhrel, Anaya Raj; Chu, Luan Luong; Van Pham, Thi Thuy; Shrestha, Anil; Sohng, Jae Kyung
2018-05-01
Streptomyces peucetius ATCC 27952 produces two major anthracyclines, doxorubicin (DXR) and daunorubicin (DNR), which are potent chemotherapeutic agents for the treatment of several cancers. In order to gain detailed insight on genetics and biochemistry of the strain, the complete genome was determined and analyzed. The result showed that its complete sequence contains 7187 protein coding genes in a total of 8,023,114 bp, with 87% of the genome contributing to the protein-coding region. The genomic sequence included 18 rRNA, 66 tRNAs, and 3 non-coding RNAs. In silico studies predicted ~68 biosynthetic gene clusters (BGCs) encoding diverse classes of secondary metabolites, including non-ribosomal peptide synthetase (NRPS), polyketide synthase (PKS I, II, and III), terpenes, and others. Detailed analysis of the genome sequence revealed versatile biocatalytic enzymes such as cytochrome P450 (CYP), electron transfer systems (ETS) genes, methyltransferase (MT), and glycosyltransferase (GT). In addition, numerous functional genes (transporter gene, SOD, etc.) and regulatory genes (afsR-sp, metK-sp, etc.) involved in the regulation of secondary metabolites were found. This minireview summarizes the genome-based genome mining (GM) of diverse BGCs and genome exploration (GE) of versatile biocatalytic enzymes, and other enzymes involved in maintenance and regulation of metabolism of S. peucetius. The detailed analysis of the genome sequence provides critically important knowledge useful in the bioengineering of the strain or harboring catalytically efficient enzymes for biotechnological applications.
Natural selection drove metabolic specialization of the chromatophore in Paulinella chromatophora.
Valadez-Cano, Cecilio; Olivares-Hernández, Roberto; Resendis-Antonio, Osbaldo; DeLuna, Alexander; Delaye, Luis
2017-04-14
Genome degradation of host-restricted mutualistic endosymbionts has been attributed to inactivating mutations and genetic drift while genes coding for host-relevant functions are conserved by purifying selection. Unlike their free-living relatives, the metabolism of mutualistic endosymbionts and endosymbiont-originated organelles is specialized in the production of metabolites which are released to the host. This specialization suggests that natural selection crafted these metabolic adaptations. In this work, we analyzed the evolution of the metabolism of the chromatophore of Paulinella chromatophora by in silico modeling. We asked whether genome reduction is driven by metabolic engineering strategies resulting from the interaction with the host. As is widely known, the loss of enzyme-coding genes leads to metabolic network restructuring that sometimes improves production rates; in this case, the production rate of reduced-carbon in the metabolism of the chromatophore. We reconstructed the metabolic networks of the chromatophore of P. chromatophora CCAC 0185 and a close free-living relative, the cyanobacterium Synechococcus sp. WH 5701. We found that the evolution from a free-living to a host-restricted lifestyle rendered a fragile metabolic network where >80% of genes in the chromatophore are essential for metabolic functionality. Despite the lack of experimental information, the metabolic reconstruction of the chromatophore suggests that the host provides several metabolites to the endosymbiont. By using these metabolites as intracellular conditions, in silico simulations of genome evolution by gene loss recover with 77% accuracy the actual metabolic gene content of the chromatophore. Also, the metabolic model of the chromatophore allowed us to predict by flux balance analysis a maximum rate of reduced-carbon released by the endosymbiont to the host. By inspecting the central metabolism of the chromatophore and the free-living cyanobacterium, we found that, through improvements in the gluconeogenic pathway, the metabolism of the endosymbiont uses the carbon source more efficiently for reduced-carbon production. In addition, our in silico simulations of the evolutionary process leading to the reduced metabolic network of the chromatophore showed that the predicted rate of released reduced-carbon is obtained in less than 5% of the time under a process guided by random gene deletion and genetic drift. We interpret the previous findings as evidence that natural selection at the holobiont level shaped the rate at which reduced-carbon is exported to the host. Finally, our model also predicts that the ABC phosphate transporter (pstSACB), which is conserved in the genome of the chromatophore of P. chromatophora strain CCAC 0185, is a necessary component to release reduced-carbon molecules to the host. Our evolutionary analysis suggests that in the case of Paulinella chromatophora natural selection at the holobiont level played a prominent role in shaping the metabolic specialization of the chromatophore. We propose that natural selection acted as a "metabolic engineer" by favoring metabolic restructurings that led to an increased release of reduced-carbon to the host.
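For readers unfamiliar with the method behind the maximum reduced-carbon release prediction, flux balance analysis solves the following standard linear program (a textbook statement, not specific to the chromatophore model):

```latex
% Flux balance analysis: maximize a linear objective (e.g., reduced-carbon
% export flux) over steady-state flux vectors v within capacity bounds.
\max_{v}\; c^{T} v \qquad \text{s.t.} \qquad S v = 0, \quad v_{lb} \le v \le v_{ub}
```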
NASA Astrophysics Data System (ADS)
Chan, Chia-Hsin; Tu, Chun-Chuan; Tsai, Wen-Jiin
2017-01-01
High efficiency video coding (HEVC) not only improves the coding efficiency drastically compared to the well-known H.264/AVC but also introduces coding tools for parallel processing, one of which is tiles. Tile partitioning is allowed to be arbitrary in HEVC, but how to decide tile boundaries remains an open issue. An adaptive tile boundary (ATB) method is proposed to select a better tile partitioning to improve load balancing (ATB-LoadB) and coding efficiency (ATB-Gain) with a unified scheme. Experimental results show that, compared to ordinary uniform-space partitioning, the proposed ATB can save up to 17.65% of encoding times in parallel encoding scenarios and can reduce up to 0.8% of total bit rates for coding efficiency.
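The ATB method itself is not reproduced here; purely as an illustration of the load-balancing idea, the sketch below greedily places tile-column boundaries so that each tile receives roughly equal estimated encoding cost (the cost model and greedy split are assumptions, not the paper's algorithm):

```python
def split_columns_by_cost(col_costs, num_tiles):
    """Greedy tile-column partitioning: close a tile once it has accumulated
    about total/num_tiles of the estimated cost, keeping at least one column
    per remaining tile. Illustrative only; not the ATB algorithm."""
    total = sum(col_costs)
    target = total / num_tiles
    boundaries, acc, tiles_left = [], 0.0, num_tiles
    for i, cost in enumerate(col_costs[:-1]):
        acc += cost
        if acc >= target and tiles_left > 1 and (len(col_costs) - 1 - i) >= tiles_left - 1:
            boundaries.append(i + 1)   # a new tile starts at column i + 1
            acc, tiles_left = 0.0, tiles_left - 1
    return boundaries

# 8 CTU columns with uneven complexity split into 3 tile columns -> [2, 5]
print(split_columns_by_cost([5, 1, 1, 1, 4, 4, 1, 1], 3))
```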
Bandwidth efficient coding for satellite communications
NASA Technical Reports Server (NTRS)
Lin, Shu; Costello, Daniel J., Jr.; Miller, Warner H.; Morakis, James C.; Poland, William B., Jr.
1992-01-01
An error control coding scheme was devised to achieve large coding gain and high reliability by using coded modulation with reduced decoding complexity. To achieve a 3 to 5 dB coding gain and moderate reliability, the decoding complexity is quite modest. In fact, to achieve a 3 dB coding gain, the decoding complexity is quite simple, no matter whether trellis coded modulation or block coded modulation is used. However, to achieve coding gains exceeding 5 dB, the decoding complexity increases drastically, and the implementation of the decoder becomes very expensive and impractical. The use of coded modulation in conjunction with concatenated (or cascaded) coding is proposed. A good short bandwidth efficient modulation code is used as the inner code and a relatively powerful Reed-Solomon code is used as the outer code. With properly chosen inner and outer codes, a concatenated coded modulation scheme not only can achieve large coding gains and high reliability with good bandwidth efficiency but also can be practically implemented. This combination of coded modulation and concatenated coding offers a way of achieving the best of three worlds: reliability and coding gain, bandwidth efficiency, and decoding complexity.
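A minimal back-of-the-envelope relation for such a concatenated arrangement (standard definitions, not figures from this report): the overall code rate is the product of the inner and outer rates, and nominal spectral efficiency scales with the information bits carried per modulation symbol.

```latex
% Overall rate and nominal spectral efficiency of a concatenated coded-modulation scheme
R_{\mathrm{overall}} = R_{\mathrm{outer}} \cdot R_{\mathrm{inner}},
\qquad
\eta = R_{\mathrm{overall}} \cdot \log_2 M \;\;\text{bits/s/Hz}
% Example with assumed parameters: a rate-2/3 inner trellis code on 8-PSK with a
% (255,223) Reed-Solomon outer code gives \eta = (2/3)(223/255)\log_2 8 \approx 1.75
```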
Experimental Definition and Validation of Protein Coding Transcripts in Chlamydomonas reinhardtii
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kourosh Salehi-Ashtiani; Jason A. Papin
Algal fuel sources promise unsurpassed yields in a carbon neutral manner that minimizes resource competition between agriculture and fuel crops. Many challenges must be addressed before algal biofuels can be accepted as a component of the fossil fuel replacement strategy. One significant challenge is that the cost of algal fuel production must become competitive with existing fuel alternatives. Algal biofuel production presents the opportunity to fine-tune microbial metabolic machinery for an optimal blend of biomass constituents and desired fuel molecules. Genome-scale model-driven algal metabolic design promises to facilitate both goals by directing the utilization of metabolites in the complex, interconnected metabolic networks to optimize production of the compounds of interest. Using Chlamydomonas reinhardtii as a model, we developed a systems-level methodology bridging metabolic network reconstruction with annotation and experimental verification of enzyme encoding open reading frames. We reconstructed a genome-scale metabolic network for this alga and devised a novel light-modeling approach that enables quantitative growth prediction for a given light source, resolving wavelength and photon flux. We experimentally verified transcripts accounted for in the network and physiologically validated model function through simulation and generation of new experimental growth data, providing high confidence in network contents and predictive applications. The network offers insight into algal metabolism and potential for genetic engineering and efficient light source design, a pioneering resource for studying light-driven metabolism and quantitative systems biology. Our approach to generate a predictive metabolic model integrated with cloned open reading frames provides a cost-effective platform to generate metabolic engineering resources. While the generated resources are specific to algal systems, the approach that we have developed is not specific to algae and can be readily expanded to other microbial systems as well as higher plants and animals.
Data compression for satellite images
NASA Technical Reports Server (NTRS)
Chen, P. H.; Wintz, P. A.
1976-01-01
An efficient data compression system is presented for satellite pictures and two grey level pictures derived from satellite pictures. The compression techniques take advantage of the correlation between adjacent picture elements. Several source coding methods are investigated. Double delta coding is presented and shown to be the most efficient. Both the predictive differential quantizing technique and double delta coding can be significantly improved by applying a background skipping technique. An extension code is constructed. This code requires very little storage space and operates efficiently. Simulation results are presented for various coding schemes and source codes.
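As a minimal sketch of the double delta idea (second-order differencing of sample values; the quantization and background-skipping steps described in the report are omitted, and this is not the exact coder used there):

```python
import numpy as np

def double_delta_encode(x):
    # First-order delta, then the delta of the deltas (second difference).
    d = np.diff(x, prepend=0)    # d[0] = x[0], d[i] = x[i] - x[i-1]
    dd = np.diff(d, prepend=0)   # dd[0] = d[0], dd[i] = d[i] - d[i-1]
    return dd

def double_delta_decode(dd):
    # Two cumulative sums invert the two differencing passes exactly.
    return np.cumsum(np.cumsum(dd))

row = np.array([10, 12, 15, 15, 14, 14, 20], dtype=np.int64)
assert np.array_equal(double_delta_decode(double_delta_encode(row)), row)
```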
High-efficiency Gaussian key reconciliation in continuous variable quantum key distribution
NASA Astrophysics Data System (ADS)
Bai, ZengLiang; Wang, XuYang; Yang, ShenShen; Li, YongMin
2016-01-01
Efficient reconciliation is a crucial step in continuous variable quantum key distribution. The progressive-edge-growth (PEG) algorithm is an efficient method to construct relatively short block length low-density parity-check (LDPC) codes. The quasi-cyclic construction method can extend short block length codes and further eliminate the shortest cycle. In this paper, by combining the PEG algorithm and quasi-cyclic construction method, we design long block length irregular LDPC codes with high error-correcting capacity. Based on these LDPC codes, we achieve high-efficiency Gaussian key reconciliation with slice reconciliation based on multilevel coding/multistage decoding with an efficiency of 93.7%.
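For orientation, the 93.7% figure refers to reconciliation efficiency, conventionally defined relative to the Gaussian-channel mutual information (a standard definition, not a derivation from this paper):

```latex
% Reconciliation efficiency: extracted rate R over the mutual information of
% the Gaussian channel at signal-to-noise ratio s
\beta = \frac{R}{I(X;Y)}, \qquad I(X;Y) = \tfrac{1}{2}\log_2(1 + s)
```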
Luan, Jun-Bo; Chen, Wenbo; Hasegawa, Daniel K; Simmons, Alvin M; Wintermantel, William M; Ling, Kai-Shu; Fei, Zhangjun; Liu, Shu-Sheng; Douglas, Angela E
2015-09-15
Genomic decay is a common feature of intracellular bacteria that have entered into symbiosis with plant sap-feeding insects. This study of the whitefly Bemisia tabaci and two bacteria (Portiera aleyrodidarum and Hamiltonella defensa) cohoused in each host cell investigated whether the decay of Portiera metabolism genes is complemented by host and Hamiltonella genes, and compared the metabolic traits of the whitefly symbiosis with other sap-feeding insects (aphids, psyllids, and mealybugs). Parallel genomic and transcriptomic analysis revealed that the host genome contributes multiple metabolic reactions that complement or duplicate Portiera function, and that Hamiltonella may contribute multiple cofactors and one essential amino acid, lysine. Homologs of the Bemisia metabolism genes of insect origin have also been implicated in essential amino acid synthesis in other sap-feeding insect hosts, indicative of parallel coevolution of shared metabolic pathways across multiple symbioses. Further metabolism genes coded in the Bemisia genome are of bacterial origin, but phylogenetically distinct from Portiera, Hamiltonella and horizontally transferred genes identified in other sap-feeding insects. Overall, 75% of the metabolism genes of bacterial origin are functionally unique to one symbiosis, indicating that the evolutionary history of metabolic integration in these symbioses is strongly contingent on the pattern of horizontally acquired genes. Our analysis, further, shows that bacteria with genomic decay enable host acquisition of complex metabolic pathways by multiple independent horizontal gene transfers from exogenous bacteria. Specifically, each horizontally acquired gene can function with other genes in the pathway coded by the symbiont, while facilitating the decay of the symbiont gene coding the same reaction. © The Author(s) 2015. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
Shih, Jing-Wen; Wang, Ling-Yu; Hung, Chiu-Lien; Kung, Hsing-Jien; Hsieh, Chia-Ling
2015-12-04
Hormone-refractory prostate cancer frequently relapses from therapy and inevitably progresses to a bone-metastatic status with no cure. Understanding of the molecular mechanisms conferring resistance to androgen deprivation therapy has the potential to lead to the discovery of novel therapeutic targets for this type of prostate cancer with poor prognosis. Progression to castration-resistant prostate cancer (CRPC) is characterized by aberrant androgen receptor (AR) expression and persistent AR signaling activity. Alterations in metabolic activity regulated by oncogenic pathways, such as c-Myc, were found to promote prostate cancer growth during the development of CRPC. Non-coding RNAs represent a diverse family of regulatory transcripts that drive tumorigenesis of prostate cancer and various other cancers by their hyperactivity or diminished function. A number of studies have examined differentially expressed non-coding RNAs in each stage of prostate cancer. Herein, we highlight the emerging impacts of microRNAs and long non-coding RNAs linked to reactivation of the AR signaling axis and reprogramming of the cellular metabolism in prostate cancer. The translational implications of non-coding RNA research for developing new biomarkers and therapeutic strategies for CRPC are also discussed.
Wieczorek, Aneta; Fornalewicz, Karolina; Mocarski, Łukasz; Łyżeń, Robert; Węgrzyn, Grzegorz
2018-04-15
Genetic evidence for a link between DNA replication and glycolysis was demonstrated a decade ago in Bacillus subtilis, where temperature-sensitive mutations in genes coding for replication proteins could be suppressed by mutations in genes of glycolytic enzymes. Then, a strong influence of dysfunctions of particular enzymes from the central carbon metabolism (CCM) on DNA replication and repair in Escherichia coli was reported. Therefore, we asked if such a link occurs only in bacteria or whether it is a more general phenomenon. Here, we demonstrate that effects of silencing (provoked by siRNA) of expression of genes coding for proteins involved in DNA replication and repair (primase, DNA polymerase ι, ligase IV, and topoisomerase IIIβ) on these processes (less efficient entry into the S phase of the cell cycle and decreased level of DNA synthesis) could be suppressed by silencing of specific genes of enzymes from CCM. Silencing of other pairs of replication/repair and CCM genes resulted in enhancement of the negative effects of lower expression levels of replication/repair genes. We suggest that these results may be proposed as genetic evidence for the link between DNA replication/repair and CCM in human cells, indicating that it is a common biological phenomenon, occurring from bacteria to humans. Copyright © 2018 Elsevier B.V. All rights reserved.
Non-Genomic Origins of Proteins and Metabolism
NASA Technical Reports Server (NTRS)
Pohorille, Andrew
2003-01-01
It is proposed that evolution of inanimate matter to cells endowed with a nucleic acid-based coding of genetic information was preceded by an evolutionary phase, in which peptides not coded by nucleic acids were able to self-organize into networks capable of evolution towards increasing metabolic complexity. Recent findings that truly different, simple peptides (Keefe and Szostak, 2001) can perform the same function (such as ATP binding) provide experimental support for this mechanism of early protobiological evolution. The central concept underlying this mechanism is that the reproduction of cellular functions alone was sufficient for self-maintenance of protocells, and that self-replication of macromolecules was not required at this stage of evolution. The precise transfer of information between successive generations of the earliest protocells was unnecessary and, possibly, undesirable. The key requirement in the initial stage of protocellular evolution was an ability to rapidly explore a large number of protein sequences in order to discover a set of molecules capable of supporting self-maintenance and growth of protocells. Undoubtedly, the essential protocellular functions were carried out by molecules not nearly as efficient or as specific as contemporary proteins. Many, potentially unrelated sequences could have performed each of these functions at an evolutionarily acceptable level. As evolution progressed, however, proteins must have performed their functions with increasing efficiency and specificity. This, in turn, put additional constraints on protein sequences and the fraction of proteins capable of performing their functions at the required level decreased. At some point, the likelihood of generating a sufficiently efficient set of proteins through a non-coded synthesis was so small that further evolution was not possible without storing information about the sequences of these proteins. Beyond this point, further evolution required coupling between proteins and informational polymers that is characteristic to all known forms of life. The emergence of such coupling must be postulated in any scenario of the origin of life, no matter whether it starts with RNA or proteins. To examine the evolutionary potential of non-genomic systems, a simple, computationally tractable model, which is still capable of capturing the essential features of the real system, has been studied computationally. Both constructive and destructive processes have been introduced into the model in a stochastic manner. Instead of assuming random reaction sets, only a suite of protobiologically plausible reactions has been considered. Peptides have been explicitly considered as protoenzymes and their catalytic efficiencies have been assigned on the basis of biochemical principles and experimental estimates. Simulations have been carried out using a novel approach (The Next Reaction Method) that is appropriate even for very low concentrations of reactants. Studies have focused on global autocatalytic processes and their diversity.
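The Next Reaction Method cited above is not reproduced here; as a minimal stand-in, the sketch below runs the simpler Gillespie direct method on an assumed toy birth-death system (constant peptide synthesis, linear degradation), which illustrates stochastic event selection at the low copy numbers relevant to protocells:

```python
import numpy as np

def gillespie_direct(k_syn=2.0, k_deg=0.1, n0=10, t_max=50.0, seed=0):
    """Gillespie direct-method SSA for a toy birth-death system; a stand-in
    for, not an implementation of, the Next Reaction Method."""
    rng = np.random.default_rng(seed)
    t, n, traj = 0.0, n0, [(0.0, n0)]
    while t < t_max:
        a = np.array([k_syn, k_deg * n])     # propensities: synthesis, degradation
        a_total = a.sum()
        if a_total <= 0:
            break
        t += rng.exponential(1.0 / a_total)              # waiting time to next event
        n += 1 if rng.random() < a[0] / a_total else -1  # which reaction fires
        traj.append((t, n))
    return traj

print(gillespie_direct()[-1])   # final (time, copy number)
```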
From chemical metabolism to life: the origin of the genetic coding process
2017-01-01
Looking for origins is so much rooted in ideology that most studies reflect opinions that fail to explore the first realistic scenarios. To be sure, trying to understand the origins of life should be based on what we know of current chemistry in the solar system and beyond. There, amino acids and very small compounds such as carbon dioxide, dihydrogen or dinitrogen and their immediate derivatives are ubiquitous. Surface-based chemical metabolism using these basic chemicals is the most likely beginning in which amino acids, coenzymes and phosphate-based small carbon molecules were built up. Nucleotides, and of course RNAs, must have come into being much later. As a consequence, the key question to account for life is to understand how chemical metabolism that began with amino acids progressively shaped into a coding process involving RNAs. Here I explore the role of building up complementarity rules as the first information-based process that allowed for the genetic code to emerge, after RNAs were substituted for surfaces to carry over the basic metabolic pathways that drive the pursuit of life. PMID:28684991
Meydan, Chanan; Bekenstein, Uriya; Soreq, Hermona
2018-01-01
Sepsis and metabolic syndrome (MetS) are both inflammation-related entities with high impact for human health and the consequences of concussions. Both represent imbalanced parasympathetic/cholinergic response to insulting triggers and variably uncontrolled inflammation that indicates shared upstream regulators, including short microRNAs (miRs) and long non-coding RNAs (lncRNAs). These may cross talk across multiple systems, leading to complex molecular and clinical outcomes. Notably, biomedical and RNA-sequencing based analyses both highlight new links between the acquired and inherited pathogenic, cardiac and inflammatory traits of sepsis/MetS. Those include the HOTAIR and MIAT lncRNAs and their targets, such as miR-122, -150, -155, -182, -197, -375, -608 and HLA-DRA. Implicating non-coding RNA regulators in sepsis and MetS may delineate novel high-value biomarkers and targets for intervention.
Maturation of metabolic connectivity of the adolescent rat brain
Choi, Hongyoon; Choi, Yoori; Kim, Kyu Wan; Kang, Hyejin; Hwang, Do Won; Kim, E Edmund; Chung, June-Key; Lee, Dong Soo
2015-01-01
Neuroimaging has been used to examine developmental changes of the brain. While PET studies revealed maturation-related changes, maturation of metabolic connectivity of the brain is not yet understood. Here, we show that rat brain metabolism is reconfigured to achieve long-distance connections with higher energy efficiency during maturation. Metabolism increased in anterior cerebrum and decreased in thalamus and cerebellum during maturation. When functional covariance patterns of PET images were examined, metabolic networks including default mode network (DMN) were extracted. Connectivity increased between the anterior and posterior parts of DMN and sensory-motor cortices during maturation. Energy efficiency, a ratio of connectivity strength to metabolism of a region, increased in medial prefrontal and retrosplenial cortices. Our data revealed that metabolic networks mature to increase metabolic connections and establish its efficiency between large-scale spatial components from childhood to early adulthood. Neurodevelopmental diseases might be understood by abnormal reconfiguration of metabolic connectivity and efficiency. DOI: http://dx.doi.org/10.7554/eLife.11571.001 PMID:26613413
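The abstract does not spell out the exact formula, but as stated the regional energy efficiency is a ratio of connectivity strength to metabolism; in assumed notation this could be written as:

```latex
% Assumed notation: efficiency of region i as summed connectivity strength to
% other regions divided by that region's regional glucose metabolism m_i
E_i = \frac{\sum_{j \ne i} |c_{ij}|}{m_i}
```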
NASA Astrophysics Data System (ADS)
Yu, Lianchun; Liu, Liwei
2014-03-01
The generation and conduction of action potentials (APs) represents a fundamental means of communication in the nervous system and is a metabolically expensive process. In this paper, we investigate the energy efficiency of neural systems in transferring pulse signals with APs. By analytically solving a bistable neuron model that mimics the AP generation with a particle crossing the barrier of a double well, we find the optimal number of ion channels that maximizes the energy efficiency of a neuron. We also investigate the energy efficiency of a neuron population in which the input pulse signals are represented with synchronized spikes and read out with a downstream coincidence detector neuron. We find an optimal number of neurons in neuron population, as well as the number of ion channels in each neuron that maximizes the energy efficiency. The energy efficiency also depends on the characters of the input signals, e.g., the pulse strength and the interpulse intervals. These results are confirmed by computer simulation of the stochastic Hodgkin-Huxley model with a detailed description of the ion channel random gating. We argue that the tradeoff between signal transmission reliability and energy cost may influence the size of the neural systems when energy use is constrained.
The balance sheet for transcription: an analysis of nuclear RNA metabolism in mammalian cells.
Jackson, D A; Pombo, A; Iborra, F
2000-02-01
The control of RNA synthesis from protein-coding genes is fundamental in determining the various cell types of higher eukaryotes. The activation of these genes is driven by promoter complexes, and RNA synthesis is performed by an enzyme mega-complex, the RNA polymerase II holoenzyme. These two complexes are the fundamental components required to initiate gene expression and generate the primary transcripts that, after processing, yield mRNAs that pass to the cytoplasm where protein synthesis occurs. But although this gene expression pathway has been studied intensively, aspects of RNA metabolism remain difficult to comprehend. In particular, it is unclear why >95% of RNA polymerized by polymerase II remains in the nucleus, where it is recycled. To explain this apparent paradox, this review presents a detailed description of nuclear RNA (nRNA) metabolism in mammalian cells. We evaluate the number of active transcription units, discuss the distribution of polymerases on active genes, and assess the efficiency with which the products mature and pass to the cytoplasm. Differences between the behavior of mRNAs on this productive pathway and primary transcripts that never leave the nucleus lead us to propose that these represent distinct populations. We discuss possible roles for nonproductive RNAs and present a model to describe the metabolism of these RNAs in the nuclei of mammalian cells.-Jackson, D. A., Pombo, A., Iborra, F. The balance sheet for transcription: an analysis of nuclear RNA metabolism in mammalian cells.
Efficient Polar Coding of Quantum Information
NASA Astrophysics Data System (ADS)
Renes, Joseph M.; Dupuis, Frédéric; Renner, Renato
2012-08-01
Polar coding, introduced in 2008 by Arıkan, is the first (very) efficiently encodable and decodable coding scheme whose information transmission rate provably achieves the Shannon bound for classical discrete memoryless channels in the asymptotic limit of large block sizes. Here, we study the use of polar codes for the transmission of quantum information. Focusing on the case of qubit Pauli channels and qubit erasure channels, we use classical polar codes to construct a coding scheme that asymptotically achieves a net transmission rate equal to the coherent information using efficient encoding and decoding operations and code construction. Our codes generally require preshared entanglement between sender and receiver, but for channels with a sufficiently low noise level we demonstrate that the rate of preshared entanglement required is zero.
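As a small, self-contained illustration of the classical polar transform these quantum constructions build on (this omits the bit-reversal permutation and frozen-bit selection, and is not the quantum coding scheme itself):

```python
import numpy as np

def polar_transform(u):
    """Apply Arikan's transform x = u * (F tensored n times) over GF(2),
    with F = [[1, 0], [1, 1]]; len(u) must be a power of two. Bit-reversal
    and frozen-bit handling are omitted for brevity."""
    n = len(u)
    if n == 1:
        return u.copy()
    half = n // 2
    top = polar_transform(u[:half] ^ u[half:])   # (u1 + u2) branch
    bottom = polar_transform(u[half:])           # u2 branch
    return np.concatenate([top, bottom])

u = np.array([1, 0, 1, 1], dtype=np.uint8)
print(polar_transform(u))   # -> [1 1 0 1] for N = 4
```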
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1988-01-01
During the period December 1, 1987 through May 31, 1988, progress was made in the following areas: construction of Multi-Dimensional Bandwidth Efficient Trellis Codes with MPSK modulation; performance analysis of Bandwidth Efficient Trellis Coded Modulation schemes; and performance analysis of Bandwidth Efficient Trellis Codes on Fading Channels.
Młynarski, Wiktor
2014-01-01
To date a number of studies have shown that receptive field shapes of early sensory neurons can be reproduced by optimizing coding efficiency of natural stimulus ensembles. A still unresolved question is whether the efficient coding hypothesis explains the formation of neurons which explicitly represent environmental features of different functional importance. This paper proposes that the spatial selectivity of higher auditory neurons emerges as a direct consequence of learning efficient codes for natural binaural sounds. Firstly, it is demonstrated that a linear efficient coding transform, Independent Component Analysis (ICA), trained on spectrograms of naturalistic simulated binaural sounds extracts spatial information present in the signal. A simple hierarchical ICA extension allowing for decoding of sound position is proposed. Furthermore, it is shown that units revealing spatial selectivity can be learned from a binaural recording of a natural auditory scene. In both cases a relatively small subpopulation of learned spectrogram features suffices to perform accurate sound localization. Representation of the auditory space is therefore learned in a purely unsupervised way by maximizing the coding efficiency and without any task-specific constraints. These results imply that efficient coding is a useful strategy for learning structures which allow for making behaviorally vital inferences about the environment. PMID:24639644
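For readers who want the flavor of the first step, the snippet below is a minimal stand-in (not the paper's pipeline or data): FastICA from scikit-learn is applied to a toy two-channel "binaural" mixture in which each sparse source reaches the two ears with a different level ratio, and the learned unmixing recovers those spatial filters.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_samples = 5000
sources = rng.laplace(size=(n_samples, 2))          # sparse, natural-sound-like sources
# Each source reaches the two "ears" with a different level ratio (its spatial cue).
mixing = np.array([[0.9, 0.1],                      # source 0: mostly left ear
                   [0.3, 0.7]])                     # source 1: mostly right ear
binaural = sources @ mixing

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(binaural)             # learned efficient code
print("estimated mixing (columns ~ spatial filters):\n", ica.mixing_)
```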
A reduced complexity highly power/bandwidth efficient coded FQPSK system with iterative decoding
NASA Technical Reports Server (NTRS)
Simon, M. K.; Divsalar, D.
2001-01-01
Based on a representation of FQPSK as a trellis-coded modulation, this paper investigates the potential improvement in power efficiency obtained from the application of simple outer codes to form a concatenated coding arrangement with iterative decoding.
Least reliable bits coding (LRBC) for high data rate satellite communications
NASA Technical Reports Server (NTRS)
Vanderaar, Mark; Budinger, James; Wagner, Paul
1992-01-01
LRBC, a bandwidth efficient multilevel/multistage block-coded modulation technique, is analyzed. LRBC uses simple multilevel component codes that provide increased error protection on increasingly unreliable modulated bits in order to maintain an overall high code rate that increases spectral efficiency. Soft-decision multistage decoding is used to make decisions on unprotected bits through corrections made on more protected bits. Analytical expressions and tight performance bounds are used to show that LRBC can achieve increased spectral efficiency and maintain equivalent or better power efficiency compared to that of BPSK. The relative simplicity of Galois field algebra vs the Viterbi algorithm and the availability of high-speed commercial VLSI for block codes indicate that LRBC using block codes is a desirable method for high data rate implementations.
Cooperative MIMO communication at wireless sensor network: an error correcting code approach.
Islam, Mohammad Rakibul; Han, Young Shin
2011-01-01
Cooperative communication in wireless sensor network (WSN) explores the energy efficient wireless communication schemes between multiple sensors and data gathering node (DGN) by exploiting multiple input multiple output (MIMO) and multiple input single output (MISO) configurations. In this paper, an energy efficient cooperative MIMO (C-MIMO) technique is proposed where low density parity check (LDPC) code is used as an error correcting code. The rate of the LDPC code is varied by varying the length of message and parity bits. Simulation results show that the cooperative communication scheme outperforms the SISO scheme in the presence of LDPC code. LDPC codes with different code rates are compared using bit error rate (BER) analysis. BER is also analyzed under different Nakagami fading scenarios. Energy efficiencies are compared for different targeted probability of bit error p(b). It is observed that C-MIMO performs more efficiently when the targeted p(b) is smaller. Also, a lower encoding rate for the LDPC code offers better error characteristics.
Cooperative MIMO Communication at Wireless Sensor Network: An Error Correcting Code Approach
Islam, Mohammad Rakibul; Han, Young Shin
2011-01-01
Cooperative communication in wireless sensor network (WSN) explores the energy efficient wireless communication schemes between multiple sensors and data gathering node (DGN) by exploiting multiple input multiple output (MIMO) and multiple input single output (MISO) configurations. In this paper, an energy efficient cooperative MIMO (C-MIMO) technique is proposed where low density parity check (LDPC) code is used as an error correcting code. The rate of the LDPC code is varied by varying the length of message and parity bits. Simulation results show that the cooperative communication scheme outperforms the SISO scheme in the presence of LDPC code. LDPC codes with different code rates are compared using bit error rate (BER) analysis. BER is also analyzed under different Nakagami fading scenarios. Energy efficiencies are compared for different targeted probability of bit error pb. It is observed that C-MIMO performs more efficiently when the targeted pb is smaller. Also, a lower encoding rate for the LDPC code offers better error characteristics. PMID:22163732
Pierce, Brandon L; Tong, Lin; Argos, Maria; Gao, Jianjun; Farzana, Jasmine; Roy, Shantanu; Paul-Brutus, Rachelle; Rahaman, Ronald; Rakibuz-Zaman, Muhammad; Parvez, Faruque; Ahmed, Alauddin; Quasem, Iftekhar; Hore, Samar K; Alam, Shafiul; Islam, Tariqul; Harjes, Judith; Sarwar, Golam; Slavkovich, Vesna; Gamble, Mary V; Chen, Yu; Yunus, Mohammad; Rahman, Mahfuzar; Baron, John A; Graziano, Joseph H; Ahsan, Habibul
2013-12-01
Arsenic exposure through drinking water is a serious global health issue. Observational studies suggest that individuals who metabolize arsenic efficiently are at lower risk for toxicities such as arsenical skin lesions. Using two single nucleotide polymorphisms (SNPs) in the 10q24.32 region (near AS3MT) that show independent associations with metabolism efficiency, Mendelian randomization can be used to assess whether the association between metabolism efficiency and skin lesions is likely to be causal. Using data on 2060 arsenic-exposed Bangladeshi individuals, we estimated associations for two 10q24.32 SNPs with relative concentrations of three urinary arsenic species (representing metabolism efficiency): inorganic arsenic (iAs), monomethylarsonic acid (MMA) and dimethylarsinic acid (DMA). SNP-based predictions of iAs%, MMA% and DMA% were tested for association with skin lesion status among 2483 cases and 2857 controls. Causal odds ratios for skin lesions were 0.90 (95% confidence interval [CI]: 0.87, 0.95), 1.19 (CI: 1.10, 1.28) and 1.23 (CI: 1.12, 1.36) for a one standard deviation increase in DMA%, MMA% and iAs%, respectively. We demonstrated genotype-arsenic interaction, with metabolism-related variants showing stronger associations with skin lesion risk among individuals with high arsenic exposure (synergy index: 1.37; CI: 1.11, 1.62). We provide strong evidence for a causal relationship between arsenic metabolism efficiency and skin lesion risk. Mendelian randomization can be used to assess the causal role of arsenic exposure and metabolism in a wide array of health conditions. Developing interventions that increase arsenic metabolism efficiency is likely to reduce the impact of arsenic exposure on health.
Scalable Coding of Plenoptic Images by Using a Sparse Set and Disparities.
Li, Yun; Sjostrom, Marten; Olsson, Roger; Jennehag, Ulf
2016-01-01
One of the light field capturing techniques is focused plenoptic capturing. By placing a microlens array in front of the photosensor, focused plenoptic cameras capture both spatial and angular information of a scene in each microlens image and across microlens images. The capturing results in a significant amount of redundant information, and the captured image is usually of a large resolution. A coding scheme that removes the redundancy before coding can be of advantage for efficient compression, transmission, and rendering. In this paper, we propose a lossy coding scheme to efficiently represent plenoptic images. The format contains a sparse image set and its associated disparities. The reconstruction is performed by disparity-based interpolation and inpainting, and the reconstructed image is later employed as a prediction reference for the coding of the full plenoptic image. As an outcome of the representation, the proposed scheme inherits a scalable structure with three layers. The results show that plenoptic images are compressed efficiently with over 60 percent bit rate reduction compared with High Efficiency Video Coding intra coding, and with over 20 percent compared with a High Efficiency Video Coding block copying mode.
Overcoming Codes and Standards Barriers to Innovations in Building Energy Efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Pamala C.; Gilbride, Theresa L.
2015-02-15
In this journal article, the authors discuss approaches to overcoming building code barriers to energy-efficiency innovations in home construction. Building codes have been a highly motivational force for increasing the energy efficiency of new homes in the United States in recent years. But as quickly as the codes seem to be changing, new products are coming to the market at an even more rapid pace, sometimes offering approaches and construction techniques unthought of when the current code was first proposed, which might have been several years before its adoption by various jurisdictions. Due to this delay, the codes themselves can become barriers to innovations that might otherwise be helping to further increase the efficiency, comfort, health or durability of new homes. The U.S. Department of Energy’s Building America, a program dedicated to improving the energy efficiency of America’s housing stock through research and education, is working with the U.S. housing industry through its research teams to help builders identify and remove code barriers to innovation in the home construction industry. The article addresses several approaches that builders use to achieve approval for innovative building techniques when code barriers appear to exist.
An Efficient Variable Length Coding Scheme for an IID Source
NASA Technical Reports Server (NTRS)
Cheung, K. -M.
1995-01-01
A scheme is examined for using two alternating Huffman codes to encode a discrete independent and identically distributed source with a dominant symbol. This combined strategy, or alternating runlength Huffman (ARH) coding, was found to be more efficient than ordinary coding in certain circumstances.
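A hedged illustration of the underlying idea (not the ARH scheme itself): replace runs of the dominant symbol by their lengths and Huffman-code the run lengths, which drives the average cost well below one bit per source symbol for a skewed source. The alphabet handling and code alternation of the actual ARH method are omitted here.

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Return a prefix code {symbol: bitstring} for the given frequency table."""
    heap = [[w, i, {s: ""}] for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate single-symbol source
        return {s: "0" for s in freqs}
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, i2, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, [w1 + w2, i2, merged])
    return heap[0][2]

def run_lengths(bits, dominant=0):
    """Lengths of dominant-symbol runs, each terminated by the rare symbol."""
    runs, count = [], 0
    for b in bits:
        if b == dominant:
            count += 1
        else:
            runs.append(count)
            count = 0
    runs.append(count)
    return runs

data = [0] * 40 + [1] + [0] * 25 + [1] + [0] * 60     # skewed source with dominant 0
runs = run_lengths(data)
code = huffman_code(Counter(runs))
encoded_bits = sum(len(code[r]) for r in runs)
print(f"{len(data)} source symbols -> {encoded_bits} coded bits "
      f"({encoded_bits / len(data):.2f} bits/symbol)")
```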
Binary video codec for data reduction in wireless visual sensor networks
NASA Astrophysics Data System (ADS)
Khursheed, Khursheed; Ahmad, Naeem; Imran, Muhammad; O'Nils, Mattias
2013-02-01
Wireless Visual Sensor Networks (WVSN) are formed by deploying many Visual Sensor Nodes (VSNs) in the field. Typical applications of WVSN include environmental monitoring, health care, industrial process monitoring, stadium/airport monitoring for security reasons, and many more. The energy budget in the outdoor applications of WVSN is limited to the batteries, and the frequent replacement of batteries is usually not desirable. So the processing as well as the communication energy consumption of the VSN needs to be optimized in such a way that the network remains functional for a longer duration. The images captured by a VSN contain a huge amount of data and require efficient computational resources for processing the images and wide communication bandwidth for the transmission of the results. Image processing algorithms must be designed and developed in such a way that they are computationally less complex and provide a high compression rate. For some applications of WVSN, the captured images can be segmented into bi-level images, and hence bi-level image coding methods will efficiently reduce the information amount in these segmented images. But the compression rate of the bi-level image coding methods is limited by the underlying compression algorithm. Hence there is a need for designing other intelligent and efficient algorithms which are computationally less complex and provide a better compression rate than that of bi-level image coding methods. Change coding is one such algorithm: it is computationally less complex (requiring only exclusive OR operations) and provides better compression efficiency compared to image coding, but it is effective only for applications having slight changes between adjacent frames of the video. The detection and coding of the Regions of Interest (ROIs) in the change frame efficiently reduce the information amount in the change frame. But if the number of objects in the change frames is higher than a certain level, then the compression efficiency of both the change coding and ROI coding becomes worse than that of image coding. This paper explores the compression efficiency of the Binary Video Codec (BVC) for data reduction in WVSN. We propose to implement all three compression techniques, i.e., image coding, change coding and ROI coding, at the VSN and then select the smallest bit stream among the results of the three techniques. In this way the compression performance of the BVC will never become worse than that of image coding. We conclude that the compression efficiency of BVC is always better than that of change coding and is always better than or equal to that of ROI coding and image coding.
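The selection logic described above can be sketched in a few lines (the bi-level coder is replaced here by zlib as a stand-in and ROI offset signalling is omitted, so this is an assumption-laden illustration rather than the BVC itself): compute the intra candidate, the XOR change frame, and the ROI crop of the change frame, then keep whichever candidate bitstream is smallest.

```python
import numpy as np
import zlib

def smallest_candidate(prev_frame, frame):
    """Return (label, payload) for the shortest of image/change/ROI codings."""
    change = np.bitwise_xor(prev_frame, frame)          # change coding: XOR of frames
    ys, xs = np.nonzero(change)
    if ys.size:                                          # bounding box of changed pixels
        roi = change[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    else:
        roi = change[:1, :1]
    candidates = {
        "image":  zlib.compress(np.packbits(frame).tobytes()),
        "change": zlib.compress(np.packbits(change).tobytes()),
        "roi":    zlib.compress(np.packbits(roi).tobytes()),
    }
    return min(candidates.items(), key=lambda kv: len(kv[1]))

rng = np.random.default_rng(1)
prev = (rng.random((64, 64)) > 0.5).astype(np.uint8)
curr = prev.copy()
curr[10:14, 20:26] ^= 1                                  # a small moving object
label, payload = smallest_candidate(prev, curr)
print(label, len(payload), "bytes")
```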
Coding For Compression Of Low-Entropy Data
NASA Technical Reports Server (NTRS)
Yeh, Pen-Shu
1994-01-01
Improved method of encoding digital data provides for efficient lossless compression of partially or even mostly redundant data from low-information-content source. Method of coding implemented in relatively simple, high-speed arithmetic and logic circuits. Also increases coding efficiency beyond that of established Huffman coding method in that average number of bits per code symbol can be less than 1, which is the lower bound for Huffman code.
Near-optimal experimental design for model selection in systems biology.
Busetto, Alberto Giovanni; Hauser, Alain; Krummenacher, Gabriel; Sunnåker, Mikael; Dimopoulos, Sotiris; Ong, Cheng Soon; Stelling, Jörg; Buhmann, Joachim M
2013-10-15
Biological systems are understood through iterations of modeling and experimentation. Not all experiments, however, are equally valuable for predictive modeling. This study introduces an efficient method for experimental design aimed at selecting dynamical models from data. Motivated by biological applications, the method enables the design of crucial experiments: it determines a highly informative selection of measurement readouts and time points. We demonstrate formal guarantees of design efficiency on the basis of previous results. By reducing our task to the setting of graphical models, we prove that the method finds a near-optimal design selection with a polynomial number of evaluations. Moreover, the method exhibits the best polynomial-complexity constant approximation factor, unless P = NP. We measure the performance of the method in comparison with established alternatives, such as ensemble non-centrality, on example models of different complexity. Efficient design accelerates the loop between modeling and experimentation: it enables the inference of complex mechanisms, such as those controlling central metabolic operation. Toolbox 'NearOED' available with source code under GPL on the Machine Learning Open Source Software Web site (mloss.org).
DCT based interpolation filter for motion compensation in HEVC
NASA Astrophysics Data System (ADS)
Alshin, Alexander; Alshina, Elena; Park, Jeong Hoon; Han, Woo-Jin
2012-10-01
The High Efficiency Video Coding (HEVC) draft standard has the challenging goal of doubling coding efficiency compared to H.264/AVC. Many aspects of the traditional hybrid coding framework were improved during new standard development. Motion compensated prediction, in particular the interpolation filter, is one area that was improved significantly over H.264/AVC. This paper presents the details of the interpolation filter design of the draft HEVC standard. The coding efficiency improvement over the H.264/AVC interpolation filter is studied and experimental results are presented, which show a 4.0% average bitrate reduction for the Luma component and an 11.3% average bitrate reduction for the Chroma component. The coding efficiency gains are significant for some video sequences and can reach up to 21.7%.
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1989-01-01
The performance of bandwidth efficient trellis codes on channels with phase jitter, or those disturbed by jamming and impulse noise, is analyzed. A heuristic algorithm for construction of bandwidth efficient trellis codes with any constraint length up to about 30, any signal constellation, and any code rate was developed. Construction of good distance profile trellis codes for sequential decoding and comparison of random coding bounds of trellis coded modulation schemes are also discussed.
NIBBS-search for fast and accurate prediction of phenotype-biased metabolic systems.
Schmidt, Matthew C; Rocha, Andrea M; Padmanabhan, Kanchana; Shpanskaya, Yekaterina; Banfield, Jill; Scott, Kathleen; Mihelcic, James R; Samatova, Nagiza F
2012-01-01
Understanding of genotype-phenotype associations is important not only for furthering our knowledge on internal cellular processes, but also essential for providing the foundation necessary for genetic engineering of microorganisms for industrial use (e.g., production of bioenergy or biofuels). However, genotype-phenotype associations alone do not provide enough information to alter an organism's genome to either suppress or exhibit a phenotype. It is important to look at the phenotype-related genes in the context of the genome-scale network to understand how the genes interact with other genes in the organism. Identification of metabolic subsystems involved in the expression of the phenotype is one way of placing the phenotype-related genes in the context of the entire network. A metabolic system refers to a metabolic network subgraph; nodes are compounds and edges labels are the enzymes that catalyze the reaction. The metabolic subsystem could be part of a single metabolic pathway or span parts of multiple pathways. Arguably, comparative genome-scale metabolic network analysis is a promising strategy to identify these phenotype-related metabolic subsystems. Network Instance-Based Biased Subgraph Search (NIBBS) is a graph-theoretic method for genome-scale metabolic network comparative analysis that can identify metabolic systems that are statistically biased toward phenotype-expressing organismal networks. We set up experiments with target phenotypes like hydrogen production, TCA expression, and acid-tolerance. We show via extensive literature search that some of the resulting metabolic subsystems are indeed phenotype-related and formulate hypotheses for other systems in terms of their role in phenotype expression. NIBBS is also orders of magnitude faster than MULE, one of the most efficient maximal frequent subgraph mining algorithms that could be adjusted for this problem. Also, the set of phenotype-biased metabolic systems output by NIBBS comes very close to the set of phenotype-biased subgraphs output by an exact maximally-biased subgraph enumeration algorithm ( MBS-Enum ). The code (NIBBS and the module to visualize the identified subsystems) is available at http://freescience.org/cs/NIBBS.
NIBBS-Search for Fast and Accurate Prediction of Phenotype-Biased Metabolic Systems
Padmanabhan, Kanchana; Shpanskaya, Yekaterina; Banfield, Jill; Scott, Kathleen; Mihelcic, James R.; Samatova, Nagiza F.
2012-01-01
Understanding of genotype-phenotype associations is important not only for furthering our knowledge on internal cellular processes, but also essential for providing the foundation necessary for genetic engineering of microorganisms for industrial use (e.g., production of bioenergy or biofuels). However, genotype-phenotype associations alone do not provide enough information to alter an organism's genome to either suppress or exhibit a phenotype. It is important to look at the phenotype-related genes in the context of the genome-scale network to understand how the genes interact with other genes in the organism. Identification of metabolic subsystems involved in the expression of the phenotype is one way of placing the phenotype-related genes in the context of the entire network. A metabolic system refers to a metabolic network subgraph; nodes are compounds and edges labels are the enzymes that catalyze the reaction. The metabolic subsystem could be part of a single metabolic pathway or span parts of multiple pathways. Arguably, comparative genome-scale metabolic network analysis is a promising strategy to identify these phenotype-related metabolic subsystems. Network Instance-Based Biased Subgraph Search (NIBBS) is a graph-theoretic method for genome-scale metabolic network comparative analysis that can identify metabolic systems that are statistically biased toward phenotype-expressing organismal networks. We set up experiments with target phenotypes like hydrogen production, TCA expression, and acid-tolerance. We show via extensive literature search that some of the resulting metabolic subsystems are indeed phenotype-related and formulate hypotheses for other systems in terms of their role in phenotype expression. NIBBS is also orders of magnitude faster than MULE, one of the most efficient maximal frequent subgraph mining algorithms that could be adjusted for this problem. Also, the set of phenotype-biased metabolic systems output by NIBBS comes very close to the set of phenotype-biased subgraphs output by an exact maximally-biased subgraph enumeration algorithm ( MBS-Enum ). The code (NIBBS and the module to visualize the identified subsystems) is available at http://freescience.org/cs/NIBBS. PMID:22589706
CREPT-MCNP code for efficiency calibration of HPGe detectors with the representative point method.
Saegusa, Jun
2008-01-01
The representative point method for the efficiency calibration of volume samples has been previously proposed. For smoothly implementing the method, a calculation code named CREPT-MCNP has been developed. The code estimates the position of a representative point which is intrinsic to each shape of volume sample. The self-absorption correction factors are also given to make correction on the efficiencies measured at the representative point with a standard point source. Features of the CREPT-MCNP code are presented.
Least Reliable Bits Coding (LRBC) for high data rate satellite communications
NASA Technical Reports Server (NTRS)
Vanderaar, Mark; Wagner, Paul; Budinger, James
1992-01-01
An analysis and discussion of a bandwidth efficient multi-level/multi-stage block coded modulation technique called Least Reliable Bits Coding (LRBC) is presented. LRBC uses simple multi-level component codes that provide increased error protection on increasingly unreliable modulated bits in order to maintain an overall high code rate that increases spectral efficiency. Further, soft-decision multi-stage decoding is used to make decisions on unprotected bits through corrections made on more protected bits. Using analytical expressions and tight performance bounds it is shown that LRBC can achieve increased spectral efficiency and maintain equivalent or better power efficiency compared to that of Binary Phase Shift Keying (BPSK). Bit error rates (BER) vs. channel bit energy with Additive White Gaussian Noise (AWGN) are given for a set of LRB Reed-Solomon (RS) encoded 8PSK modulation formats with an ensemble rate of 8/9. All formats exhibit a spectral efficiency of 2.67 = (log2(8))(8/9) information bps/Hz. Bit-by-bit coded and uncoded error probabilities with soft-decision information are determined. These are traded off against code rate to determine parameters that achieve good performance. The relative simplicity of Galois field algebra vs. the Viterbi algorithm and the availability of high speed commercial Very Large Scale Integration (VLSI) for block codes indicate that LRBC using block codes is a desirable method for high data rate implementations.
Concatenated Coding Using Trellis-Coded Modulation
NASA Technical Reports Server (NTRS)
Thompson, Michael W.
1997-01-01
In the late seventies and early eighties a technique known as Trellis Coded Modulation (TCM) was developed for providing spectrally efficient error correction coding. Instead of adding redundant information in the form of parity bits, redundancy is added at the modulation stage, thereby increasing bandwidth efficiency. A digital communications system can be designed to use bandwidth-efficient multilevel/phase modulation such as Amplitude Shift Keying (ASK), Phase Shift Keying (PSK), Differential Phase Shift Keying (DPSK) or Quadrature Amplitude Modulation (QAM). Performance gain can be achieved by increasing the number of signals over the corresponding uncoded system to compensate for the redundancy introduced by the code. A considerable amount of research and development has been devoted toward developing good TCM codes for severely bandlimited applications. More recently, the use of TCM for satellite and deep space communications applications has received increased attention. This report describes the general approach of using a concatenated coding scheme that features TCM and RS coding. Results have indicated that substantial (6-10 dB) performance gains can be achieved with this approach with comparatively little bandwidth expansion. Since all of the bandwidth expansion is due to the RS code, we see that TCM based concatenated coding results in roughly 10-50% bandwidth expansion compared to 70-150% expansion for a similar concatenated scheme which uses a convolutional code. We stress that combined coding and modulation optimization is important for achieving performance gains while maintaining spectral efficiency.
Bandwidth efficient CCSDS coding standard proposals
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.; Perez, Lance C.; Wang, Fu-Quan
1992-01-01
The basic concatenated coding system for the space telemetry channel consists of a Reed-Solomon (RS) outer code, a symbol interleaver/deinterleaver, and a bandwidth efficient trellis inner code. A block diagram of this configuration is shown. The system may operate with or without the outer code and interleaver. In this recommendation, the outer code remains the (255,223) RS code over GF(2 exp 8) with an error correcting capability of t = 16 eight bit symbols. This code's excellent performance and the existence of fast, cost effective, decoders justify its continued use. The purpose of the interleaver/deinterleaver is to distribute burst errors out of the inner decoder over multiple codewords of the outer code. This utilizes the error correcting capability of the outer code more efficiently and reduces the probability of an RS decoder failure. Since the space telemetry channel is not considered bursty, the required interleaving depth is primarily a function of the inner decoding method. A diagram of an interleaver with depth 4 that is compatible with the (255,223) RS code is shown. Specific interleaver requirements are discussed after the inner code recommendations.
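A small sketch of the depth-4 symbol interleaver discussed above (the array layout is an assumption; the recommendation defines the exact symbol ordering): four RS codewords are written as rows and read out column by column, so a burst of inner-decoder errors is spread across all four outer codewords.

```python
import numpy as np

DEPTH, N = 4, 255

def interleave(codewords):
    """codewords: DEPTH x N array of RS symbols -> interleaved 1-D symbol stream."""
    return np.asarray(codewords).reshape(DEPTH, N).T.reshape(-1)

def deinterleave(stream):
    return np.asarray(stream).reshape(N, DEPTH).T

cw = np.arange(DEPTH * N).reshape(DEPTH, N)     # stand-in for four RS codewords
assert np.array_equal(deinterleave(interleave(cw)), cw)
# A burst of 8 consecutive channel symbols touches each codeword at most twice:
burst_positions = np.arange(100, 108)
rows_hit = np.unique(burst_positions % DEPTH)
print("codewords touched by an 8-symbol burst:", rows_hit)
```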
Optimized atom position and coefficient coding for matching pursuit-based image compression.
Shoa, Alireza; Shirani, Shahram
2009-12-01
In this paper, we propose a new encoding algorithm for matching pursuit image coding. We show that coding performance is improved when correlations between atom positions and atom coefficients are both used in encoding. We find the optimum tradeoff between efficient atom position coding and efficient atom coefficient coding and optimize the encoder parameters. Our proposed algorithm outperforms the existing coding algorithms designed for matching pursuit image coding. Additionally, we show that our algorithm results in better rate distortion performance than JPEG 2000 at low bit rates.
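As context for the position/coefficient statistics exploited above, the following minimal matching-pursuit routine (an illustration, not the paper's encoder) shows where those two quantities come from: each greedy iteration emits an atom index (position) and a projection coefficient, which are exactly the symbols the entropy coder must represent.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms):
    """dictionary: (n_atoms_total, dim) array whose rows have unit norm."""
    residual = signal.astype(float).copy()
    picks = []
    for _ in range(n_atoms):
        corr = dictionary @ residual
        k = int(np.argmax(np.abs(corr)))
        coef = corr[k]
        picks.append((k, coef))                 # (atom position, atom coefficient)
        residual -= coef * dictionary[k]
    return picks, residual

rng = np.random.default_rng(0)
D = rng.standard_normal((128, 32))
D /= np.linalg.norm(D, axis=1, keepdims=True)   # unit-norm atoms
x = 2.0 * D[5] - 1.5 * D[77] + 0.01 * rng.standard_normal(32)
picks, res = matching_pursuit(x, D, 4)
print(picks[:2], "residual norm:", np.linalg.norm(res))
```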
Mastronicola, Daniela; Testa, Fabrizio; Forte, Elena; Bordi, Eugenio; Pucillo, Leopoldo Paolo; Sarti, Paolo; Giuffrè, Alessandro
2010-09-03
Flavohemoglobins (flavoHbs), commonly found in bacteria and fungi, afford protection from nitrosative stress by degrading nitric oxide (NO) to nitrate. Giardia intestinalis, a microaerophilic parasite causing one of the most common intestinal human infectious diseases worldwide, is the only pathogenic protozoon as yet identified coding for a flavoHb. By NO amperometry we show that, in the presence of NADH, the recombinant Giardia flavoHb metabolizes NO with high efficacy under aerobic conditions (TN = 116 ± 10 s^(-1) at 1 μM NO, T = 37 °C). The activity is [O2]-dependent and characterized by an apparent K(M,O2) = 22 ± 7 μM. Immunoblotting analysis shows that the protein is expressed at low levels in the vegetative trophozoites of Giardia; accordingly, these cells aerobically metabolize NO with low efficacy. Interestingly, in response to nitrosative stress (24-h incubation with 5 mM nitrite) flavoHb expression is enhanced and the trophozoites thereby become able to metabolize NO efficiently, the activity being sensitive to both cyanide and carbon monoxide. The NO-donors S-nitrosoglutathione (GSNO) and DETA-NONOate mimicked the effect of nitrite on flavoHb expression. We propose that physiologically flavoHb contributes to NO detoxification in G. intestinalis. Copyright 2010 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Sikder, Somali; Ghosh, Shila
2018-02-01
This paper presents the construction of unipolar transposed modified Walsh code (TMWC) and analysis of its performance in optical code-division multiple-access (OCDMA) systems. Specifically, the signal-to-noise ratio, bit error rate (BER), cardinality, and spectral efficiency were investigated. The theoretical analysis demonstrated that the wavelength-hopping time-spreading system using TMWC was robust against multiple-access interference and more spectrally efficient than systems using other existing OCDMA codes. In particular, the spectral efficiency was calculated to be 1.0370 when TMWC of weight 3 was employed. The BER and eye pattern for the designed TMWC were also successfully obtained using OptiSystem simulation software. The results indicate that the proposed code design is promising for enhancing network capacity.
Probabilistic Amplitude Shaping With Hard Decision Decoding and Staircase Codes
NASA Astrophysics Data System (ADS)
Sheikh, Alireza; Amat, Alexandre Graell i.; Liva, Gianluigi; Steiner, Fabian
2018-05-01
We consider probabilistic amplitude shaping (PAS) as a means of increasing the spectral efficiency of fiber-optic communication systems. In contrast to previous works in the literature, we consider probabilistic shaping with hard decision decoding (HDD). In particular, we apply the PAS recently introduced by Böcherer et al. to a coded modulation (CM) scheme with bit-wise HDD that uses a staircase code as the forward error correction code. We show that the CM scheme with PAS and staircase codes yields significant gains in spectral efficiency with respect to the baseline scheme using a staircase code and a standard constellation with uniformly distributed signal points. Using a single staircase code, the proposed scheme achieves performance within 0.57-1.44 dB of the corresponding achievable information rate for a wide range of spectral efficiencies.
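The shaping half of PAS can be illustrated in a few lines (the FEC, sign bits, and distribution matcher are omitted, and the lambda values are arbitrary assumptions): amplitudes of an 8-ASK alphabet are drawn from a Maxwell-Boltzmann distribution rather than uniformly, trading entropy per amplitude against average energy.

```python
import numpy as np

amplitudes = np.array([1, 3, 5, 7], dtype=float)      # |8-ASK| amplitude levels

def mb_distribution(lam):
    """Maxwell-Boltzmann shaping distribution over the amplitude levels."""
    p = np.exp(-lam * amplitudes**2)
    return p / p.sum()

def entropy(p):
    return float(-(p * np.log2(p)).sum())

for lam in (0.0, 0.02, 0.05):                          # lam=0 recovers uniform signaling
    p = mb_distribution(lam)
    avg_energy = float((p * amplitudes**2).sum())
    print(f"lambda={lam:.2f}  P={np.round(p, 3)}  "
          f"H={entropy(p):.3f} bits  E|a|^2={avg_energy:.2f}")
```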
Space shuttle main engine numerical modeling code modifications and analysis
NASA Technical Reports Server (NTRS)
Ziebarth, John P.
1988-01-01
The user of computational fluid dynamics (CFD) codes must be concerned with the accuracy and efficiency of the codes if they are to be used for timely design and analysis of complicated three-dimensional fluid flow configurations. A brief discussion of how accuracy and efficiency affect the CFD solution process is given. A more detailed discussion of how efficiency can be enhanced by using a few Cray Research Inc. utilities to address vectorization is presented, and these utilities are applied to a three-dimensional Navier-Stokes CFD code (INS3D).
Hardware-efficient bosonic quantum error-correcting codes based on symmetry operators
NASA Astrophysics Data System (ADS)
Niu, Murphy Yuezhen; Chuang, Isaac L.; Shapiro, Jeffrey H.
2018-03-01
We establish a symmetry-operator framework for designing quantum error-correcting (QEC) codes based on fundamental properties of the underlying system dynamics. Based on this framework, we propose three hardware-efficient bosonic QEC codes that are suitable for χ(2)-interaction based quantum computation in multimode Fock bases: the χ(2) parity-check code, the χ(2) embedded error-correcting code, and the χ(2) binomial code. All of these QEC codes detect photon-loss or photon-gain errors by means of photon-number parity measurements, and then correct them via χ(2) Hamiltonian evolutions and linear-optics transformations. Our symmetry-operator framework provides a systematic procedure for finding QEC codes that are not stabilizer codes, and it enables convenient extension of a given encoding to higher-dimensional qudit bases. The χ(2) binomial code is of special interest because, with m ≤ N identified from channel monitoring, it can correct m-photon-loss errors, or m-photon-gain errors, or (m-1)th-order dephasing errors using logical qudits that are encoded in O(N) photons. In comparison, other bosonic QEC codes require O(N^2) photons to correct the same degree of bosonic errors. Such improved photon efficiency underscores the additional error-correction power that can be provided by channel monitoring. We develop quantum Hamming bounds for photon-loss errors in the code subspaces associated with the χ(2) parity-check code and the χ(2) embedded error-correcting code, and we prove that these codes saturate their respective bounds. Our χ(2) QEC codes exhibit hardware efficiency in that they address the principal error mechanisms and exploit the available physical interactions of the underlying hardware, thus reducing the physical resources required for implementing their encoding, decoding, and error-correction operations, and their universal encoded-basis gate sets.
Zhang, Yunfang; Zhang, Xudong; Shi, Junchao; Tuorto, Francesca; Li, Xin; Liu, Yusheng; Liebers, Reinhard; Zhang, Liwen; Qu, Yongcun; Qian, Jingjing; Pahima, Maya; Liu, Ying; Yan, Menghong; Cao, Zhonghong; Lei, Xiaohua; Cao, Yujing; Peng, Hongying; Liu, Shichao; Wang, Yue; Zheng, Huili; Woolsey, Rebekah; Quilici, David; Zhai, Qiwei; Li, Lei; Zhou, Tong; Yan, Wei; Lyko, Frank; Zhang, Ying; Zhou, Qi; Duan, Enkui; Chen, Qi
2018-05-01
The discovery of RNAs (for example, messenger RNAs, non-coding RNAs) in sperm has opened the possibility that sperm may function by delivering additional paternal information aside from solely providing the DNA [1]. Increasing evidence now suggests that sperm small non-coding RNAs (sncRNAs) can mediate intergenerational transmission of paternally acquired phenotypes, including mental stress [2,3] and metabolic disorders [4-6]. How sperm sncRNAs encode paternal information remains unclear, but the mechanism may involve RNA modifications. Here we show that deletion of a mouse tRNA methyltransferase, DNMT2, abolished sperm sncRNA-mediated transmission of high-fat-diet-induced metabolic disorders to offspring. Dnmt2 deletion prevented the elevation of RNA modifications (m5C, m2G) in sperm 30-40 nt RNA fractions that are induced by a high-fat diet. Also, Dnmt2 deletion altered the sperm small RNA expression profile, including levels of tRNA-derived small RNAs and rRNA-derived small RNAs, which might be essential in composing a sperm RNA 'coding signature' that is needed for paternal epigenetic memory. Finally, we show that Dnmt2-mediated m5C contributes to the secondary structure and biological properties of sncRNAs, implicating sperm RNA modifications as an additional layer of paternal hereditary information.
Green's function methods in heavy ion shielding
NASA Technical Reports Server (NTRS)
Wilson, John W.; Costen, Robert C.; Shinn, Judy L.; Badavi, Francis F.
1993-01-01
An analytic solution to the heavy ion transport in terms of Green's function is used to generate a highly efficient computer code for space applications. The efficiency of the computer code is accomplished by a nonperturbative technique extending Green's function over the solution domain. The computer code can also be applied to accelerator boundary conditions to allow code validation in laboratory experiments.
Enhancing Scalability and Efficiency of the TOUGH2_MP for LinuxClusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Keni; Wu, Yu-Shu
2006-04-17
TOUGH2_MP, the parallel version of the TOUGH2 code, has been enhanced by implementing more efficient communication schemes. This enhancement is achieved through reducing the number of small-size messages and the volume of large messages. The message exchange speed is further improved by using non-blocking communications for both linear and nonlinear iterations. In addition, we have modified the AZTEC parallel linear-equation solver to use nonblocking communication. Through the improvement of code structuring and bug fixing, the new version of the code is now more stable, while demonstrating similar or even better nonlinear iteration convergence speed than the original TOUGH2 code. As a result, the new version of TOUGH2_MP is improved significantly in its efficiency. In this paper, the scalability and efficiency of the parallel code are demonstrated by solving two large-scale problems. The testing results indicate that speedup of the code may depend on both problem size and complexity. In general, the code has excellent scalability in memory requirement as well as computing time.
Turpin, Williams; Weiman, Marion; Guyot, Jean-Pierre; Lajus, Aurélie; Cruveiller, Stéphane; Humblot, Christèle
2018-02-02
The objective of this work was to investigate the nutritional potential of Lactobacillus plantarum A6 in a food matrix using next generation sequencing. To this end, we characterized the genome of the A6 strain for a complete overview of its potential. We then compared its transcriptome when grown in a food matrix made from pearl millet with its transcriptome when cultivated in a laboratory medium. Genomic comparison of the strain L. plantarum A6 with the strains WCFS1, ST-III, JDM1 and ATCC14917 led to the identification of five regions of genomic plasticity. More specifically, 362 coding sequences, mostly annotated as coding for proteins of unknown functions, were specific to L. plantarum A6. A total of 1201 genes were significantly differentially expressed between the laboratory medium and the food matrix. Among them, 821 genes were up-regulated in the food matrix compared to the laboratory medium, representing 23% of all genomic objects. In the laboratory medium, the expression of 380 genes, representing 11% of all genomic objects, was at least double that in the food matrix. Genes encoding functions important for the nutritional quality of the food were identified. Considering its efficiency as an amylolytic strain, we investigated all genes involved in carbohydrate metabolism, paying particular attention to starch metabolism. An extracellular alpha amylase, a neopullulanase and maltodextrin transporters were identified, all of which were highly expressed in the food matrix. In addition, genes involved in alpha-galactoside metabolism were identified, but only two of them were more strongly induced in the food matrix than in the laboratory medium. This may be because alpha-galactosides were already eliminated during soaking. Different biosynthetic pathways involved in the synthesis of B vitamins (folate, riboflavin, and cobalamin) were identified. They point to a potential for vitamin synthesis, which should be confirmed through biochemical analysis in further work. Surprisingly, some genes involved in the metabolism and bioaccessibility of iron were identified. They were related directly to the use or transport of iron, or indirectly to the metabolism of polyphenols, which are responsible for iron chelation in the food. The combination of genomics and transcriptomics not only revealed previously undocumented nutritional properties of L. plantarum A6, but also documented the behaviour of this bacterium in food. Copyright © 2017 Elsevier B.V. All rights reserved.
Protein-DNA binding dynamics predict transcriptional response to nutrients in archaea.
Todor, Horia; Sharma, Kriti; Pittman, Adrianne M C; Schmid, Amy K
2013-10-01
Organisms across all three domains of life use gene regulatory networks (GRNs) to integrate varied stimuli into coherent transcriptional responses to environmental pressures. However, inferring GRN topology and regulatory causality remains a central challenge in systems biology. Previous work characterized TrmB as a global metabolic transcription factor in archaeal extremophiles. However, it remains unclear how TrmB dynamically regulates its ∼100 metabolic enzyme-coding gene targets. Using a dynamic perturbation approach, we elucidate the topology of the TrmB metabolic GRN in the model archaeon Halobacterium salinarum. Clustering of dynamic gene expression patterns reveals that TrmB functions alone to regulate central metabolic enzyme-coding genes but cooperates with various regulators to control peripheral metabolic pathways. Using a dynamical model, we predict gene expression patterns for some TrmB-dependent promoters and infer secondary regulators for others. Our data suggest feed-forward gene regulatory topology for cobalamin biosynthesis. In contrast, purine biosynthesis appears to require TrmB-independent regulators. We conclude that TrmB is an important component for mediating metabolic modularity, integrating nutrient status and regulating gene expression dynamics alone and in concert with secondary regulators.
Bardoni, Barbara; Abekhoukh, Sabiha; Zongaro, Samantha; Melko, Mireille
2012-01-01
Intellectual disability (ID) is the most frequent cause of serious handicap in children and young adults and affects 2-3% of the worldwide population, representing a serious problem from the medical, social, and economic points of view. The causes are very heterogeneous. Genes involved in ID have various functions altering different pathways important in neuronal function. Regulation of mRNA metabolism is particularly important in neurons for synaptic structure and function. Here, we review ID due to alteration of mRNA metabolism. Functional absence of some RNA-binding proteins--namely, FMRP, FMR2P, PQBP1, UFP3B, VCX-A--causes different forms of ID. These proteins are involved in different steps of RNA metabolism and, even if a detailed analysis of their RNA targets has been performed so far only for FMRP, it appears clear that they modulate some aspects (translation, stability, transport, and sublocalization) of a subset of RNAs coding for proteins, whose function must be relevant for neurons. Two other proteins, DYRK1A and CDKL5, involved in Down syndrome and Rett syndrome, respectively, have been shown to have an impact on splicing efficiency of specific mRNAs. Both proteins are kinases and their effect is indirect. Interestingly, both are localized in nuclear speckles, the nuclear domains where splicing factors are assembled, stored, and recycled, and influence their biogenesis and/or their organization. Copyright © 2012 Elsevier B.V. All rights reserved.
Regulation of neuraminidase expression in Streptococcus pneumoniae
2012-01-01
Background Sialic acid (N-acetylneuraminic acid; NeuNAc) is one of the most important carbohydrates for Streptococcus pneumoniae due to its role as a carbon and energy source, receptor for adhesion and invasion and molecular signal for promotion of biofilm formation, nasopharyngeal carriage and invasion of the lung. Results In this work, NeuNAc and its metabolic derivative N-acetyl mannosamine (ManNAc) were used to analyze regulatory mechanisms of the neuraminidase locus expression. Genomic and metabolic comparison to Streptococcus mitis, Streptococcus oralis, Streptococcus gordonii and Streptococcus sanguinis elucidates the metabolic association of the two amino sugars to different parts of the locus coding for the two main pneumococcal neuraminidases and confirms the substrate specificity of the respective ABC transporters. Quantitative gene expression analysis shows repression of the locus by glucose and induction of all predicted transcriptional units by ManNAc and NeuNAc, each inducing with higher efficiency the operon encoding for the transporter with higher specificity for the respective amino sugar. Cytofluorimetric analysis demonstrated enhanced surface exposure of NanA on pneumococci grown in NeuNAc and ManNAc, and an activity assay allowed quantification of approximately twelve times as much neuraminidase activity on induced cells as on glucose-grown cells. Conclusions The present data increase the understanding of metabolic regulation of the nanAB locus and indicate that experiments aimed at the elucidation of the relevance of neuraminidases in pneumococcal virulence should possibly not be carried out on bacteria grown in glucose-containing media. PMID:22963456
A concatenated coding scheme for error control
NASA Technical Reports Server (NTRS)
Lin, S.
1985-01-01
A concatenated coding scheme for error control in data communications was analyzed. The inner code is used for both error correction and detection; however, the outer code is used only for error detection. A retransmission is requested if either the inner code decoder fails to make a successful decoding or the outer code decoder detects the presence of errors after the inner code decoding. The probability of undetected error of the proposed scheme is derived. An efficient method for computing this probability is presented. Throughput efficiency of the proposed error control scheme incorporated with a selective repeat ARQ retransmission strategy is analyzed.
Mechanisms of physiological and pathological cardiac hypertrophy.
Nakamura, Michinari; Sadoshima, Junichi
2018-04-19
Cardiomyocytes exit the cell cycle and become terminally differentiated soon after birth. Therefore, in the adult heart, instead of an increase in cardiomyocyte number, individual cardiomyocytes increase in size, and the heart develops hypertrophy to reduce ventricular wall stress and maintain function and efficiency in response to an increased workload. There are two types of hypertrophy: physiological and pathological. Hypertrophy initially develops as an adaptive response to physiological and pathological stimuli, but pathological hypertrophy generally progresses to heart failure. Each form of hypertrophy is regulated by distinct cellular signalling pathways. In the past decade, a growing number of studies have suggested that previously unrecognized mechanisms, including cellular metabolism, proliferation, non-coding RNAs, immune responses, translational regulation, and epigenetic modifications, positively or negatively regulate cardiac hypertrophy. In this Review, we summarize the underlying molecular mechanisms of physiological and pathological hypertrophy, with a particular emphasis on the role of metabolic remodelling in both forms of cardiac hypertrophy, and we discuss how the current knowledge on cardiac hypertrophy can be applied to develop novel therapeutic strategies to prevent or reverse pathological hypertrophy.
Energy and Environment Guide to Action - Chapter 4.3: Building Codes for Energy Efficiency
Provides guidance and recommendations for establishing, implementing, and evaluating state building codes for energy efficiency, which improve energy efficiency in new construction and major renovations. State success stories are included for reference.
Metabolic Free Energy and Biological Codes: A 'Data Rate Theorem' Aging Model.
Wallace, Rodrick
2015-06-01
A famous argument by Maturana and Varela (Autopoiesis and cognition. Reidel, Dordrecht, 1980) holds that the living state is cognitive at every scale and level of organization. Since it is possible to associate many cognitive processes with 'dual' information sources, pathologies can sometimes be addressed using statistical models based on the Shannon Coding, the Shannon-McMillan Source Coding, the Rate Distortion, and the Data Rate Theorems, which impose necessary conditions on information transmission and system control. Deterministic-but-for-error biological codes do not directly invoke cognition, but may be essential subcomponents within larger cognitive processes. A formal argument, however, places such codes within a similar framework, with metabolic free energy serving as a 'control signal' stabilizing biochemical code-and-translator dynamics in the presence of noise. Demand beyond available energy supply triggers punctuated destabilization of the coding channel, affecting essential biological functions. Aging, normal or prematurely driven by psychosocial or environmental stressors, must interfere with the routine operation of such mechanisms, initiating the chronic diseases associated with senescence. Amyloid fibril formation, intrinsically disordered protein logic gates, and cell surface glycan/lectin 'kelp bed' logic gates are reviewed from this perspective. The results generalize beyond coding machineries having easily recognizable symmetry modes, and strip a layer of mathematical complication from the study of phase transitions in nonequilibrium biological systems.
Posttranscriptional regulation of lipid metabolism by non-coding RNAs and RNA binding proteins.
Singh, Abhishek K; Aryal, Binod; Zhang, Xinbo; Fan, Yuhua; Price, Nathan L; Suárez, Yajaira; Fernández-Hernando, Carlos
2017-11-29
Alterations in lipoprotein metabolism enhance the risk of cardiometabolic disorders including type-2 diabetes and atherosclerosis, the leading cause of death in Western societies. While the transcriptional regulation of lipid metabolism has been well characterized, recent studies have uncovered the importance of microRNAs (miRNAs), long-non-coding RNAs (lncRNAs) and RNA binding proteins (RBP) in regulating the expression of lipid-related genes at the posttranscriptional level. Work from several groups has identified a number of miRNAs, including miR-33, miR-122 and miR-148a, that play a prominent role in controlling cholesterol homeostasis and lipoprotein metabolism. Importantly, dysregulation of miRNA expression has been associated with dyslipidemia, suggesting that manipulating the expression of these miRNAs could be a useful therapeutic approach to ameliorate cardiovascular disease (CVD). The role of lncRNAs in regulating lipid metabolism has recently emerged and several groups have demonstrated their regulation of lipoprotein metabolism. However, given the high abundance of lncRNAs and the poor-genetic conservation between species, much work will be needed to elucidate the specific role of lncRNAs in controlling lipoprotein metabolism. In this review article, we summarize recent findings in the field and highlight the specific contribution of lncRNAs and RBPs in regulating lipid metabolism. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Tsang, Sik-Ho; Chan, Yui-Lam; Siu, Wan-Chi
2017-01-01
Weighted prediction (WP) is an efficient video coding tool, introduced with the H.264/AVC video coding standard, for compensating temporal illumination changes in motion estimation and compensation. WP parameters, including a multiplicative weight and an additive offset for each reference frame, must be estimated and transmitted to the decoder in the slice header. These parameters add extra bits to the coded video bitstream. High efficiency video coding (HEVC) provides WP parameter prediction to reduce this overhead. Therefore, WP parameter prediction is crucial to research works and applications related to WP. Prior art has suggested further improving WP parameter prediction by implicit prediction of image characteristics and derivation of parameters. By exploiting both temporal and interlayer redundancies, we propose three WP parameter prediction algorithms, enhanced implicit WP parameter, enhanced direct WP parameter derivation, and interlayer WP parameter, to further improve the coding efficiency of HEVC. Results show that our proposed algorithms can achieve up to 5.83% and 5.23% bitrate reduction compared to the conventional scalable HEVC in the base layer for SNR scalability and 2× spatial scalability, respectively.
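For orientation, the effect of a single WP parameter pair can be sketched as below (illustrative fixed-point values, not the HEVC syntax or the proposed predictors): the motion-compensated reference block is scaled by the weight and shifted by the offset to track an illumination change.

```python
import numpy as np

def weighted_prediction(ref_block, weight, offset, log2_denom=6, bit_depth=8):
    """pred = clip(((weight * ref) >> log2_denom) + offset), a fixed-point sketch."""
    pred = (ref_block.astype(np.int32) * weight) >> log2_denom
    pred += offset
    return np.clip(pred, 0, (1 << bit_depth) - 1).astype(np.uint8)

ref = np.full((4, 4), 120, dtype=np.uint8)      # motion-compensated reference block
# Scene brightened by roughly 25%: weight 80/64 = 1.25 plus a small negative offset.
print(weighted_prediction(ref, weight=80, offset=-2))
```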
NASA Astrophysics Data System (ADS)
Kumar, Nitin; Singh, Udaybir; Kumar, Anil; Bhattacharya, Ranajoy; Singh, T. P.; Sinha, A. K.
2013-02-01
The design of a 120 GHz, 1 MW gyrotron for plasma fusion applications is presented in this paper. Mode selection is carried out with the aim of minimum mode competition, minimum cavity wall heating, etc. On the basis of the selected operating mode, the interaction cavity design and beam-wave interaction computation are carried out using a PIC code. The design of a triode-type Magnetron Injection Gun (MIG) is also presented. The trajectory code EGUN, the synthesis code MIGSYN and the data analysis code MIGANS are used in the MIG design. Further, the MIG design is also validated using another trajectory code, TRAK. The design results of the beam dumping system (collector) and the RF window are also presented. A depressed collector is designed to enhance the overall tube efficiency. The design study confirms >1 MW output power with a tube efficiency around 50% (including the collector efficiency).
Rosen, Lisa M.; Liu, Tao; Merchant, Roland C.
2016-01-01
BACKGROUND Blood and body fluid exposures are frequently evaluated in emergency departments (EDs). However, efficient and effective methods for estimating their incidence are not yet established. OBJECTIVE Evaluate the efficiency and accuracy of estimating statewide ED visits for blood or body fluid exposures using International Classification of Diseases, Ninth Revision (ICD-9), code searches. DESIGN Secondary analysis of a database of ED visits for blood or body fluid exposure. SETTING EDs of 11 civilian hospitals throughout Rhode Island from January 1, 1995, through June 30, 2001. PATIENTS Patients presenting to the ED for possible blood or body fluid exposure were included, as determined by prespecified ICD-9 codes. METHODS Positive predictive values (PPVs) were estimated to determine the ability of 10 ICD-9 codes to distinguish ED visits for blood or body fluid exposure from ED visits that were not for blood or body fluid exposure. Recursive partitioning was used to identify an optimal subset of ICD-9 codes for this purpose. Random-effects logistic regression modeling was used to examine variations in ICD-9 coding practices and styles across hospitals. Cluster analysis was used to assess whether the choice of ICD-9 codes was similar across hospitals. RESULTS The PPV for the original 10 ICD-9 codes was 74.4% (95% confidence interval [CI], 73.2%–75.7%), whereas the recursive partitioning analysis identified a subset of 5 ICD-9 codes with a PPV of 89.9% (95% CI, 88.9%–90.8%) and a misclassification rate of 10.1%. The ability, efficiency, and use of the ICD-9 codes to distinguish types of ED visits varied across hospitals. CONCLUSIONS Although an accurate subset of ICD-9 codes could be identified, variations across hospitals related to hospital coding style, efficiency, and accuracy greatly affected estimates of the number of ED visits for blood or body fluid exposure. PMID:22561713
Research on pre-processing of QR Code
NASA Astrophysics Data System (ADS)
Sun, Haixing; Xia, Haojie; Dong, Ning
2013-10-01
The QR code encodes many kinds of information because of its advantages: large storage capacity, high reliability, omnidirectional ultra-high-speed reading, small printing size, and highly efficient representation of Chinese characters. In order to obtain a clearer binarized image from a complex background and improve the recognition rate of QR codes, this paper investigates pre-processing methods for the QR code (Quick Response Code) and presents algorithms and results of image pre-processing for QR code recognition. The conventional method is improved by modifying Sauvola's adaptive binarization method. Additionally, a QR code extraction step that adapts to different image sizes and a flexible image correction approach are introduced, improving the efficiency and accuracy of QR code image processing.
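A minimal pre-processing sketch in the spirit of the above, assuming scikit-image is available (the window size and k value are illustrative, not the paper's tuned parameters): local Sauvola thresholding separates dark modules from an uneven background better than a single global threshold.

```python
import numpy as np
from skimage.filters import threshold_sauvola

def binarize_qr(gray_image, window_size=25, k=0.2):
    """gray_image: 2-D array -> boolean module map (True = dark module candidate)."""
    thresh = threshold_sauvola(gray_image, window_size=window_size, k=k)
    return gray_image < thresh

# Toy usage: a synthetic image with an uneven background gradient.
x = np.linspace(0, 50, 200)
background = np.tile(x, (200, 1))
image = background.copy()
image[60:140, 60:140] -= 40            # a dark square standing in for QR modules
print(binarize_qr(image).sum(), "dark pixels detected")
```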
NASA Astrophysics Data System (ADS)
Sanchez, Gustavo; Marcon, César; Agostini, Luciano Volcan
2018-01-01
The 3D-high efficiency video coding has introduced tools to obtain higher efficiency in 3-D video coding, and most of them are related to the depth maps coding. Among these tools, the depth modeling mode-1 (DMM-1) focuses on better encoding edges regions of depth maps. The large memory required for storing all wedgelet patterns is one of the bottlenecks in the DMM-1 hardware design of both encoder and decoder since many patterns must be stored. Three algorithms to reduce the DMM-1 memory requirements and a hardware design targeting the most efficient among these algorithms are presented. Experimental results demonstrate that the proposed solutions surpass related works reducing up to 78.8% of the wedgelet memory, without degrading the encoding efficiency. Synthesis results demonstrate that the proposed algorithm reduces almost 75% of the power dissipation when compared to the standard approach.
A Quantitative Study of Oxygen as a Metabolic Regulator
NASA Technical Reports Server (NTRS)
Radhakrishnan, Krishnan; LaManna, Joseph C.; Cabrera, Marco E.
1999-01-01
An acute reduction in oxygen (O2) delivery to a tissue is generally associated with a decrease in phosphocreatine, increases in ADP, NADH/NAD, and inorganic phosphate, increased rates of glycolysis and lactate production, and reduced rates of pyruvate and fatty acid oxidation. However, given the complexity of the human bioenergetic system and its components, it is difficult to determine quantitatively how cellular metabolic processes interact to maintain ATP homeostasis during stress (e.g., hypoxia, ischemia, and exercise). Of special interest is the determination of mechanisms relating tissue oxygenation to observed metabolic responses at the tissue, organ, and whole body levels and the quantification of how changes in tissue O2 availability affect the pathways of ATP synthesis and the metabolites that control these pathways. In this study, we extend a previously developed mathematical model of human bioenergetics to provide a physicochemical framework that permits quantitative understanding of O2 as a metabolic regulator. Specifically, the enhancement permits studying the effects of variations in tissue oxygenation and in parameters controlling the rate of cellular respiration on glycolysis, lactate production, and pyruvate oxidation. The whole body is described as a bioenergetic system consisting of metabolically distinct tissue/organ subsystems that exchange materials with the blood. In order to study the dynamic response of each subsystem to stimuli, we solve the ordinary differential equations describing the temporal evolution of metabolite levels, given the initial concentrations. The solver used in the present study is the packaged code LSODE, as implemented in the NASA Lewis kinetics and sensitivity analysis code, LSENS. A major advantage of LSENS is the efficient procedures supporting systematic sensitivity analysis, which provides the basic methods for studying parameter sensitivities (i.e., changes in model behavior due to parameter variation). Sensitivity analysis establishes relationships between model predictions and problem parameters (i.e., initial concentrations, rate coefficients, etc). It helps determine the effects of uncertainties or changes in these input parameters on the predictions, which ultimately are compared with experimental observations in order to validate the model. Sensitivity analysis can identify parameters that must be determined accurately because of their large effect on the model predictions and parameters that need not be known with great precision because they have little or no effect on the solution. This capability may prove to be important in optimizing the design of experiments, thereby reducing the use of animals. This approach can be applied to study the metabolic effects of reduced oxygen delivery to cardiac muscle due to local myocardial ischemia and the effects of acute hypoxia on brain metabolism. Other important applications of sensitivity analysis include identification of quantitatively relevant pathways and biochemical species within an overall mechanism, when examining the effects of a genetic anomaly or pathological state on energetic system components and whole system behavior.
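A toy sketch of the general workflow (stiff-ODE integration plus one-at-a-time parameter sensitivity), using SciPy's solve_ivp in place of LSODE/LSENS; the two-state pyruvate/lactate model and all parameter values are made up for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, vmax_ox, k_o2, o2, k_lac):
    """Toy model: pyruvate is either oxidised (O2-dependent) or reduced to lactate."""
    pyr, lac = y
    ox = vmax_ox * pyr * o2 / (k_o2 + o2)   # oxidation slows as O2 falls
    red = k_lac * pyr                       # lactate production
    return [1.0 - ox - red, red]            # constant pyruvate input of 1.0

def final_lactate(params, y0=(1.0, 0.0), t_end=50.0):
    sol = solve_ivp(rhs, (0.0, t_end), y0, args=params, method="LSODA")
    return sol.y[1, -1]

base = (1.0, 0.5, 0.2, 0.3)                 # (vmax_ox, k_o2, o2, k_lac)
ref = final_lactate(base)

# Crude sensitivity: relative change in final lactate per 1% change in each parameter
for i, name in enumerate(["vmax_ox", "k_o2", "o2", "k_lac"]):
    bumped = list(base)
    bumped[i] *= 1.01
    s = (final_lactate(tuple(bumped)) - ref) / (0.01 * ref)
    print(f"sensitivity of lactate to {name}: {s:+.2f}")
```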
Classification of breast tissue in mammograms using efficient coding.
Costa, Daniel D; Campos, Lúcio F; Barros, Allan K
2011-06-24
Female breast cancer is the major cause of cancer death in western countries. Efforts in computer vision have been made to improve diagnostic accuracy by radiologists. Some methods for lesion diagnosis in mammogram images have been developed based on principal component analysis, which has been used for efficient coding of signals, and on 2D Gabor wavelets, which are used in computer vision applications and in modeling biological vision. In this work, we present a methodology that uses efficient coding along with linear discriminant analysis to distinguish between mass and non-mass in 5,090 regions of interest from mammograms. The results show that the best success rates reached with Gabor wavelets and principal component analysis were 85.28% and 87.28%, respectively. In comparison, the efficient-coding model presented here reached up to 90.07%. Altogether, the results demonstrate that independent component analysis successfully performed the efficient coding needed to discriminate mass from non-mass tissues. In addition, we observed that LDA with ICA bases showed high predictive performance for some datasets and thus provides significant support for a more detailed clinical investigation.
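A hedged sketch of the efficient-coding-plus-LDA pipeline with scikit-learn; the ROI data here are random placeholders, and the number of ICA components and the validation protocol are not those of the paper.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 256))      # stand-in for vectorised mammogram ROIs
y = rng.integers(0, 2, size=500)         # mass / non-mass labels (random here)

# ICA features (efficient coding) followed by a linear discriminant classifier
model = make_pipeline(FastICA(n_components=40, random_state=0, max_iter=500),
                      LinearDiscriminantAnalysis())
scores = cross_val_score(model, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.3f}")
```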
High-efficiency reconciliation for continuous variable quantum key distribution
NASA Astrophysics Data System (ADS)
Bai, Zengliang; Yang, Shenshen; Li, Yongmin
2017-04-01
Quantum key distribution (QKD) is the most mature application of quantum information technology. Information reconciliation is a crucial step in QKD and significantly affects the final secret key rates shared between two legitimate parties. We analyze and compare various construction methods of low-density parity-check (LDPC) codes and design high-performance irregular LDPC codes with a block length of 10^6. Starting from these good codes and exploiting the slice reconciliation technique based on multilevel coding and multistage decoding, we realize high-efficiency Gaussian key reconciliation with efficiency higher than 95% for signal-to-noise ratios above 1. Our demonstrated method can be readily applied in continuous variable QKD.
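As a rough illustration of how such an efficiency figure is commonly quantified (the paper's multilevel slice reconciliation is more involved), one can relate the effective code rate to the Gaussian channel capacity; the 0.48 rate below is an assumed value, not taken from the paper.

```python
import math

def reconciliation_efficiency(code_rate, snr):
    """beta = R / C(SNR), with C = 0.5 * log2(1 + SNR) bits per channel use."""
    return code_rate / (0.5 * math.log2(1.0 + snr))

print(f"{reconciliation_efficiency(0.48, 1.0):.2f}")   # -> 0.96 at SNR = 1
```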
NASA Technical Reports Server (NTRS)
Chaderjian, Neal M.
1991-01-01
Computations from two Navier-Stokes codes, NSS and F3D, are presented for a tangent-ogive-cylinder body at high angle of attack. Features of this steady flow include a pair of primary vortices on the leeward side of the body as well as secondary vortices. The topological and physical plausibility of this vortical structure is discussed. The accuracy of these codes is assessed by comparison of the numerical solutions with experimental data. The effects of turbulence model, numerical dissipation, and grid refinement are presented. The overall efficiency of these codes is also assessed by examining their convergence rates, computational time per time step, and maximum allowable time step for time-accurate computations. Overall, the numerical results from both codes compared equally well with experimental data; however, the NSS code was found to be significantly more efficient than the F3D code.
Non-coding RNAs and Berberine: A new mechanism of its anti-diabetic activities.
Chang, Wenguang
2017-01-15
Type 2 Diabetes (T2D) is a metabolic disease with high mortality and morbidity. Non-coding RNAs, including small and long non-coding RNAs, are a novel class of functional RNA molecules that regulate multiple biological functions through diverse mechanisms. Studies in the last decade have demonstrated that non-coding RNAs may represent compelling therapeutic targets and play important roles in regulating the course of insulin resistance and T2D. Berberine, a plant-based alkaloid, has shown promise as an anti-hyperglycaemic, anti-hyperlipidaemic agent against T2D. Previous studies have primarily focused on a diverse array of efficacy end points of berberine in the pathogenesis of metabolic syndromes and inflammation or oxidative stress. Currently, an increasing number of studies have revealed the importance of non-coding RNAs as regulators of the anti-diabetic effects of berberine. The regulation of non-coding RNAs has been associated with several therapeutic actions of berberine in T2D progression. Thus, this review summarizes the anti-diabetic mechanisms of berberine by focusing on its role in regulating non-coding RNAs. Berberine exerts global anti-diabetic effects by targeting non-coding RNAs, and these effects involve several miRNAs, lncRNAs and multiple signalling pathways. This perspective may enhance the current understanding of berberine's anti-diabetic mechanisms of action and provide new pathological targets for the development of berberine-related drugs. Copyright © 2016 Elsevier B.V. All rights reserved.
Energy Efficiency Program Administrators and Building Energy Codes
This resource explores how energy efficiency program administrators have helped advance building energy codes at the federal, state, and local levels using technical, institutional, financial, and other resources, and discusses potential next steps.
Utilizing Spectrum Efficiently (USE)
2011-02-28
Report excerpt (Section 4.8): Space-Time Coded Asynchronous DS-CDMA with Decentralized MAI Suppression: Performance and Spectral Efficiency. In [60], the number of users that can be supported at a given signal-to-interference ratio in asynchronous direct-sequence code-division multiple-access (DS-CDMA) systems was examined.
Bar Coding and Tracking in Pathology.
Hanna, Matthew G; Pantanowitz, Liron
2016-03-01
Bar coding and specimen tracking are intricately linked to pathology workflow and efficiency. In the pathology laboratory, bar coding facilitates many laboratory practices, including specimen tracking, automation, and quality management. Data obtained from bar coding can be used to identify, locate, standardize, and audit specimens to achieve maximal laboratory efficiency and patient safety. Variables that need to be considered when implementing and maintaining a bar coding and tracking system include assets to be labeled, bar code symbologies, hardware, software, workflow, and laboratory and information technology infrastructure as well as interoperability with the laboratory information system. This article addresses these issues, primarily focusing on surgical pathology. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Kumar, Santosh; Chanderkanta; Amphawan, Angela
2016-04-01
Excess-3 code is one of the most important codes used for efficient data storage and transmission. It is a non-weighted code and is also known as a self-complementing code. In this paper, a four-bit optical Excess-3 to BCD code converter is proposed using the electro-optic effect inside lithium-niobate-based Mach-Zehnder interferometers (MZIs). The MZI structures have a powerful capability to switch an optical input signal to a desired output port. The paper presents a mathematical description of the proposed device followed by simulation using MATLAB. The study is verified using the beam propagation method (BPM).
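A small sketch of the logical function such a converter realizes, independent of the MZI implementation: an Excess-3 code word is the BCD digit plus three, so the conversion subtracts three.

```python
def excess3_to_bcd(bits):
    """Convert a 4-bit Excess-3 code word (list of 0/1, MSB first) to BCD."""
    value = int("".join(str(b) for b in bits), 2)
    if not 3 <= value <= 12:
        raise ValueError("not a valid Excess-3 code word")
    return [int(c) for c in format(value - 3, "04b")]

print(excess3_to_bcd([1, 0, 1, 1]))   # Excess-3 1011 (11) -> digit 8 -> [1, 0, 0, 0]
```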
NASA Technical Reports Server (NTRS)
Rice, R. F.
1974-01-01
End-to-end system considerations involving channel coding and data compression which could drastically improve the efficiency in communicating pictorial information from future planetary spacecraft are presented.
NASA Astrophysics Data System (ADS)
Chen, Gang; Yang, Bing; Zhang, Xiaoyun; Gao, Zhiyong
2017-07-01
The latest high efficiency video coding (HEVC) standard significantly increases encoding complexity in exchange for improved coding efficiency. Due to the limited computational capability of handheld devices, complexity-constrained video coding has drawn great attention in recent years. A complexity control algorithm based on adaptive mode selection is proposed for interframe coding in HEVC. Considering the direct proportionality between encoding time and computational complexity, the computational complexity is measured in terms of encoding time. First, complexity is mapped to a target in terms of prediction modes. Then, an adaptive mode selection algorithm is proposed for the mode decision process. Specifically, an optimal mode combination scheme chosen through offline statistics is applied at low complexity. If the complexity budget has not been used up, an adaptive mode sorting method is employed to further improve coding efficiency. The experimental results show that the proposed algorithm achieves a very large complexity control range (down to 10%) for the HEVC encoder while maintaining good rate-distortion performance. For the low-delay P condition, compared with the direct resource allocation method and the state-of-the-art method, average gains of 0.63 and 0.17 dB in BD-PSNR are observed for 18 sequences when the target complexity is around 40%.
Subcellular localization of rat CYP2E1 impacts metabolic efficiency toward common substrates.
Hartman, Jessica H; Martin, H Cass; Caro, Andres A; Pearce, Amy R; Miller, Grover P
2015-12-02
Cytochrome P450 2E1 (CYP2E1) detoxifies or bioactivates many low molecular-weight compounds. Most knowledge about CYP2E1 activity relies on studies of the enzyme localized to endoplasmic reticulum (erCYP2E1); however, CYP2E1 undergoes transport to mitochondria (mtCYP2E1) and becomes metabolically active. We report the first comparison of in vitro steady-state kinetic profiles for erCYP2E1 and mtCYP2E1 oxidation of probe substrate 4-nitrophenol and pollutants styrene and aniline using subcellular fractions from rat liver. For all substrates, metabolic efficiency changed with substrate concentration for erCYP2E1 reflected in non-hyperbolic kinetic profiles but not for mtCYP2E1. Hyperbolic kinetic profiles for the mitochondrial enzyme were consistent with Michaelis-Menten mechanism in which metabolic efficiency was constant. By contrast, erCYP2E1 metabolism of 4-nitrophenol led to a loss of enzyme efficiency at high substrate concentrations when substrate inhibited the reaction. Similarly, aniline metabolism by erCYP2E1 demonstrated negative cooperativity as metabolic efficiency decreased with increasing substrate concentration. The opposite was observed for erCYP2E1 oxidation of styrene; the sigmoidal kinetic profile indicated increased efficiency at higher substrate concentrations. These mechanisms and CYP2E1 levels in mitochondria and endoplasmic reticulum were used to estimate the impact of CYP2E1 subcellular localization on metabolic flux of pollutants. Those models showed that erCYP2E1 mainly carries out aniline metabolism at all aniline concentrations. Conversely, mtCYP2E1 dominates styrene oxidation at low styrene concentrations and erCYP2E1 at higher concentrations. Taken together, subcellular localization of CYP2E1 results in distinctly different enzyme activities that could impact overall metabolic clearance and/or activation of substrates and thus impact the interpretation and prediction of toxicological outcomes. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
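For reference, the three steady-state rate laws corresponding to the kinetic profiles named above, in their commonly used forms; the parameter values below are arbitrary illustrations, not fitted values from the study.

```python
import numpy as np

def michaelis_menten(s, vmax, km):
    return vmax * s / (km + s)                  # hyperbolic (mtCYP2E1-like profile)

def substrate_inhibition(s, vmax, km, ki):
    return vmax * s / (km + s + s**2 / ki)      # rate falls again at high [S]

def hill(s, vmax, k50, n):
    return vmax * s**n / (k50**n + s**n)        # sigmoidal when n > 1

for s in np.logspace(-1, 2, 4):                 # arbitrary substrate concentrations
    print(f"[S]={s:7.2f}  MM={michaelis_menten(s, 1, 10):.3f}  "
          f"SI={substrate_inhibition(s, 1, 10, 20):.3f}  "
          f"Hill={hill(s, 1, 10, 2):.3f}")
```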
Residential Building Energy Code Field Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Bartlett, M. Halverson, V. Mendon, J. Hathaway, Y. Xie
This document presents a methodology for assessing baseline energy efficiency in new single-family residential buildings and quantifying related savings potential. The approach was developed by Pacific Northwest National Laboratory (PNNL) for the U.S. Department of Energy (DOE) Building Energy Codes Program with the objective of assisting states as they assess energy efficiency in residential buildings and the implementation of their building energy codes, as well as to target areas for improvement through energy codes and broader energy-efficiency programs. It is also intended to facilitate a consistent and replicable approach to research studies of this type and to establish a transparent data set to represent baseline construction practices across U.S. states.
Schnabel, M; Mann, D; Efe, T; Schrappe, M; V Garrel, T; Gotzen, L; Schaeg, M
2004-10-01
The introduction of the German Diagnosis Related Groups (D-DRG) system requires redesigning administrative patient management strategies. Incorrect coding leads to inaccurate grouping and endangers the reimbursement of treatment costs. This situation emphasizes the roles of documentation and coding as factors of economic success. The aims of this study were to assess the quantity and quality of initial documentation and coding (ICD-10 and OPS-301) and to find operative strategies to improve efficiency and strategic means to ensure optimal documentation and coding quality. In a prospective study, documentation and coding quality were evaluated in a standardized way by weekly assessment. Clinical data from 1385 inpatients were processed for initial correctness and quality of documentation and coding. Principal diagnoses were found to be accurate in 82.7% of cases, inexact in 7.1%, and wrong in 10.1%. Effects on financial returns occurred in 16% of cases. Based on these findings, an optimized, interdisciplinary, and multiprofessional workflow for medical documentation, coding, and data control was developed. A workflow incorporating regular assessment of documentation and coding quality is required by the DRG system to ensure efficient accounting of hospital services. Interdisciplinary and multiprofessional cooperation is recognized to be an important factor in establishing an efficient workflow in medical documentation and coding.
NASA Technical Reports Server (NTRS)
Lin, Shu (Principal Investigator); Uehara, Gregory T.; Nakamura, Eric; Chu, Cecilia W. P.
1996-01-01
The (64, 40, 8) subcode of the third-order Reed-Muller (RM) code for high-speed satellite communications is proposed. The RM subcode can be used either alone or as an inner code of a concatenated coding system with the NASA standard (255, 223, 33) Reed-Solomon (RS) code as the outer code to achieve high performance (or low bit-error rate) with reduced decoding complexity. It can also be used as a component code in a multilevel bandwidth-efficient coded modulation system to achieve reliable bandwidth-efficient data transmission. The progress made toward achieving the goal of implementing a decoder system based upon this code is summarized. The development of the integrated circuit prototype sub-trellis IC, particularly focusing on the design methodology, is addressed.
Chamala, Srikar; Beckstead, Wesley A; Rowe, Mark J; McClellan, David A
2007-01-01
We investigated whether the effect of evolutionary selection on three recent Single Nucleotide Polymorphisms (SNPs) in the mitochondrial sub-haplogroups of Pima Indians is consistent with their effects on metabolic efficiency. The mitochondrial SNPs impact metabolic rate and respiratory quotient, and may be adaptations to caloric restriction in a desert habitat. Using TreeSAAP software, we examined evolutionary selection in 107 mammalian species at these SNPs, characterising the biochemical shifts produced by the amino acid substitutions. Our results suggest that two SNPs were affected by selection during mammalian evolution in a manner consistent with their effects on metabolic efficiency in Pima Indians.
Schwab, Stefan; Ramos, Humberto J; Souza, Emanuel M; Pedrosa, Fábio O; Yates, Marshall G; Chubatsu, Leda S; Rigo, Liu U
2007-05-01
Random mutagenesis using transposons with promoterless reporter genes has been widely used to examine differential gene expression patterns in bacteria. Using this approach, we have identified 26 genes of the endophytic nitrogen-fixing bacterium Herbaspirillum seropedicae regulated in response to the ammonium content of the growth medium. These include nine genes involved in the transport of nitrogen compounds, such as the high-affinity ammonium transporter AmtB and uptake systems for alternative nitrogen sources; nine genes coding for proteins responsible for restoring intracellular ammonium levels through enzymatic reactions, such as nitrogenase, amidase, and arginase; and a third group of metabolic switch genes, coding for sensor kinases or transcription regulation factors, whose role in metabolism was previously unknown. In addition, four of the identified genes were of unknown function. This paper describes their involvement in the response to ammonium limitation. The results provide a preliminary profile of the metabolic response of Herbaspirillum seropedicae to ammonium stress.
NASA Astrophysics Data System (ADS)
da Silva, Thaísa Leal; Agostini, Luciano Volcan; da Silva Cruz, Luis A.
2014-05-01
Intra prediction is a very important tool in current video coding standards. High-efficiency video coding (HEVC) intra prediction delivers significant gains in encoding efficiency compared to previous standards, but with a substantial increase in computational complexity, since 33 directional angular modes must be evaluated. Motivated by this high complexity, this article presents a complexity reduction algorithm developed to reduce the HEVC intra mode decision complexity, targeting multiview videos. The proposed algorithm provides an efficient fast intra prediction compliant with single-view and multiview video encoding. This fast solution defines a reduced subset of intra directions according to the video texture and exploits the relationship between prediction units (PUs) at neighboring depth levels of the coding tree. This fast intra coding procedure is used to develop an inter-view prediction method, which exploits the relationship between the intra mode directions of adjacent views to further accelerate the intra prediction process in multiview video encoding applications. When compared to HEVC simulcast, our method achieves a complexity reduction of up to 47.77%, at the cost of an average BD-PSNR loss of 0.08 dB.
Evaluation of the efficiency and fault density of software generated by code generators
NASA Technical Reports Server (NTRS)
Schreur, Barbara
1993-01-01
Flight computers and flight software are used for GN&C (guidance, navigation, and control), engine controllers, and avionics during missions. The software development requires the generation of a considerable amount of code. The engineers who generate the code make mistakes, and generating a large body of code with high reliability requires considerable time. Computer-aided software engineering (CASE) tools are available that generate code automatically from inputs supplied through graphical interfaces. These tools are referred to as code generators. In theory, code generators could write highly reliable code quickly and inexpensively. The various code generators offer different levels of reliability checking. Some check only the finished product, while others allow checking of individual modules and combined sets of modules as well. Considering NASA's requirement for reliability, in-house, manually generated code is needed for comparison. Furthermore, automatically generated code is reputed to be as efficient as the best manually generated code when executed. In-house verification is warranted.
A stimulus-dependent spike threshold is an optimal neural coder
Jones, Douglas L.; Johnson, Erik C.; Ratnam, Rama
2015-01-01
A neural code based on sequences of spikes can consume a significant portion of the brain's energy budget. Thus, energy considerations would dictate that spiking activity be kept as low as possible. However, a high spike-rate improves the coding and representation of signals in spike trains, particularly in sensory systems. These are competing demands, and selective pressure has presumably worked to optimize coding by apportioning a minimum number of spikes so as to maximize coding fidelity. The mechanisms by which a neuron generates spikes while maintaining a fidelity criterion are not known. Here, we show that a signal-dependent neural threshold, similar to a dynamic or adapting threshold, optimizes the trade-off between spike generation (encoding) and fidelity (decoding). The threshold mimics a post-synaptic membrane (a low-pass filter) and serves as an internal decoder. Further, it sets the average firing rate (the energy constraint). The decoding process provides an internal copy of the coding error to the spike-generator which emits a spike when the error equals or exceeds a spike threshold. When optimized, the trade-off leads to a deterministic spike firing-rule that generates optimally timed spikes so as to maximize fidelity. The optimal coder is derived in closed-form in the limit of high spike-rates, when the signal can be approximated as a piece-wise constant signal. The predicted spike-times are close to those obtained experimentally in the primary electrosensory afferent neurons of weakly electric fish (Apteronotus leptorhynchus) and pyramidal neurons from the somatosensory cortex of the rat. We suggest that KCNQ/Kv7 channels (underlying the M-current) are good candidates for the decoder. They are widely coupled to metabolic processes and do not inactivate. We conclude that the neural threshold is optimized to generate an energy-efficient and high-fidelity neural code. PMID:26082710
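A minimal sketch of a signal-dependent-threshold encoder of this kind, with an exponentially decaying (low-pass) internal decoder; the time constant and threshold are illustrative choices, not values fitted to the electrosensory or cortical data.

```python
import numpy as np

def adaptive_threshold_encode(signal, dt=1e-3, tau=0.02, theta=0.05):
    """Emit a spike whenever the coding error (signal minus a low-pass
    reconstruction of the spike train) reaches the threshold theta."""
    decay = np.exp(-dt / tau)
    recon = 0.0
    spike_times, recon_trace = [], np.zeros_like(signal)
    for i, s in enumerate(signal):
        recon *= decay                    # leaky, post-synaptic-like decoder
        if s - recon >= theta:            # error has reached the spike threshold
            recon += theta                # each spike adds a fixed quantum
            spike_times.append(i * dt)
        recon_trace[i] = recon
    return np.array(spike_times), recon_trace

# toy usage: encode a slow positive sinusoid
t = np.arange(0.0, 1.0, 1e-3)
x = 0.5 * (1.0 + np.sin(2 * np.pi * 2.0 * t))
spikes, recon = adaptive_threshold_encode(x)
print(f"{len(spikes)} spikes, mean reconstruction error = {np.mean(np.abs(x - recon)):.3f}")
```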
Enzyme clustering accelerates processing of intermediates through metabolic channeling
Castellana, Michele; Wilson, Maxwell Z.; Xu, Yifan; Joshi, Preeti; Cristea, Ileana M.; Rabinowitz, Joshua D.; Gitai, Zemer; Wingreen, Ned S.
2015-01-01
We present a quantitative model to demonstrate that coclustering multiple enzymes into compact agglomerates accelerates the processing of intermediates, yielding the same efficiency benefits as direct channeling, a well-known mechanism in which enzymes are funneled between enzyme active sites through a physical tunnel. The model predicts the separation and size of coclusters that maximize metabolic efficiency, and this prediction is in agreement with previously reported spacings between coclusters in mammalian cells. For direct validation, we study a metabolic branch point in Escherichia coli and experimentally confirm the model prediction that enzyme agglomerates can accelerate the processing of a shared intermediate by one branch, and thus regulate steady-state flux division. Our studies establish a quantitative framework to understand coclustering-mediated metabolic channeling and its application to both efficiency improvement and metabolic regulation. PMID:25262299
Vashisht, Rohit; Bhat, Ashwini G; Kushwaha, Shreeram; Bhardwaj, Anshu; Brahmachari, Samir K
2014-10-11
The effectiveness of current therapeutic regimens for Mycobacterium tuberculosis (Mtb) is diminished by the need for prolonged therapy and the rise of drug-resistant/tolerant strains. This global health threat persists, despite decades of basic research and a wealth of legacy knowledge, because of a lack of systems-level understanding that could transform the discovery of fast-acting, high-efficacy drugs. The enhanced functional annotations of the Mtb genome, previously obtained through a crowd-sourcing approach, were used to reconstruct the metabolic network of Mtb in a bottom-up manner. We represent this information by developing a novel Systems Biology Spindle Map of Metabolism (SBSM) and comprehend its static and dynamic structure using various computational approaches based on simulation and design. The reconstructed metabolism of Mtb encompasses 961 metabolites, involved in 1152 reactions catalyzed by 890 protein-coding genes, organized into 50 pathways. Through static and dynamic analysis of the SBSM in Mtb, we identified various critical proteins required for the growth and survival of the bacterium. We assessed the potential of these proteins as putative drug targets that are fast acting and less toxic. Further, we formulate a novel concept of metabolic persister genes (MPGs) and compare our predictions with published in vitro and in vivo experimental evidence. Through such analyses, we report for the first time that de novo biosynthesis of NAD may give rise to bacterial persistence in Mtb under conditions of metabolic stress induced by conventional anti-tuberculosis therapy. We propose such MPGs as potential combination drug targets for existing antibiotics, which could improve their efficacy and efficiency against drug-tolerant bacteria. The systems-level framework we formulated to identify potential non-toxic drug targets and strategies to circumvent bacterial persistence can substantially aid TB drug discovery and translational research.
Efficient Prediction Structures for H.264 Multi View Coding Using Temporal Scalability
NASA Astrophysics Data System (ADS)
Guruvareddiar, Palanivel; Joseph, Biju K.
2014-03-01
Prediction structures with "disposable view components based" hierarchical coding have been proven to be efficient for H.264 multiview coding. Though these prediction structures along with the QP cascading schemes provide superior compression efficiency when compared to the traditional IBBP coding scheme, the temporal scalability requirements of the bit stream could not be met to the fullest. On the other hand, a fully scalable bit stream, obtained by "temporal identifier based" hierarchical coding, provides a number of advantages including bit rate adaptation and improved error resilience, but lacks compression efficiency when compared to the former scheme. In this paper, it is proposed to combine the two approaches such that a fully scalable bit stream can be realized with minimal reduction in compression efficiency when compared to state-of-the-art "disposable view components based" hierarchical coding. Simulation results show that the proposed method enables full temporal scalability with a maximum BD-PSNR reduction of only 0.34 dB. A novel method is also proposed for identifying the temporal identifier of the legacy H.264/AVC base layer packets. Simulation results also show that this enables the enhancement views to be extracted at a lower frame rate (one-half or one-quarter of the base view rate) with an average extraction time per view component of only 0.38 ms.
An efficient graph theory based method to identify every minimal reaction set in a metabolic network
2014-01-01
Background Development of cells with minimal metabolic functionality is gaining importance due to their efficiency in producing chemicals and fuels. Existing computational methods to identify minimal reaction sets in metabolic networks are computationally expensive. Further, they identify only one of the several possible minimal reaction sets. Results In this paper, we propose an efficient graph theory based recursive optimization approach to identify all minimal reaction sets. Graph theoretical insights offer systematic methods to not only reduce the number of variables in math programming and increase its computational efficiency, but also provide efficient ways to find multiple optimal solutions. The efficacy of the proposed approach is demonstrated using case studies from Escherichia coli and Saccharomyces cerevisiae. In case study 1, the proposed method identified three minimal reaction sets, each containing 38 reactions, in the Escherichia coli central metabolic network of 77 reactions. Analysis of these three minimal reaction sets revealed that one of them is more suitable for developing a minimal-metabolism cell than the other two because of its practically achievable internal flux distribution. In case study 2, the proposed method identified 256 minimal reaction sets from the Saccharomyces cerevisiae genome-scale metabolic network with 620 reactions. The proposed method required only 4.5 hours to identify all 256 minimal reaction sets and showed a significant reduction (approximately 80%) in solution time when compared to existing methods for finding minimal reaction sets. Conclusions Identification of all minimal reaction sets in metabolic networks is essential since different minimal reaction sets have different properties that affect bioprocess development. The proposed method correctly identified all minimal reaction sets in both case studies. It is computationally efficient compared to other methods for finding minimal reaction sets and is practical to employ with genome-scale metabolic networks. PMID:24594118
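A toy illustration of the underlying idea, enumerating every minimal reaction subset that still produces a target metabolite by brute-force reachability over a four-reaction network; the paper's graph-theoretic recursive optimization is far more scalable, and the network below is invented purely for illustration.

```python
from itertools import combinations

# Invented toy network: each reaction turns a set of substrates into products.
reactions = {
    "R1": ({"glc"}, {"g6p"}),
    "R2": ({"g6p"}, {"pyr"}),
    "R3": ({"glc"}, {"pyr"}),        # bypass route
    "R4": ({"pyr"}, {"biomass"}),
}

def producible(active, seed=frozenset({"glc"})):
    """Metabolites reachable from the seed using only the active reactions."""
    pool, changed = set(seed), True
    while changed:
        changed = False
        for r in active:
            subs, prods = reactions[r]
            if subs <= pool and not prods <= pool:
                pool |= prods
                changed = True
    return pool

def minimal_reaction_sets(target="biomass"):
    names = list(reactions)
    feasible = [set(c) for k in range(1, len(names) + 1)
                for c in combinations(names, k)
                if target in producible(c)]
    return [s for s in feasible if not any(o < s for o in feasible)]

print(minimal_reaction_sets())   # [{'R3', 'R4'}, {'R1', 'R2', 'R4'}] (order may vary)
```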
Coding efficiency of AVS 2.0 for CBAC and CABAC engines
NASA Astrophysics Data System (ADS)
Cui, Jing; Choi, Youngkyu; Chae, Soo-Ik
2015-12-01
In this paper, we compare the coding efficiency of the Context-based Binary Arithmetic Coding (CBAC)[2] engine used in AVS 2.0[1] and the Context-Adaptive Binary Arithmetic Coding (CABAC)[3] engine used in HEVC[4]. For a fair comparison, the CABAC is embedded in the AVS 2.0 reference code RD10.1, complementing our previous work[5] in which the CBAC was embedded in the HEVC. In the RD code, the rate estimation table is employed only for RDOQ. To reduce the computational complexity of the video encoder, we therefore modified the RD code so that the rate estimation table is employed for all RDO decisions. Furthermore, we simplify the rate estimation table by reducing the bit depth of its fractional part from 8 to 2. The simulation results show that the CABAC has a BD-rate loss of about 0.7% compared to the CBAC, suggesting that the CBAC is slightly more efficient than the CABAC in AVS 2.0.
Efficient Network Coding-Based Loss Recovery for Reliable Multicast in Wireless Networks
NASA Astrophysics Data System (ADS)
Chi, Kaikai; Jiang, Xiaohong; Ye, Baoliu; Horiguchi, Susumu
Recently, network coding has been applied to the loss recovery of reliable multicast in wireless networks [19], where multiple lost packets are XOR-ed together as one packet and forwarded via single retransmission, resulting in a significant reduction of bandwidth consumption. In this paper, we first prove that maximizing the number of lost packets for XOR-ing, which is the key part of the available network coding-based reliable multicast schemes, is actually a complex NP-complete problem. To address this limitation, we then propose an efficient heuristic algorithm for finding an approximately optimal solution of this optimization problem. Furthermore, we show that the packet coding principle of maximizing the number of lost packets for XOR-ing sometimes cannot fully exploit the potential coding opportunities, and we then further propose new heuristic-based schemes with a new coding principle. Simulation results demonstrate that the heuristic-based schemes have very low computational complexity and can achieve almost the same transmission efficiency as the current coding-based high-complexity schemes. Furthermore, the heuristic-based schemes with the new coding principle not only have very low complexity, but also slightly outperform the current high-complexity ones.
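A hedged sketch of the kind of greedy heuristic described above: pack as many lost packets as possible into one XOR retransmission, subject to the decodability constraint that no receiver is missing more than one packet in the batch. The tie-breaking rule and data layout are assumptions, not the paper's algorithm.

```python
def greedy_xor_batch(loss):
    """Greedy heuristic sketch: select lost packets for a single XOR
    retransmission such that no receiver is missing more than one packet
    in the batch (otherwise that receiver could not decode it).
    `loss[r]` is the set of packet ids that receiver r failed to get."""
    # try packets lost by the most receivers first, so each retransmission helps most
    packets = sorted({p for lost in loss.values() for p in lost},
                     key=lambda p: -sum(p in lost for lost in loss.values()))
    batch = []
    for p in packets:
        if all(len(lost & set(batch + [p])) <= 1 for lost in loss.values()):
            batch.append(p)
    return batch

# toy example: three receivers, four packets
loss = {"A": {1, 3}, "B": {2}, "C": {3, 4}}
print(greedy_xor_batch(loss))   # -> [3, 2]; packets 1 and 4 conflict with 3 at A and C
```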
Stein, Colleen S; Jadiya, Pooja; Zhang, Xiaoming; McLendon, Jared M; Abouassaly, Gabrielle M; Witmer, Nathan H; Anderson, Ethan J; Elrod, John W; Boudreau, Ryan L
2018-06-26
Mitochondria are composed of many small proteins that control protein synthesis, complex assembly, metabolism, and ion and reactive oxygen species (ROS) handling. We show that a skeletal muscle- and heart-enriched long non-coding RNA, LINC00116, encodes a highly conserved 56-amino-acid microprotein that we named mitoregulin (Mtln). Mtln localizes to the inner mitochondrial membrane, where it binds cardiolipin and influences protein complex assembly. In cultured cells, Mtln overexpression increases mitochondrial membrane potential, respiration rates, and Ca 2+ retention capacity while decreasing mitochondrial ROS and matrix-free Ca 2+ . Mtln-knockout mice display perturbations in mitochondrial respiratory (super)complex formation and activity, fatty acid oxidation, tricarboxylic acid (TCA) cycle enzymes, and Ca 2+ retention capacity. Blue-native gel electrophoresis revealed that Mtln co-migrates alongside several complexes, including the complex I assembly module, complex V, and supercomplexes. Under denaturing conditions, Mtln remains in high-molecular-weight complexes, supporting its role as a sticky molecular tether that enhances respiratory efficiency by bolstering protein complex assembly and/or stability. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
Oh, Eun Joong; Skerker, Jeffrey M; Kim, Soo Rin; Wei, Na; Turner, Timothy L; Maurer, Matthew J; Arkin, Adam P; Jin, Yong-Su
2016-06-15
Efficient microbial utilization of cellulosic sugars is essential for the economic production of biofuels and chemicals. Although the yeast Saccharomyces cerevisiae is a robust microbial platform widely used in ethanol plants using sugar cane and corn starch in large-scale operations, glucose repression is one of the significant barriers to the efficient fermentation of cellulosic sugar mixtures. A recent study demonstrated that intracellular utilization of cellobiose by engineered yeast expressing a cellobiose transporter (encoded by cdt-1) and an intracellular β-glucosidase (encoded by gh1-1) can alleviate glucose repression, resulting in the simultaneous cofermentation of cellobiose and nonglucose sugars. Here we report enhanced cellobiose fermentation by engineered yeast expressing cdt-1 and gh1-1 through laboratory evolution. When cdt-1 and gh1-1 were integrated into the genome of yeast, the single copy integrant showed a low cellobiose consumption rate. However, cellobiose fermentation rates by engineered yeast increased gradually during serial subcultures on cellobiose. Finally, an evolved strain exhibited a 15-fold-higher cellobiose fermentation rate. To identify the responsible mutations in the evolved strain, genome sequencing was performed. Interestingly, no mutations affecting cellobiose fermentation were identified, but the evolved strain contained 9 copies of cdt-1 and 23 copies of gh1-1. We also traced the copy numbers of cdt-1 and gh1-1 of mixed populations during the serial subcultures. The copy numbers of cdt-1 and gh1-1 in the cultures increased gradually with similar ratios as cellobiose fermentation rates of the cultures increased. These results suggest that the cellobiose assimilation pathway (transport and hydrolysis) might be a rate-limiting step in engineered yeast and copies of genes coding for metabolic enzymes might be amplified in yeast if there is a growth advantage. This study indicates that on-demand gene amplification might be an efficient strategy for yeast metabolic engineering. In order to enable rapid and efficient fermentation of cellulosic hydrolysates by engineered yeast, we delve into the limiting factors of cellobiose fermentation by engineered yeast expressing a cellobiose transporter (encoded by cdt-1) and an intracellular β-glucosidase (encoded by gh1-1). Through laboratory evolution, we isolated mutant strains capable of fermenting cellobiose much faster than a parental strain. Genome sequencing of the fast cellobiose-fermenting mutant reveals that there are massive amplifications of cdt-1 and gh1-1 in the yeast genome. We also found positive and quantitative relationships between the rates of cellobiose consumption and the copy numbers of cdt-1 and gh1-1 in the evolved strains. Our results suggest that the cellobiose assimilation pathway (transport and hydrolysis) might be a rate-limiting step for efficient cellobiose fermentation. We demonstrate the feasibility of optimizing not only heterologous metabolic pathways in yeast through laboratory evolution but also on-demand gene amplification in yeast, which can be broadly applicable for metabolic engineering. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
NASA Technical Reports Server (NTRS)
Bidwell, Colin S.; Papadakis, Michael
2005-01-01
Collection efficiency and ice accretion calculations have been made for a series of business jet horizontal tail configurations using a three-dimensional panel code, an adaptive grid code, and the NASA Glenn LEWICE3D grid based ice accretion code. The horizontal tail models included two full scale wing tips and a 25 percent scale model. Flow solutions for the horizontal tails were generated using the PMARC panel code. Grids used in the ice accretion calculations were generated using the adaptive grid code ICEGRID. The LEWICE3D grid based ice accretion program was used to calculate impingement efficiency and ice shapes. Ice shapes typifying rime and mixed icing conditions were generated for a 30 minute hold condition. All calculations were performed on an SGI Octane computer. The results have been compared to experimental flow and impingement data. In general, the calculated flow and collection efficiencies compared well with experiment, and the ice shapes appeared representative of the rime and mixed icing conditions for which they were calculated.
Sparse/DCT (S/DCT) two-layered representation of prediction residuals for video coding.
Kang, Je-Won; Gabbouj, Moncef; Kuo, C-C Jay
2013-07-01
In this paper, we propose a cascaded sparse/DCT (S/DCT) two-layer representation of prediction residuals, and implement this idea on top of the state-of-the-art high efficiency video coding (HEVC) standard. First, a dictionary is adaptively trained to contain featured patterns of residual signals so that a high portion of energy in a structured residual can be efficiently coded via sparse coding. It is observed that the sparse representation alone is less effective in the R-D performance due to the side information overhead at higher bit rates. To overcome this problem, the DCT representation is cascaded at the second stage. It is applied to the remaining signal to improve coding efficiency. The two representations successfully complement each other. It is demonstrated by experimental results that the proposed algorithm outperforms the HEVC reference codec HM5.0 in the Common Test Condition.
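A compact sketch of the two-layer idea: greedily code a residual block with a few dictionary atoms (matching pursuit), then transform whatever energy remains with a DCT. The random dictionary and atom count are placeholders; the paper trains its dictionary on residual patterns and integrates the scheme into the HEVC reference encoder.

```python
import numpy as np
from scipy.fft import dctn

def sparse_then_dct(residual, dictionary, n_atoms=4):
    """First layer: matching pursuit over unit-norm dictionary rows.
       Second layer: orthonormal DCT of the leftover signal."""
    x = residual.flatten().astype(float)
    atoms = []
    for _ in range(n_atoms):
        scores = dictionary @ x                 # correlation with each atom
        k = int(np.argmax(np.abs(scores)))
        atoms.append((k, scores[k]))
        x = x - scores[k] * dictionary[k]       # peel off the selected atom
    dct_layer = dctn(x.reshape(residual.shape), norm="ortho")
    return atoms, dct_layer

# toy usage with a random unit-norm dictionary for an 8x8 residual block
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 64))
D /= np.linalg.norm(D, axis=1, keepdims=True)
block = rng.standard_normal((8, 8))
sparse_layer, dct_layer = sparse_then_dct(block, D)
```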
An efficient HZETRN (a galactic cosmic ray transport code)
NASA Technical Reports Server (NTRS)
Shinn, Judy L.; Wilson, John W.
1992-01-01
An accurate and efficient engineering code for analyzing the shielding requirements against the high-energy galactic heavy ions is needed. The HZETRN is a deterministic code developed at Langley Research Center that is constantly under improvement both in physics and numerical computation and is targeted for such use. One problem area connected with the space-marching technique used in this code is the propagation of the local truncation error. By improving the numerical algorithms for interpolation, integration, and the grid distribution formula, the efficiency of the code is increased by a factor of eight as the number of energy grid points is reduced. A numerical accuracy of better than 2 percent for a shield thickness of 150 g/cm^2 is found when a 45-point energy grid is used. The propagating step size, which is related to the perturbation theory, is also reevaluated.
A concatenated coding scheme for error control
NASA Technical Reports Server (NTRS)
Kasami, T.; Fujiwara, T.; Lin, S.
1986-01-01
In this paper, a concatenated coding scheme for error control in data communications is presented and analyzed. In this scheme, the inner code is used for both error correction and detection; however, the outer code is used only for error detection. A retransmission is requested if either the inner code decoder fails to make a successful decoding or the outer code decoder detects the presence of errors after the inner code decoding. Probability of undetected error (or decoding error) of the proposed scheme is derived. An efficient method for computing this probability is presented. Throughput efficiency of the proposed error control scheme incorporated with a selective-repeat ARQ retransmission strategy is also analyzed. Three specific examples are presented. One of the examples is proposed for error control in the NASA Telecommand System.
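A simple worked expression for the throughput of such a scheme under selective-repeat ARQ, assuming independent block errors and an error-free feedback channel; the rates and retransmission probability below are illustrative, and the paper's analysis of undetected-error probability is considerably more detailed.

```python
def sr_arq_throughput(inner_rate, outer_rate, p_retx):
    """Throughput efficiency = (overall code rate) * (fraction of blocks accepted),
    where p_retx is the probability that a block is retransmitted because the
    inner decoder fails or the outer code detects residual errors."""
    return inner_rate * outer_rate * (1.0 - p_retx)

# e.g. a rate-1/2 inner code, a rate-0.95 outer detection code, 2% retransmissions
print(f"{sr_arq_throughput(0.5, 0.95, 0.02):.4f}")   # -> 0.4655
```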
NASA Technical Reports Server (NTRS)
Bonhaus, Daryl L.; Wornom, Stephen F.
1991-01-01
Two codes which solve the 3-D Thin Layer Navier-Stokes (TLNS) equations are used to compute the steady state flow for two test cases representing typical finite wings at transonic conditions. Several grids of C-O topology and varying point densities are used to determine the effects of grid refinement. After a description of each code and test case, standards for determining code efficiency and accuracy are defined and applied to determine the relative performance of the two codes in predicting turbulent transonic wing flows. Comparisons of computed surface pressure distributions with experimental data are made.
Synaptic E-I Balance Underlies Efficient Neural Coding.
Zhou, Shanglin; Yu, Yuguo
2018-01-01
Both theoretical and experimental evidence indicate that synaptic excitation and inhibition in the cerebral cortex are well-balanced during the resting state and sensory processing. Here, we briefly summarize the evidence for how neural circuits are adjusted to achieve this balance. Then, we discuss how such excitatory and inhibitory balance shapes stimulus representation and information propagation, two basic functions of neural coding. We also point out the benefit of adopting such a balance during neural coding. We conclude that excitatory and inhibitory balance may be a fundamental mechanism underlying efficient coding.
Fiedler, Jan; Baker, Andrew H; Dimmeler, Stefanie; Heymans, Stephane; Mayr, Manuel; Thum, Thomas
2018-05-23
Non-coding RNAs are increasingly recognized not only as regulators of various biological functions but also as targets for a new generation of RNA therapeutics and biomarkers. We hereby review recent insights relating to non-coding RNAs including microRNAs (e.g. miR-126, miR-146a), long non-coding RNAs (e.g. MIR503HG, GATA6-AS, SMILR) and circular RNAs (e.g. cZNF292) and their role in vascular diseases. This includes identification and therapeutic use of hypoxia-regulated non-coding RNAs and endogenous non-coding RNAs that regulate intrinsic smooth muscle cell signalling, age-related non-coding RNAs and non-coding RNAs involved in the regulation of mitochondrial biology and metabolic control. Finally, we discuss non-coding RNA species with biomarker potential.
High rate concatenated coding systems using bandwidth efficient trellis inner codes
NASA Technical Reports Server (NTRS)
Deng, Robert H.; Costello, Daniel J., Jr.
1989-01-01
High-rate concatenated coding systems with bandwidth-efficient trellis inner codes and Reed-Solomon (RS) outer codes are investigated for application in high-speed satellite communication systems. Two concatenated coding schemes are proposed. In one the inner code is decoded with soft-decision Viterbi decoding, and the outer RS code performs error-correction-only decoding (decoding without side information). In the other, the inner code is decoded with a modified Viterbi algorithm, which produces reliability information along with the decoded output. In this algorithm, path metrics are used to estimate the entire information sequence, whereas branch metrics are used to provide reliability information on the decoded sequence. This information is used to erase unreliable bits in the decoded output. An errors-and-erasures RS decoder is then used for the outer code. The two schemes have been proposed for high-speed data communication on NASA satellite channels. The rates considered are at least double those used in current NASA systems, and the results indicate that high system reliability can still be achieved.
Sajjad, Muhammad; Mehmood, Irfan; Baik, Sung Wook
2017-01-01
Medical image collections contain a wealth of information which can assist radiologists and medical experts in diagnosis and disease detection for making well-informed decisions. However, this objective can only be realized if efficient access is provided to semantically relevant cases from the ever-growing medical image repositories. In this paper, we present an efficient method for representing medical images by incorporating visual saliency and deep features obtained from a fine-tuned convolutional neural network (CNN) pre-trained on natural images. Saliency detector is employed to automatically identify regions of interest like tumors, fractures, and calcified spots in images prior to feature extraction. Neuronal activation features termed as neural codes from different CNN layers are comprehensively studied to identify most appropriate features for representing radiographs. This study revealed that neural codes from the last fully connected layer of the fine-tuned CNN are found to be the most suitable for representing medical images. The neural codes extracted from the entire image and salient part of the image are fused to obtain the saliency-injected neural codes (SiNC) descriptor which is used for indexing and retrieval. Finally, locality sensitive hashing techniques are applied on the SiNC descriptor to acquire short binary codes for allowing efficient retrieval in large scale image collections. Comprehensive experimental evaluations on the radiology images dataset reveal that the proposed framework achieves high retrieval accuracy and efficiency for scalable image retrieval applications and compares favorably with existing approaches. PMID:28771497
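A rough sketch of the fusion and hashing steps, assuming the CNN activations for the whole image and for the salient region have already been extracted; the simple concatenate-and-normalise fusion and the 64-bit random-projection hash are stand-ins for the paper's exact SiNC construction and LSH choice.

```python
import numpy as np

def sinc_descriptor(global_feat, salient_feat):
    """Fuse neural codes from the full image and its salient region
    (simple L2-normalised concatenation assumed here)."""
    d = np.concatenate([global_feat, salient_feat])
    return d / (np.linalg.norm(d) + 1e-12)

def lsh_binary_code(descriptor, n_bits=64, seed=0):
    """Random-projection LSH: one sign bit per random hyperplane."""
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((n_bits, descriptor.size))
    return (planes @ descriptor > 0).astype(np.uint8)

# toy usage with made-up 4096-dimensional fully connected layer activations
g = np.random.rand(4096)
s = np.random.rand(4096)
code = lsh_binary_code(sinc_descriptor(g, s))
print(code.shape, code[:8])
```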
A method of evaluating efficiency during space-suited work in a neutral buoyancy environment
NASA Technical Reports Server (NTRS)
Greenisen, Michael C.; West, Phillip; Newton, Frederick K.; Gilbert, John H.; Squires, William G.
1991-01-01
The purpose was to investigate efficiency as related to the work transmission and the metabolic cost of various extravehicular activity (EVA) tasks during simulated microgravity (whole body water immersion) using three space suits. Two new prototype space station suits, AX-5 and MKIII, are pressurized at 57.2 kPa and were tested concurrently with the operationally used 29.6 kPa shuttle suit. Four male astronauts were asked to perform a fatigue trial on four upper extremity exercises during which metabolic rate and work output were measured and efficiency was calculated in each suit. The activities were selected to simulate actual EVA tasks. The test article was an underwater dynamometry system to which the astronauts were secured by foot restraints. All metabolic data was acquired, calculated, and stored using a computerized indirect calorimetry system connected to the suit ventilation/gas supply control console. During the efficiency testing, steady state metabolic rate could be evaluated as well as work transmitted to the dynamometer. Mechanical efficiency could then be calculated for each astronaut in each suit performing each movement.
Non-coding RNAs—Novel targets in neurotoxicity
Tal, Tamara L.; Tanguay, Robert L.
2012-01-01
Over the past ten years non-coding RNAs (ncRNAs) have emerged as pivotal players in fundamental physiological and cellular processes and have been increasingly implicated in cancer, immune disorders, and cardiovascular, neurodegenerative, and metabolic diseases. MicroRNAs (miRNAs) represent a class of ncRNA molecules that function as negative regulators of post-transcriptional gene expression. miRNAs are predicted to regulate 60% of all human protein-coding genes and as such, play key roles in cellular and developmental processes, human health, and disease. Relative to counterparts that lack bindings sites for miRNAs, genes encoding proteins that are post-transcriptionally regulated by miRNAs are twice as likely to be sensitive to environmental chemical exposure. Not surprisingly, miRNAs have been recognized as targets or effectors of nervous system, developmental, hepatic, and carcinogenic toxicants, and have been identified as putative regulators of phase I xenobiotic-metabolizing enzymes. In this review, we give an overview of the types of ncRNAs and highlight their roles in neurodevelopment, neurological disease, activity-dependent signaling, and drug metabolism. We then delve into specific examples that illustrate their importance as mediators, effectors, or adaptive agents of neurotoxicants or neuroactive pharmaceutical compounds. Finally, we identify a number of outstanding questions regarding ncRNAs and neurotoxicity. PMID:22394481
Design and implementation of H.264 based embedded video coding technology
NASA Astrophysics Data System (ADS)
Mao, Jian; Liu, Jinming; Zhang, Jiemin
2016-03-01
In this paper, an embedded system for remote online video monitoring was designed and developed to capture and record real-time conditions in an elevator. To improve the efficiency of video acquisition and processing, the system uses the Samsung S5PV210 chip, which integrates a graphics processing unit, as its core processor, and the video is encoded in the H.264 format for efficient storage and transmission. Based on the S5PV210 chip, hardware video coding was investigated, which is more efficient than software coding. Testing demonstrated that hardware video coding can significantly reduce system cost and produce smoother video display. It can be widely applied in security surveillance [1].
75 FR 20833 - Building Energy Codes
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-21
... DEPARTMENT OF ENERGY Office of Energy Efficiency and Renewable Energy [Docket No. EERE-2010-BT-BC-0012] Building Energy Codes AGENCY: Office of Energy Efficiency and Renewable Energy, Department of Energy. ACTION: Request for Information. SUMMARY: The U.S. Department of Energy (DOE) is soliciting...
Hybrid services efficient provisioning over the network coding-enabled elastic optical networks
NASA Astrophysics Data System (ADS)
Wang, Xin; Gu, Rentao; Ji, Yuefeng; Kavehrad, Mohsen
2017-03-01
As a variety of services have emerged, hybrid services have become more common in real optical networks. Although elastic spectrum resource optimization over elastic optical networks (EONs) has been widely investigated, little research has been carried out on routing and spectrum allocation (RSA) for hybrid services, especially over network coding-enabled EONs. We investigated RSA for unicast services and network coding-based multicast services over a network coding-enabled EON under constraints of time delay and transmission distance. To address this issue, a mathematical model was built to minimize the total spectrum consumption for the hybrid services over the network coding-enabled EON under these constraints. The model guarantees different routing constraints for different types of services. Intermediate nodes over the network coding-enabled EON are assumed to be capable of encoding flows carrying different kinds of information. We propose an efficient heuristic, the network coding-based adaptive routing and layered graph-based spectrum allocation algorithm (NCAR-LGSA). Simulation results show that NCAR-LGSA achieves highly efficient spectrum resource utilization under different network scenarios compared with the benchmark algorithms.
McCoy, Thomas H; Castro, Victor M; Snapper, Leslie A; Hart, Kamber L; Perlis, Roy H
2017-08-31
Biobanks and national registries represent a powerful tool for genomic discovery, but rely on diagnostic codes that may be unreliable and fail to capture the relationship between related diagnoses. We developed an efficient means of conducting genome-wide association studies using combinations of diagnostic codes from electronic health records (EHR) for 10,845 participants in a biobanking program at two large academic medical centers. Specifically, we applied latent Dirichlet allocation to fit 50 disease topics based on diagnostic codes, then conducted genome-wide common-variant association for each topic. In sensitivity analysis, these results were contrasted with those obtained from traditional single-diagnosis phenome-wide association analysis, as well as those in which only a subset of diagnostic codes were included per topic. In meta-analysis across three biobank cohorts, we identified 23 disease-associated loci with p<1e-15, including previously associated autoimmune disease loci. In all cases, observed significant associations were of greater magnitude than for single phenome-wide diagnostic codes, and incorporation of less strongly loading diagnostic codes enhanced association. This strategy provides a more efficient means of phenome-wide association in biobanks with coded clinical data.
McCoy, Thomas H; Castro, Victor M; Snapper, Leslie A; Hart, Kamber L; Perlis, Roy H
2017-01-01
Biobanks and national registries represent a powerful tool for genomic discovery, but rely on diagnostic codes that can be unreliable and fail to capture relationships between related diagnoses. We developed an efficient means of conducting genome-wide association studies using combinations of diagnostic codes from electronic health records for 10,845 participants in a biobanking program at two large academic medical centers. Specifically, we applied latent Dirichlet allocation to fit 50 disease topics based on diagnostic codes, then conducted a genome-wide common-variant association for each topic. In sensitivity analysis, these results were contrasted with those obtained from traditional single-diagnosis phenome-wide association analysis, as well as those in which only a subset of diagnostic codes were included per topic. In meta-analysis across three biobank cohorts, we identified 23 disease-associated loci with p < 1e-15, including previously associated autoimmune disease loci. In all cases, observed significant associations were of greater magnitude than single phenome-wide diagnostic codes, and incorporation of less strongly loading diagnostic codes enhanced association. This strategy provides a more efficient means of identifying phenome-wide associations in biobanks with coded clinical data. PMID:28861588
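A minimal sketch of the topic-modelling step described in these two entries, assuming a patient-by-diagnostic-code count matrix; the synthetic matrix, the library call, and the omitted association test are illustrative assumptions, not the authors' pipeline.

    import numpy as np
    from sklearn.decomposition import LatentDirichletAllocation

    rng = np.random.default_rng(1)

    # Placeholder data: 200 patients x 500 distinct diagnostic codes (counts).
    counts = rng.poisson(0.2, size=(200, 500))

    # Fit 50 disease topics, as in the abstract; each patient gets topic loadings.
    lda = LatentDirichletAllocation(n_components=50, random_state=1)
    loadings = lda.fit_transform(counts)          # shape (200, 50)

    # Each column of `loadings` can then serve as a quantitative phenotype in a
    # per-topic genome-wide scan (e.g. regressing the loading on genotype dosage
    # for each variant); that step is omitted here.
    print(loadings.shape, loadings[0].round(3))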
Peterson, Linda R; Herrero, Pilar; Schechtman, Kenneth B; Racette, Susan B; Waggoner, Alan D; Kisrieva-Ware, Zulia; Dence, Carmen; Klein, Samuel; Marsala, JoAnn; Meyer, Timothy; Gropler, Robert J
2004-05-11
Obesity is a risk factor for impaired cardiac performance, particularly in women. Animal studies suggest that alterations in myocardial fatty acid metabolism and efficiency in obesity can cause decreased cardiac performance. In the present study, we tested the hypothesis that myocardial fatty acid metabolism and efficiency are abnormal in obese women. We studied 31 young women (body mass index [BMI] 19 to 52 kg/m2); 19 were obese (BMI >30 kg/m2). Myocardial oxygen consumption (MVO2) and fatty acid uptake (MFAUp), utilization (MFAU), and oxidation (MFAO) were quantified by positron emission tomography. Cardiac work was measured by echocardiography, and efficiency was calculated as work/MVO2. BMI correlated with MVO2 (r=0.58, P=0.0006), MFAUp (r=0.42, P<0.05), and efficiency (r=-0.40, P<0.05). Insulin resistance, quantified by the glucose area under the curve (AUC) during an oral glucose tolerance test, correlated with MFAUp (r=0.55, P<0.005), MFAU (r=0.62, P<0.001), and MFAO (r=0.58, P<0.005). A multivariate, stepwise regression analysis showed that BMI was the only independent predictor of MVO2 and efficiency (P=0.0005 and P<0.05, respectively). Glucose AUC was the only independent predictor of MFAUp, MFAU, and MFAO (P<0.05, <0.005, and <0.005, respectively). In young women, obesity is a significant predictor of increased MVO2 and decreased efficiency, and insulin resistance is a robust predictor of MFAUp, MFAU, and MFAO. This increase in fatty acid metabolism and decrease in efficiency is concordant with observations made in experimental models of obesity. These metabolic changes may play a role in the pathogenesis of decreased cardiac performance in obese women.
Contribution of three CYP3A isoforms to metabolism of R- and S-warfarin.
Jones, Drew R; Kim, So-Young; Boysen, Gunnar; Yun, Chul-Ho; Miller, Grover P
2010-12-01
Effective coumadin (R/S-warfarin) therapy is complicated by inter-individual variability in metabolism. Recent studies have demonstrated that CYP3A isoforms likely contribute to patient responses and clinical outcomes. Despite a significant focus on CYP3A4, little is known about CYP3A5 and CYP3A7 metabolism of warfarin. Based on our studies, recombinant CYP3A4, CYP3A5 and CYP3A7 metabolized R- and S-warfarin to 10- and 4'-hydroxywarfarin with efficiencies that depended on the individual enzymes. For R-warfarin, CYP3A4, CYP3A7, and CYP3A5 demonstrated decreasing preference for 10-hydroxylation over 4'-hydroxylation. By contrast, there was no regioselectivity toward S-warfarin. While all enzymes preferentially metabolized R-warfarin, CYP3A4 was the most efficient enzyme for all reactions. Individuals, namely African-Americans and children, with higher relative levels of CYP3A5 and/or CYP3A7, respectively, compared to CYP3A4 may metabolize warfarin less efficiently and thus may require lower doses and be at risk for adverse drug-drug interactions related to the contributions of the respective enzymes.
Multi-stage decoding for multi-level block modulation codes
NASA Technical Reports Server (NTRS)
Lin, Shu; Kasami, Tadao
1991-01-01
Various types of multistage decoding for multilevel block modulation codes, in which the decoding of a component code at each stage can be either soft decision or hard decision, maximum likelihood or bounded distance, are discussed. Error performance for codes is analyzed for a memoryless additive channel based on various types of multi-stage decoding, and upper bounds on the probability of an incorrect decoding are derived. It was found that, if component codes of a multi-level modulation code and types of decoding at various stages are chosen properly, high spectral efficiency and large coding gain can be achieved with reduced decoding complexity. It was also found that the difference in performance between the suboptimum multi-stage soft decision maximum likelihood decoding of a modulation code and the single stage optimum decoding of the overall code is very small: only a fraction of a dB loss in SNR at a probability of an incorrect decoding for a block of 10^-6. Multi-stage decoding of multi-level modulation codes thus offers a way to achieve the best of three worlds: bandwidth efficiency, coding gain, and decoding complexity.
Guo, Guangyu; Li, Ning
2011-07-01
In quantitative proteomic studies, numerous in vitro and in vivo peptide labeling strategies have been successfully applied to measure differentially regulated protein and peptide abundance. These approaches have proven to be versatile and repeatable in biological discoveries. (15)N metabolic labeling is one of these widely adopted and economical methods. However, due to the differential incorporation rates of (15)N or (14)N, the labeling produces imperfectly matched isotopic envelopes between the heavy and light nitrogen-labeled peptides. In the present study, we have modified the solid Arabidopsis growth medium to standardize the (15)N supply, which led to a uniform incorporation of (15)N into the whole plant protein complement. The incorporation rate (97.43±0.11%) of (15)N into (15)N-coded peptides was determined by correlating the intensities of peptide ions with the labeling efficiencies according to a Gaussian distribution. The resulting actual incorporation rate (97.44%) and the natural abundance of (15)N/(14)N-coded peptides are used to re-calculate the intensities of the isotopic envelopes of the differentially labeled peptides. A modified (15)N/(14)N stable isotope labeling strategy, SILIA, is assessed, and the results demonstrate that this approach is able to differentiate fold changes in protein abundance down to 10%. Limitations in instrument dynamic range and losses during purification can cause the measured precursor ion ratio to deviate from the actual fold change. It is suggested that the differentially mixed (15)N-coded and (14)N-coded plant protein samples used to establish the protein abundance standard curve should be prepared following a protein isolation protocol similar to that used to isolate the proteins to be quantitated. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
Menendez, Javier A
2015-01-01
The current global portfolio of oncology drugs is unlikely to produce durable disease remission for millions of cancer patients worldwide. This is due, in part, to the existence of so-called cancer stem cells (CSCs), a particularly aggressive type of malignant cell that is capable of indefinite self-replication, is refractory to conventional treatments, and is skilled at spreading and colonizing distant organs. To date, no drugs from big-league Pharma companies are capable of killing CSCs. Why? Quite simply, a classic drug development approach based on mutated genes and pathological protein products cannot efficiently target the plastic, epigenetic proclivity of cancer tissues to generate CSCs. Recent studies have proposed that certain elite metabolites (oncometabolites) and other common metabolites can significantly influence the establishment and maintenance of epigenetic signatures of stemness and cancer. Consequently, cellular metabolism and the core epigenetic codes, DNA methylation and histone modification, can be better viewed as an integrated metaboloepigenetic dimension of CSCs, which we have recently termed cancer metabostemness. By targeting weaknesses in the bridge connecting metabolism and epigenetics, a new generation of metabostemness-specific drugs can be generated for potent and long-lasting elimination of life-threatening CSCs. Here I evaluate the market potential of re-modeling the oncology drug pipeline by discovering and developing new metabolic approaches able to target the apparently undruggable epigenetic programs that dynamically regulate the plasticity of non-CSC and CSC cellular states.
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1991-01-01
Shannon's capacity bound shows that coding can achieve large reductions in the required signal-to-noise ratio per information bit (E_b/N_0, where E_b is the energy per bit and N_0/2 is the double-sided noise density) in comparison to uncoded schemes. For bandwidth efficiencies of 2 bit/sym or greater, these improvements were obtained through the use of Trellis Coded Modulation and Block Coded Modulation. A method of obtaining these high efficiencies using multidimensional Multiple Phase Shift Keying (MPSK) and Quadrature Amplitude Modulation (QAM) signal sets with trellis coding is described. These schemes have advantages in decoding speed, phase transparency, and coding gain in comparison to other trellis coding schemes. Finally, a general parity check equation for rotationally invariant trellis codes is introduced from which non-linear codes for two dimensional MPSK and QAM signal sets are found. These codes are fully transparent to all rotations of the signal set.
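As a worked illustration of the capacity argument above, the minimum E_b/N_0 for reliable transmission at a spectral efficiency of eta bit/sym on an ideal band-limited AWGN channel is (2^eta - 1)/eta; the short sketch below simply evaluates this textbook bound and is not drawn from the report itself.

    import math

    def min_ebno_db(eta):
        """Shannon limit on Eb/N0 (in dB) for spectral efficiency eta (bit/sym)."""
        return 10 * math.log10((2 ** eta - 1) / eta)

    for eta in (0.5, 1, 2, 3):
        print(f"eta = {eta} bit/sym -> Eb/N0 >= {min_ebno_db(eta):.2f} dB")

The gap between these limits and the E_b/N_0 required by an uncoded scheme at the same efficiency is the coding gain that trellis and block coded modulation attempt to capture.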
NASA Technical Reports Server (NTRS)
Lee, P. J.
1984-01-01
For rate 1/N convolutional codes, a recursive algorithm for finding the transfer function bound on bit error rate (BER) at the output of a Viterbi decoder is described. This technique is very fast and requires very little storage since all the unnecessary operations are eliminated. Using this technique, we find and plot bounds on the BER performance of known codes of rate 1/2 with constraint length K up to 18 and rate 1/3 with K up to 14. When more than one reported code with the same parameters is known, we select the code that minimizes the required signal-to-noise ratio for a desired bit error rate of 0.000001. This criterion for determining the goodness of a code had previously been found to be more useful than the maximum free distance criterion and was used in the code search procedures for very short constraint length codes. This very efficient technique can also be used in searches for longer constraint length codes.
Yang, Hui; Huang, Xiaochang; Fang, Shaoming; He, Maozhang; Zhao, Yuanzhang; Wu, Zhenfang; Yang, Ming; Zhang, Zhiyan; Chen, Congying; Huang, Lusheng
2017-01-01
Gut microbiota plays fundamental roles in energy harvest, nutrient digestion, and intestinal health, especially in processing indigestible polysaccharide components of the diet. Unraveling the microbial taxa and functional capacity of the gut microbiome associated with feed efficiency can provide important knowledge for improving pig feed efficiency in the swine industry. In the current research, we studied the association of fecal microbiota with feed efficiency in 280 commercial Duroc pigs. All experimental pigs could be clustered into two enterotype-like groups. The two enterotypes showed a tendency toward association with feed efficiency (P = 0.07). We further identified 31 operational taxonomic units (OTUs) showing potential associations with porcine feed efficiency. These OTUs were mainly annotated to bacteria related to the metabolism of dietary polysaccharides. Although we did not identify bacterial species associated with residual feed intake (RFI) at the FDR < 0.05 level, metagenomic sequencing analysis did reveal distinct functional capacities of the gut microbiome between high- and low-RFI pigs (FDR < 0.05). The KEGG orthologies related to nitrogen metabolism, amino acid metabolism, and transport systems, and eight KEGG pathways, including glycine, serine, and threonine metabolism, were positively associated with porcine feed efficiency. We inferred that the gut microbiota might improve porcine feed efficiency by promoting intestinal health through short-chain fatty acids (SCFAs) produced from fermentation of dietary polysaccharides and by improving the utilization of dietary protein. The present results provide important basic knowledge for improving porcine feed efficiency through modulation of the gut microbiome. PMID:28861066
Overproduction of Geranylgeraniol by Metabolically Engineered Saccharomyces cerevisiae
Tokuhiro, Kenro; Muramatsu, Masayoshi; Ohto, Chikara; Kawaguchi, Toshiya; Obata, Shusei; Muramoto, Nobuhiko; Hirai, Masana; Takahashi, Haruo; Kondo, Akihiko; Sakuradani, Eiji; Shimizu, Sakayu
2009-01-01
(E, E, E)-Geranylgeraniol (GGOH) is a valuable starting material for perfumes and pharmaceutical products. In the yeast Saccharomyces cerevisiae, GGOH is synthesized from the end products of the mevalonate pathway through the sequential reactions of farnesyl diphosphate synthetase (encoded by the ERG20 gene), geranylgeranyl diphosphate synthase (the BTS1 gene), and some endogenous phosphatases. We demonstrated that overexpression of the diacylglycerol diphosphate phosphatase (DPP1) gene could promote GGOH production. We also found that overexpression of a BTS1-DPP1 fusion gene was more efficient for producing GGOH than coexpression of these genes separately. Overexpression of the hydroxymethylglutaryl-coenzyme A reductase (HMG1) gene, which encodes the major rate-limiting enzyme of the mevalonate pathway, resulted in overproduction of squalene (191.9 mg liter−1) rather than GGOH (0.2 mg liter−1) in test tube cultures. Coexpression of the BTS1-DPP1 fusion gene along with the HMG1 gene partially redirected the metabolic flux from squalene to GGOH. Additional expression of a BTS1-ERG20 fusion gene resulted in an almost complete shift of the flux to GGOH production (228.8 mg liter−1 GGOH and 6.5 mg liter−1 squalene). Finally, we constructed a diploid prototrophic strain coexpressing the HMG1, BTS1-DPP1, and BTS1-ERG20 genes from multicopy integration vectors. This strain attained 3.31 g liter−1 GGOH production in a 10-liter jar fermentor with gradual feeding of a mixed glucose and ethanol solution. The use of bifunctional fusion genes such as the BTS1-DPP1 and ERG20-BTS1 genes that code sequential enzymes in the metabolic pathway was an effective method for metabolic engineering. PMID:19592534
2010-01-01
Background Natural accessions of Arabidopsis thaliana are characterized by a high level of phenotypic variation that can be used to investigate the extent and mode of selection on primary metabolic traits. A collection of 54 A. thaliana natural accession-derived lines was subjected to deep genotyping through Single Feature Polymorphism (SFP) detection via genomic DNA hybridization to Arabidopsis Tiling 1.0 Arrays for the detection of selective sweeps and the identification of associations between sweep regions and growth-related metabolic traits. Results A total of 1,072,557 high-quality SFPs were detected, with indications of 3,943 deletions and 1,007 duplications. A significantly lower than expected SFP frequency was observed in protein-, rRNA-, and tRNA-coding regions and in non-repetitive intergenic regions, while pseudogenes, transposons, and non-coding RNA genes were enriched in SFPs. Gene families involved in plant defence or in signalling were identified as highly polymorphic, while several other families, including transcription factors, were depleted of SFPs. A total of 198 significant associations between metabolic genes and 9 metabolic and growth-related phenotypic traits were detected, with annotation hinting at the nature of the relationship. Five significant selective sweep regions were also detected, of which one associated significantly with a metabolic trait. Conclusions We generated a high-density polymorphism map for 54 A. thaliana accessions that highlights the variability of resistance genes across geographic ranges and used it to identify selective sweeps and associations between metabolic genes and metabolic phenotypes. Several associations show a clear biological relationship, while many require further investigation. PMID:20302660
Coding visual features extracted from video sequences.
Baroffio, Luca; Cesana, Matteo; Redondi, Alessandro; Tagliasacchi, Marco; Tubaro, Stefano
2014-05-01
Visual features are successfully exploited in several applications (e.g., visual search, object recognition and tracking, etc.) due to their ability to efficiently represent image content. Several visual analysis tasks require features to be transmitted over a bandwidth-limited network, thus calling for coding techniques to reduce the required bit budget, while attaining a target level of efficiency. In this paper, we propose, for the first time, a coding architecture designed for local features (e.g., SIFT, SURF) extracted from video sequences. To achieve high coding efficiency, we exploit both spatial and temporal redundancy by means of intraframe and interframe coding modes. In addition, we propose a coding mode decision based on rate-distortion optimization. The proposed coding scheme can be conveniently adopted to implement the analyze-then-compress (ATC) paradigm in the context of visual sensor networks. That is, sets of visual features are extracted from video frames, encoded at remote nodes, and finally transmitted to a central controller that performs visual analysis. This is in contrast to the traditional compress-then-analyze (CTA) paradigm, in which video sequences acquired at a node are compressed and then sent to a central unit for further processing. In this paper, we compare these coding paradigms using metrics that are routinely adopted to evaluate the suitability of visual features in the context of content-based retrieval, object recognition, and tracking. Experimental results demonstrate that, thanks to the significant coding gains achieved by the proposed coding scheme, ATC outperforms CTA with respect to all evaluation metrics.
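The rate-distortion mode decision mentioned above can be sketched as choosing, per descriptor, the mode that minimizes a Lagrangian cost J = D + lambda * R; the candidate modes, distortion values, rates, and lambda below are illustrative placeholders, not the actual codec described in the paper.

    def choose_mode(candidates, lam):
        """candidates: dict mode -> (distortion, rate_bits). Return argmin of D + lam*R."""
        return min(candidates, key=lambda m: candidates[m][0] + lam * candidates[m][1])

    # Hypothetical costs for coding one local feature of a video frame:
    #   intra: code the descriptor on its own; inter: code only the difference
    #   from a matched descriptor in the previous frame; skip: reuse it unchanged.
    modes = {"intra": (0.0, 512), "inter": (2.5, 96), "skip": (9.0, 1)}
    print(choose_mode(modes, lam=0.05))   # -> the mode with the lowest Lagrangian cost

Raising lambda favours low-rate modes (inter, skip), lowering it favours low-distortion intraframe coding, which is the trade-off an ATC node must tune against the available bandwidth.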
Vector processing efficiency of plasma MHD codes by use of the FACOM 230-75 APU
NASA Astrophysics Data System (ADS)
Matsuura, T.; Tanaka, Y.; Naraoka, K.; Takizuka, T.; Tsunematsu, T.; Tokuda, S.; Azumi, M.; Kurita, G.; Takeda, T.
1982-06-01
In the framework of pipelined vector architecture, the efficiency of vector processing is assessed with respect to plasma MHD codes in nuclear fusion research. By using a vector processor, the FACOM 230-75 APU, the limit of the enhancement factor due to parallelism of current vector machines is examined for three numerical codes based on a fluid model. Reasonable speed-up factors of approximately 6, 6, and 4 times faster than the highly optimized scalar version are obtained for ERATO (linear stability code), AEOLUS-R1 (nonlinear stability code) and APOLLO (1-1/2D transport code), respectively. Problems of the pipelined vector processors are discussed from the viewpoint of restructuring, optimization and choice of algorithms. In conclusion, the important concept of "concurrency within pipelined parallelism" is emphasized.
Xiang, Hong; Lü, Xi-Wu; Yang, Fei; Yin, Li-Hong; Zhu, Guang-Can
2011-04-01
In order to explore the characteristics of the microbial community and the operation efficiency of biofilter processes (biologically-enhanced active filters and biological activated carbon filters) for drinking water purification, Biolog and polymerase chain reaction-single strand conformation polymorphism (PCR-SSCP) techniques were applied to analyze the metabolic function and structure of the microbial communities developing in the biofilters. Water quality parameters, such as NH4+-N, NO2--N, permanganate index, UV254 and BDOC, were determined in the inflow and outflow of the biofilters to investigate their operation efficiency. The results show that the metabolic capacity of the raw-water microbial community is reduced after passage through the biofilters, reflecting that metabolically active microbial communities in the raw water are intercepted by the biofilters. After 6 months of biofilter operation, the metabolic profiles of the microbial communities were similar between the two kinds of biologically-enhanced active filters, and carbon source utilization by the microbial communities in the two filters was 73.4% and 75.5%, respectively. The metabolic profiles of the microbial communities in the two biological activated carbon filters showed significant differences. The carbon source utilization rate of the microbial community in the granular activated carbon filter was 79.6%, clearly higher than the 53.8% observed in the columnar activated carbon filter (p < 0.01). The PCR-SSCP results indicate that the microbial communities in each biofilter are diverse, but the structure of the dominant microorganisms is similar among the different biofilters. The results also show that the packing materials had little effect on the structure and metabolic function of the microbial community in the biologically-enhanced active filters, and the difference in water purification efficiency between the two biofilters was not significant (p > 0.05). However, in the biological activated carbon filters, granular activated carbon is conducive to microbial growth and reproduction, the microbial communities present high metabolic activity, and the removal efficiency for NH4(+)-N, permanganate index and BDOC is better than that of the columnar activated carbon filter (p < 0.05). The results also suggest that biofilter operation efficiency is related to the metabolic capacity of the microbial community in the biofilter.
Silva, Angélica; Noronha, Henrique; Dai, Zhanwu; Delrot, Serge; Gerós, Hernâni
2017-09-01
Severe leaf removal decreases storage starch and sucrose in grapevine cv. Cabernet Sauvignon fruiting cuttings and modulates the activity of key enzymes and the expression of sugar transporter genes. Leaf removal is an agricultural practice that has been shown to modify vineyard efficiency and grape and wine composition. In this study, we took advantage of the ability to precisely control the ratio of leaves to fruit in Cabernet Sauvignon fruiting cuttings to study the effect of source-sink ratios (2 (2L), 6 (6L) and 12 (12L) leaves per cluster) on starch metabolism and accumulation. Starch concentration was significantly higher in canes from 6L (42.13 ± 1.44 mg g DW -1 ) and 12L (43.50 ± 2.85 mg g DW -1 ) than in 2L (22.72 ± 3.10 mg g DW -1 ) plants. Moreover, carbon limitation promoted a transcriptional adjustment of genes involved in starch metabolism in grapevine woody tissues, including a decrease in the expression of the plastidic glucose-6-phosphate translocator, VvGPT1. Conversely, the transcript levels of the gene coding for the catalytic subunit VvAGPB1 of the VvAGPase complex were higher in canes from 2L plants than in 6L and 12L plants, which positively correlated with the biochemical activity of this enzyme. Sucrose concentration increased in canes from 2L to 6L and 12L plants, and the amount of total phenolics followed the same trend. Expression studies showed that VvSusy transcripts decreased in canes from 2L to 6L and 12L plants, which correlated with the biochemical activity of insoluble invertase, while the expression of the sugar transporters VvSUC11 and VvSUC12, together with VvSPS1, which encodes an enzyme involved in sucrose synthesis, increased. Thus, sucrose seems to control starch accumulation through the adjustment of the cane sink strength.
USDA-ARS?s Scientific Manuscript database
Testosterone deficiency is associated with obesity in humans. It has been proven that long non-coding RNAs (lncRNAs) regulate adipose tissue metabolism; therefore, we first study the role of lncRNAs on testosterone deficiency-induced fat deposition using castrated male pigs as the model animal. The ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasin-Brumshtein, Yehudit; Khan, Arshad H.; Hormozdiari, Farhad
2016-09-13
Previous studies have shown that the integration of genome-wide expression profiles in metabolic tissues with genetic and phenotypic variation provides valuable insight into the underlying molecular mechanisms. We used RNA-Seq to characterize the hypothalamic transcriptome in 99 inbred strains of mice from the Hybrid Mouse Diversity Panel (HMDP), a reference resource population for cardiovascular and metabolic traits. We report numerous novel transcripts supported by proteomic analyses, as well as novel non-coding RNAs. High-resolution genetic mapping of transcript levels in the HMDP reveals both local and trans expression quantitative trait loci (eQTLs), demonstrating two trans eQTL 'hotspots' associated with the expression of hundreds of genes. We also report thousands of alternative splicing events regulated by genetic variants. Finally, comparison with about 150 metabolic and cardiovascular traits revealed many highly significant associations. Our data provide a rich resource for understanding the many physiologic functions mediated by the hypothalamus and their genetic regulation.
An analysis of the metabolic theory of the origin of the genetic code
NASA Technical Reports Server (NTRS)
Amirnovin, R.; Bada, J. L. (Principal Investigator)
1997-01-01
A computer program was used to test Wong's coevolution theory of the genetic code. The codon correlations between the codons of biosynthetically related amino acids in the universal genetic code and in randomly generated genetic codes were compared. It was determined that many codon correlations are also present within random genetic codes and that among the random codes there are always several which have many more correlations than that found in the universal code. Although the number of correlations depends on the choice of biosynthetically related amino acids, the probability of choosing a random genetic code with the same or greater number of codon correlations as the universal genetic code was found to vary from 0.1% to 34% (with respect to a fairly complete listing of related amino acids). Thus, Wong's theory that the genetic code arose by coevolution with the biosynthetic pathways of amino acids, based on codon correlations between biosynthetically related amino acids, is statistical in nature.
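The statistical test described above can be sketched as a small Monte Carlo: count, for a list of biosynthetically related amino acid pairs, how many pairs have codons differing in exactly one base, first in the universal code and then in random codes obtained by shuffling amino acid assignments among the synonymous codon blocks (stop codons fixed). The pair list here is a short illustrative subset, and the randomization scheme is one plausible reading of the approach, not the paper's exact procedure.

    import random
    from itertools import product

    BASES = "TCAG"
    # Standard genetic code, codons ordered with the third base varying fastest.
    AA = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
    CODONS = ["".join(c) for c in product(BASES, repeat=3)]
    UNIVERSAL = dict(zip(CODONS, AA))

    # Illustrative subset of precursor-product (biosynthetically related) pairs.
    RELATED = [("E", "Q"), ("D", "N"), ("D", "T"), ("S", "G"), ("S", "C"),
               ("F", "Y"), ("V", "L"), ("T", "I")]

    def one_base_apart(c1, c2):
        return sum(a != b for a, b in zip(c1, c2)) == 1

    def correlations(code):
        """Count related pairs having at least one pair of codons one base apart."""
        by_aa = {}
        for codon, aa in code.items():
            by_aa.setdefault(aa, []).append(codon)
        return sum(
            any(one_base_apart(c1, c2) for c1 in by_aa[a1] for c2 in by_aa[a2])
            for a1, a2 in RELATED)

    def random_code():
        """Shuffle amino acid labels among synonymous codon blocks (stops fixed)."""
        blocks = {}
        for codon, aa in UNIVERSAL.items():
            blocks.setdefault(aa, []).append(codon)
        aas = [aa for aa in blocks if aa != "*"]
        shuffled = random.sample(aas, len(aas))
        code = {c: "*" for c in blocks["*"]}
        for old, new in zip(aas, shuffled):
            for codon in blocks[old]:
                code[codon] = new
        return code

    obs = correlations(UNIVERSAL)
    null = [correlations(random_code()) for _ in range(2000)]
    p = sum(n >= obs for n in null) / len(null)
    print(f"universal code: {obs}/{len(RELATED)} correlated pairs; empirical p = {p:.3f}")

The empirical p-value is the fraction of random codes with at least as many correlations as the universal code, which is the kind of quantity the abstract reports varying from 0.1% to 34% depending on the chosen pair list.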
NASA Technical Reports Server (NTRS)
Xiong, Fugin
2003-01-01
One half of Professor Xiong's effort will investigate robust timing synchronization schemes for dynamically varying characteristics of aviation communication channels. The other half of his time will focus on efficient modulation and coding study for the emerging quantum communications.
Yeast metabolic engineering for hemicellulosic ethanol production
Jennifer Van Vleet; Thomas W. Jeffries
2009-01-01
Efficient fermentation of hemicellulosic sugars is critical for the bioconversion of lignocellulosics to ethanol. Efficient sugar uptake through the heterologous expression of yeast and fungal xylose/glucose transporters can improve fermentation if other metabolic steps are not rate limiting. Rectification of cofactor imbalances through heterologous expression of...
Liu, Yangyang; Han, Xiao; Yuan, Junting; Geng, Tuoyu; Chen, Shihao; Hu, Xuming; Cui, Isabelle H; Cui, Hengmi
2017-04-07
The type II bacterial CRISPR/Cas9 system is a simple, convenient, and powerful tool for targeted gene editing. Here, we describe a CRISPR/Cas9-based approach for inserting a poly(A) transcriptional terminator into both alleles of a targeted gene to silence protein-coding and non-protein-coding genes, which often play key roles in gene regulation but are difficult to silence via insertion or deletion of short DNA fragments. The integration of 225 bp of bovine growth hormone poly(A) signals into either the first intron or the first exon or behind the promoter of target genes caused efficient termination of expression of PPP1R12C , NSUN2 (protein-coding genes), and MALAT1 (non-protein-coding gene). Both NeoR and PuroR were used as markers in the selection of clonal cell lines with biallelic integration of a poly(A) signal. Genotyping analysis indicated that the cell lines displayed the desired biallelic silencing after a brief selection period. These combined results indicate that this CRISPR/Cas9-based approach offers an easy, convenient, and efficient novel technique for gene silencing in cell lines, especially for those in which gene integration is difficult because of a low efficiency of homology-directed repair. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
A bandwidth efficient coding scheme for the Hubble Space Telescope
NASA Technical Reports Server (NTRS)
Pietrobon, Steven S.; Costello, Daniel J., Jr.
1991-01-01
As a demonstration of the performance capabilities of trellis codes using multidimensional signal sets, a Viterbi decoder was designed. The choice of code was based on two factors. The first factor was its application as a possible replacement for the coding scheme currently used on the Hubble Space Telescope (HST). The HST at present uses the rate 1/3, nu = 6 (with 2^nu = 64 states) convolutional code with Binary Phase Shift Keying (BPSK) modulation. With the modulator restricted to 3 Msym/s, this implies a data rate of only 1 Mbit/s, since the bandwidth efficiency K = 1/3 bit/sym. This is a very bandwidth-inefficient scheme, although the system has the advantage of simplicity and large coding gain. The basic requirement from NASA was for a scheme with as large a K as possible. Since a satellite channel was being used, 8PSK modulation was selected. This allows a K of between 2 and 3 bit/sym. The next influencing factor was INTELSAT's intention of transmitting the SONET 155.52 Mbit/s standard data rate over the 72 MHz transponders on its satellites. This requires a bandwidth efficiency of around 2.5 bit/sym. A Reed-Solomon block code is used as an outer code to give very low bit error rates (BER). A 16-state, rate 5/6, 2.5 bit/sym, 4D-8PSK trellis code was selected. This code has reasonable complexity and has a coding gain of 4.8 dB compared to uncoded 8PSK (2). This trellis code also has the advantage that it is 45 deg rotationally invariant. This means that the decoder needs only to synchronize to one of the two naturally mapped 8PSK signals in the signal set.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buckpitt, Alan, E-mail: arbuckpitt@ucdavis.edu; Morin, Dexter; Murphy, Shannon
Naphthalene produces species- and cell-selective injury to respiratory tract epithelial cells of rodents. In these studies we determined the apparent K_m, V_max, and catalytic efficiency (V_max/K_m) for naphthalene metabolism in microsomal preparations from subcompartments of the respiratory tract of rodents and non-human primates. In tissues with high substrate turnover, major metabolites were derived directly from naphthalene oxide, with smaller amounts from conjugates of diol epoxide, diepoxide, and 1,2- and 1,4-naphthoquinones. In some tissues, different enzymes with dissimilar K_m and V_max appeared to metabolize naphthalene. The rank order of V_max (rat olfactory epithelium > mouse olfactory epithelium > murine airways >> rat airways) correlated well with tissue susceptibility to naphthalene. The V_max in the monkey alveolar subcompartment was 2% of that in rat nasal olfactory epithelium. Rates of metabolism in nasal compartments of the monkey were low. The catalytic efficiencies of microsomes from known susceptible tissues/subcompartments are 10- and 250-fold higher than in rat airway and monkey alveolar subcompartments, respectively. Although the strong correlations between catalytic efficiencies and tissue susceptibility suggest that non-human primate tissues are unlikely to generate metabolites at a rate sufficient to produce cellular injury, other studies showing high levels of formation of protein adducts support the need for additional studies. - Highlights: • Naphthalene is metabolized with high catalytic efficiency in susceptible tissue. • Naphthalene is metabolized at low catalytic efficiency in non-susceptible tissue. • Respiratory tissues of the non-human primate metabolize naphthalene slowly.
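For readers less familiar with the kinetic terms above, the short sketch below evaluates the Michaelis-Menten rate and shows why V_max/K_m acts as a catalytic efficiency at low substrate concentration; the parameter values are arbitrary illustrations, not measurements from the study.

    def mm_rate(s, vmax, km):
        """Michaelis-Menten rate v = Vmax * [S] / (Km + [S])."""
        return vmax * s / (km + s)

    vmax, km = 12.0, 40.0            # illustrative units, e.g. nmol/min/mg and uM
    efficiency = vmax / km           # catalytic efficiency Vmax/Km
    s = 0.5                          # low substrate concentration ([S] << Km)
    print(mm_rate(s, vmax, km))      # approximately efficiency * s when [S] << Km
    print(efficiency * s)

At concentrations well below K_m the rate is essentially proportional to V_max/K_m, which is why tissues differing 10- to 250-fold in this ratio differ so sharply in their capacity to activate the parent compound.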
Alternative Formats to Achieve More Efficient Energy Codes for Commercial Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conover, David R.; Rosenberg, Michael I.; Halverson, Mark A.
2013-01-26
This paper identifies and examines several formats or structures that could be used to create the next generation of more efficient energy codes and standards for commercial buildings. Pacific Northwest National Laboratory (PNNL) is funded by the U.S. Department of Energy's Building Energy Codes Program (BECP) to provide technical support to the development of ANSI/ASHRAE/IES Standard 90.1. While the majority of PNNL's ASHRAE Standard 90.1 support focuses on developing and evaluating new requirements, a portion of its work involves consideration of the format of energy standards. In its current working plan, the ASHRAE 90.1 committee has approved an energy goal of 50% improvement in Standard 90.1-2013 relative to Standard 90.1-2004, and will likely be considering higher improvement targets for future versions of the standard. To cost-effectively achieve the 50% goal in a manner that can gain stakeholder consensus, formats other than prescriptive must be considered. Alternative formats that reduce the reliance on prescriptive requirements may make it easier to achieve these aggressive efficiency levels in new codes and standards. The focus on energy code and standard formats is meant to explore approaches to presenting the criteria that will foster compliance, enhance verification, and stimulate innovation while saving energy in buildings. New formats may also make it easier for building designers and owners to design and build to the levels of efficiency called for in the new codes and standards. This paper examines a number of potential formats and structures, including prescriptive, performance-based (with sub-formats of performance equivalency and performance targets), capacity constraint-based, and outcome-based. The paper also discusses the pros and cons of each format from the viewpoint of code users and of code enforcers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1987-10-01
Huffman codes, comma-free codes, and block codes with shift indicators are important candidate message-compression codes for improving the efficiency of communications systems. This study was undertaken to determine if these codes could be used to increase the throughput of the fixed very-low-frequency (FVLF) communication system. This application involves the use of compression codes in a channel with errors.
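As a reminder of how the first of these candidate codes works, here is a minimal Huffman code construction over an illustrative symbol-frequency table; the symbols and frequencies are placeholders, not message statistics from the study.

    import heapq

    def huffman_code(freqs):
        """Build a prefix code from {symbol: frequency}; returns {symbol: bitstring}."""
        heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(sorted(freqs.items()))]
        heapq.heapify(heap)
        counter = len(heap)
        while len(heap) > 1:
            f1, _, c1 = heapq.heappop(heap)
            f2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + b for s, b in c1.items()}
            merged.update({s: "1" + b for s, b in c2.items()})
            heapq.heappush(heap, (f1 + f2, counter, merged))
            counter += 1
        return heap[0][2]

    freqs = {"A": 0.45, "B": 0.25, "C": 0.15, "D": 0.10, "E": 0.05}
    code = huffman_code(freqs)
    avg_len = sum(freqs[s] * len(code[s]) for s in freqs)
    print(code, f"average length = {avg_len:.2f} bits/symbol")

The average codeword length approaches the source entropy, which is the compression gain at stake; the channel-error sensitivity that the study examines arises because a single flipped bit in such a variable-length stream can desynchronize all subsequent codewords.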
Exploiting the cannibalistic traits of Reed-Solomon codes
NASA Technical Reports Server (NTRS)
Collins, O.
1993-01-01
In Reed-Solomon codes and all other maximum distance separable codes, there is an intrinsic relationship between the size of the symbols in a codeword and the length of the codeword. Increasing the number of symbols in a codeword to improve the efficiency of the coding system thus requires using a larger set of symbols. However, long Reed-Solomon codes are difficult to implement and many communications or storage systems cannot easily accommodate an increased symbol size, e.g., M-ary frequency shift keying (FSK) and photon-counting pulse-position modulation demand a fixed symbol size. A technique for sharing redundancy among many different Reed-Solomon codewords to achieve the efficiency attainable in long Reed-Solomon codes without increasing the symbol size is described. Techniques both for calculating the performance of these new codes and for determining their encoder and decoder complexities are presented. These complexities are usually found to be substantially lower than those of conventional Reed-Solomon codes of similar performance.
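The symbol-size/length relationship mentioned above can be made concrete: a Reed-Solomon code over GF(2^m) has at most n = 2^m - 1 symbols per codeword, and correcting t symbol errors costs 2t parity symbols. The short sketch below just evaluates these textbook relations; it is not the redundancy-sharing construction of the paper.

    def rs_parameters(m, t):
        """Max length n, message size k, and rate for an RS code over GF(2^m)
        correcting t symbol errors (textbook n = 2^m - 1, k = n - 2t)."""
        n = 2 ** m - 1
        k = n - 2 * t
        return n, k, k / n

    for m in (4, 8, 12):
        n, k, rate = rs_parameters(m, t=4)
        print(f"{m}-bit symbols: n = {n}, k = {k}, rate = {rate:.3f}")

The same correction capability costs a much smaller fraction of the codeword as the symbol size (and hence the maximum length) grows, which is exactly the efficiency that the proposed redundancy-sharing technique tries to recover without enlarging the symbols.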
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexandrov, Boian S.; Lliev, Filip L.; Stanev, Valentin G.
This code is a toy (short) version of CODE-2016-83. From a general perspective, the code implements an unsupervised adaptive machine learning algorithm that allows efficient, high-performance de-mixing and feature extraction of a multitude of non-negative signals mixed and recorded by a network of uncorrelated sensor arrays. The code identifies the number of the mixed original signals and their locations. Further, the code also allows deciphering of signals that have been delayed relative to the mixing process at each sensor. This code is highly customizable and can be used efficiently for fast macro-analysis of data. The code is applicable to a plethora of distinct problems: chemical decomposition, pressure transient decomposition, unknown source/signal allocation, and EM signal decomposition. An additional procedure for allocation of the unknown sources is incorporated in the code.
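One way to illustrate the kind of non-negative de-mixing described (the released code itself is not reproduced here) is non-negative matrix factorization of synthetic sensor records; the source shapes, the mixing matrix, and the use of scikit-learn's NMF are assumptions of this sketch only.

    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(2)
    t = np.linspace(0, 1, 500)

    # Two non-negative "original" signals and a non-negative mixing matrix.
    sources = np.vstack([np.abs(np.sin(8 * np.pi * t)),
                         np.exp(-5 * t)])              # shape (2, 500)
    mixing = rng.uniform(0.2, 1.0, size=(6, 2))        # 6 sensors, 2 sources
    observed = mixing @ sources                        # what the sensor array records

    # De-mix: factorize the sensor matrix into mixing-like and source-like factors.
    model = NMF(n_components=2, init="nndsvda", max_iter=1000, random_state=0)
    w = model.fit_transform(observed)                  # estimated mixing (6 x 2)
    h = model.components_                              # estimated sources (2 x 500)
    print("reconstruction error:", round(model.reconstruction_err_, 4))

Estimating the unknown number of sources and the per-sensor delays, which the described code handles, would require additional machinery beyond this plain factorization.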
Computation of Reacting Flows in Combustion Processes
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr.; Chen, Kuo-Huey
1997-01-01
The main objective of this research was to develop an efficient three-dimensional computer code for chemically reacting flows. The main computer code developed is ALLSPD-3D. The ALLSPD-3D computer program is developed for the calculation of three-dimensional, chemically reacting flows with sprays. The ALLSPD code employs a coupled, strongly implicit solution procedure for turbulent spray combustion flows. A stochastic droplet model and an efficient method for treatment of the spray source terms in the gas-phase equations are used to calculate the evaporating liquid sprays. The chemistry treatment in the code is general enough that an arbitrary number of reactions and species can be defined by the user. Also, it is written in generalized curvilinear coordinates with both multi-block and flexible internal blockage capabilities to handle complex geometries. In addition, for general industrial combustion applications, the code provides both dilution and transpiration cooling capabilities. The ALLSPD algorithm, which employs preconditioning and eigenvalue rescaling techniques, is capable of providing efficient solutions for flows with a wide range of Mach numbers. Although written for three-dimensional flows in general, the code can be used for two-dimensional and axisymmetric flow computations as well. The code is written in such a way that it can be run on various computer platforms (supercomputers, workstations and parallel processors), and the GUI (Graphical User Interface) should provide a user-friendly tool for setting up and running the code.
Random sampling of elementary flux modes in large-scale metabolic networks.
Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel
2012-09-15
The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well distributed sample, that is representative of the complete set of EMs, should be suitable to most EM-based methods for analysis and optimization of metabolic networks. Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. dmachado@deb.uminho.pt Supplementary data are available at Bioinformatics online.
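The filtering idea described above, keeping only a uniform random subset of the newly generated candidate mode combinations at each iteration, can be sketched as follows; this is a schematic of the sampling step only, and the candidate generation, sample-size rule, and data structures of the published tool are not reproduced.

    import random

    def sample_new_modes(candidates, max_keep, rng=random):
        """Keep at most `max_keep` of the candidate combinations generated in one
        iteration, each candidate having the same probability of being selected."""
        if len(candidates) <= max_keep:
            return list(candidates)
        return rng.sample(list(candidates), max_keep)

    # Illustrative use inside one iteration of a canonical-basis style loop:
    new_combinations = [f"mode_{i}" for i in range(10_000)]   # placeholder candidates
    kept = sample_new_modes(new_combinations, max_keep=500)
    print(len(kept), "combinations carried to the next iteration")

Capping the set of retained combinations at every iteration is what prevents the exponential blow-up of the full elementary mode enumeration while keeping each surviving mode equally likely to appear in the final sample.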
de Castro Barbosa, Thais; Ingerslev, Lars R.; Alm, Petter S.; Versteyhe, Soetkin; Massart, Julie; Rasmussen, Morten; Donkin, Ida; Sjögren, Rasmus; Mudry, Jonathan M.; Vetterli, Laurène; Gupta, Shashank; Krook, Anna; Zierath, Juleen R.; Barrès, Romain
2015-01-01
Objectives Chronic and high consumption of fat constitutes an environmental stress that leads to metabolic diseases. We hypothesized that high-fat diet (HFD) transgenerationally remodels the epigenome of spermatozoa and metabolism of the offspring. Methods F0-male rats fed either HFD or chow diet for 12 weeks were mated with chow-fed dams to generate F1 and F2 offspring. Motile spermatozoa were isolated from F0 and F1 breeders to determine DNA methylation and small non-coding RNA (sncRNA) expression pattern by deep sequencing. Results Newborn offspring of HFD-fed fathers had reduced body weight and pancreatic beta-cell mass. Adult female, but not male, offspring of HFD-fed fathers were glucose intolerant and resistant to HFD-induced weight gain. This phenotype was perpetuated in the F2 progeny, indicating transgenerational epigenetic inheritance. The epigenome of spermatozoa from HFD-fed F0 and their F1 male offspring showed common DNA methylation and small non-coding RNA expression signatures. Altered expression of sperm miRNA let-7c was passed down to metabolic tissues of the offspring, inducing a transcriptomic shift of the let-7c predicted targets. Conclusion Our results provide insight into mechanisms by which HFD transgenerationally reprograms the epigenome of sperm cells, thereby affecting metabolic tissues of offspring throughout two generations. PMID:26977389
Increasing Flexibility in Energy Code Compliance: Performance Packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, Philip R.; Rosenberg, Michael I.
Energy codes and standards have provided significant increases in building efficiency over the last 38 years, since the first national energy code was published in late 1975. The most commonly used path in energy codes, the prescriptive path, appears to be reaching a point of diminishing returns. As the code matures, the prescriptive path becomes more complicated and also more restrictive. It is likely that an approach that considers the building as an integrated system will be necessary to achieve the next real gains in building efficiency. Performance code paths are increasing in popularity; however, there remains a significant design team overhead in following the performance path, especially for smaller buildings. This paper focuses on the development of one alternative format, prescriptive packages. A method to develop building-specific prescriptive packages is reviewed in which multiple runs of prototypical building models are used in a parametric decision analysis to determine a set of packages with equivalent energy performance. The approach is designed to be cost-effective and flexible for the design team while achieving a desired level of energy efficiency performance. A demonstration of the approach based on mid-sized office buildings with two HVAC system types is shown, along with a discussion of potential applicability in the energy code process.
Parallelization of Nullspace Algorithm for the computation of metabolic pathways
Jevremović, Dimitrije; Trinh, Cong T.; Srienc, Friedrich; Sosa, Carlos P.; Boley, Daniel
2011-01-01
Elementary mode analysis is a useful metabolic pathway analysis tool for understanding and analyzing cellular metabolism, since elementary modes can represent metabolic pathways with unique and minimal sets of enzyme-catalyzed reactions of a metabolic network under steady-state conditions. However, computation of the elementary modes of a genome-scale metabolic network with 100–1000 reactions is very expensive and sometimes not feasible with the commonly used serial Nullspace Algorithm. In this work, we develop a distributed-memory parallelization of the Nullspace Algorithm to handle efficiently the computation of the elementary modes of a large metabolic network. We give an implementation in the C++ language with the support of MPI library functions for the parallel communication. Our proposed algorithm is accompanied by an analysis of the complexity and identification of major bottlenecks during computation of all possible pathways of a large metabolic network. The algorithm includes methods to achieve load balancing among the compute nodes and specific communication patterns to reduce the communication overhead and improve efficiency. PMID:22058581
RNA- and protein-mediated control of Listeria monocytogenes virulence gene expression
Lebreton, Alice; Cossart, Pascale
2017-01-01
ABSTRACT The model opportunistic pathogen Listeria monocytogenes has been the object of extensive research, aiming at understanding its ability to colonize diverse environmental niches and animal hosts. Bacterial transcriptomes in various conditions reflect this efficient adaptability. We review here our current knowledge of the mechanisms allowing L. monocytogenes to respond to environmental changes and trigger pathogenicity, with a special focus on RNA-mediated control of gene expression. We highlight how these studies have brought novel concepts in prokaryotic gene regulation, such as the ‘excludon’ where the 5′-UTR of a messenger also acts as an antisense regulator of an operon transcribed in opposite orientation, or the notion that riboswitches can regulate non-coding RNAs to integrate complex metabolic stimuli into regulatory networks. Overall, the Listeria model exemplifies that fine RNA tuners act together with master regulatory proteins to orchestrate appropriate transcriptional programmes. PMID:27217337
OXSA: An open-source magnetic resonance spectroscopy analysis toolbox in MATLAB.
Purvis, Lucian A B; Clarke, William T; Biasiolli, Luca; Valkovič, Ladislav; Robson, Matthew D; Rodgers, Christopher T
2017-01-01
In vivo magnetic resonance spectroscopy provides insight into metabolism in the human body. New acquisition protocols are often proposed to improve the quality or efficiency of data collection. Processing pipelines must also be developed to use these data optimally. Current fitting software is either targeted at general spectroscopy fitting, or for specific protocols. We therefore introduce the MATLAB-based OXford Spectroscopy Analysis (OXSA) toolbox to allow researchers to rapidly develop their own customised processing pipelines. The toolbox aims to simplify development by: being easy to install and use; seamlessly importing Siemens Digital Imaging and Communications in Medicine (DICOM) standard data; allowing visualisation of spectroscopy data; offering a robust fitting routine; flexibly specifying prior knowledge when fitting; and allowing batch processing of spectra. This article demonstrates how each of these criteria have been fulfilled, and gives technical details about the implementation in MATLAB. The code is freely available to download from https://github.com/oxsatoolbox/oxsa.
Efficient Signal, Code, and Receiver Designs for MIMO Communication Systems
2003-06-01
[Figure-list and snippet excerpt] Concatenation of a tilted-QAM inner code with an LDPC outer code using a two-component iterative soft-decision decoder. The report notes that coding for AWGN channels has long been studied and that well-known soft-decision codes such as turbo codes and low-density parity-check (LDPC) codes can approach capacity; in the proposed scheme, the coded bits are randomly interleaved so that nearby bits pass through different sub-channels.
Li, Fuyong
2017-01-01
ABSTRACT Exploring compositional and functional characteristics of the rumen microbiome can improve the understanding of its role in rumen function and cattle feed efficiency. In this study, we applied metatranscriptomics to characterize the active rumen microbiomes of beef cattle with different feed efficiencies (efficient, n = 10; inefficient, n = 10) using total RNA sequencing. Active bacterial and archaeal compositions were estimated based on 16S rRNAs, and active microbial metabolic functions including carbohydrate-active enzymes (CAZymes) were assessed based on mRNAs from the same metatranscriptomic data sets. In total, six bacterial phyla (Proteobacteria, Firmicutes, Bacteroidetes, Spirochaetes, Cyanobacteria, and Synergistetes), eight bacterial families (Succinivibrionaceae, Prevotellaceae, Ruminococcaceae, Lachnospiraceae, Veillonellaceae, Spirochaetaceae, Dethiosulfovibrionaceae, and Mogibacteriaceae), four archaeal clades (Methanomassiliicoccales, Methanobrevibacter ruminantium, Methanobrevibacter gottschalkii, and Methanosphaera), 112 metabolic pathways, and 126 CAZymes were identified as core components of the active rumen microbiome. As determined by comparative analysis, three bacterial families (Lachnospiraceae, Lactobacillaceae, and Veillonellaceae) tended to be more abundant in low-feed-efficiency (inefficient) animals (P < 0.10), and one archaeal taxon (Methanomassiliicoccales) tended to be more abundant in high-feed-efficiency (efficient) cattle (P < 0.10). Meanwhile, 32 microbial metabolic pathways and 12 CAZymes were differentially abundant (linear discriminant analysis score of >2 with a P value of <0.05) between two groups. Among them, 30 metabolic pathways and 11 CAZymes were more abundant in the rumen of inefficient cattle, while 2 metabolic pathways and 1 CAZyme were more abundant in efficient animals. These findings suggest that the rumen microbiomes of inefficient cattle have more diverse activities than those of efficient cattle, which may be related to the host feed efficiency variation. IMPORTANCE This study applied total RNA-based metatranscriptomics and showed the linkage between the active rumen microbiome and feed efficiency (residual feed intake) in beef cattle. The data generated from the current study provide fundamental information on active rumen microbiome at both compositional and functional levels, which serve as a foundation to study rumen function and its role in cattle feed efficiency. The findings that the active rumen microbiome may contribute to variations in feed efficiency of beef cattle highlight the possibility of enhancing nutrient utilization and improve cattle feed efficiency through modification of rumen microbial functions. PMID:28235871
Multi-stage decoding for multi-level block modulation codes
NASA Technical Reports Server (NTRS)
Lin, Shu
1991-01-01
In this paper, we investigate various types of multi-stage decoding for multi-level block modulation codes, in which the decoding of a component code at each stage can be either soft-decision or hard-decision, maximum likelihood or bounded-distance. Error performance of codes is analyzed for a memoryless additive channel based on various types of multi-stage decoding, and upper bounds on the probability of an incorrect decoding are derived. Based on our study and computation results, we find that, if component codes of a multi-level modulation code and types of decoding at various stages are chosen properly, high spectral efficiency and large coding gain can be achieved with reduced decoding complexity. In particular, we find that the difference in performance between the suboptimum multi-stage soft-decision maximum likelihood decoding of a modulation code and the single-stage optimum decoding of the overall code is very small: only a fraction of a dB loss in SNR at a probability of an incorrect decoding for a block of 10^-6. Multi-stage decoding of multi-level modulation codes really offers a way to achieve the best of three worlds: bandwidth efficiency, coding gain, and decoding complexity.
Sh ble and Cre adapted for functional genomics and metabolic engineering of Pichia stipitis
Jose M. Laplaza; Beatriz Rivas Torres; Yong-Su Jin; Thomas W. Jeffries
2006-01-01
Pichia stipitis is widely studied for its capacity to ferment d-xylose to ethanol. Strain improvement has been facilitated by the recent completion of the P. stipitis genome. P. stipitis uses CUG to code for serine rather than leucine, as is the case for the universal genetic code, thereby limiting the availability of heterologous drug resistance markers for transformation...
Structured Low-Density Parity-Check Codes with Bandwidth Efficient Modulation
NASA Technical Reports Server (NTRS)
Cheng, Michael K.; Divsalar, Dariush; Duy, Stephanie
2009-01-01
In this work, we study the performance of structured Low-Density Parity-Check (LDPC) Codes together with bandwidth efficient modulations. We consider protograph-based LDPC codes that facilitate high-speed hardware implementations and have minimum distances that grow linearly with block sizes. We cover various higher-order modulations such as 8-PSK, 16-APSK, and 16-QAM. During demodulation, a demapper transforms the received in-phase and quadrature samples into reliability information that feeds the binary LDPC decoder. We will compare various low-complexity demappers and provide simulation results for assorted coded-modulation combinations on the additive white Gaussian noise and independent Rayleigh fading channels.
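A minimal sketch of the demapper step described above, under assumptions not taken from the paper (a Gray-labeled 8-PSK constellation and the max-log approximation; the paper compares several demapper variants): each received in-phase/quadrature sample is converted into three bit reliabilities (LLRs) that would then feed a binary LDPC decoder.

```python
import numpy as np

# Gray-labeled 8-PSK reference constellation (an assumed labeling).
LABELS = [0, 1, 3, 2, 6, 7, 5, 4]                    # Gray sequence around the circle
POINTS = np.exp(2j * np.pi * np.arange(8) / 8)
BITS = np.array([[(lab >> k) & 1 for k in (2, 1, 0)] for lab in LABELS])

def maxlog_llrs(r, noise_var):
    """Max-log bit LLRs for one complex sample r on an AWGN channel."""
    d2 = np.abs(r - POINTS) ** 2 / noise_var          # per-symbol squared distances
    llrs = []
    for k in range(3):                                # 3 bits per 8-PSK symbol
        d0 = d2[BITS[:, k] == 0].min()                # best symbol with bit k = 0
        d1 = d2[BITS[:, k] == 1].min()                # best symbol with bit k = 1
        llrs.append(d1 - d0)                          # positive value favors bit 0
    return llrs

# Example: transmit the symbol labeled 0b101 over AWGN and demap it.
rng = np.random.default_rng(1)
tx = POINTS[LABELS.index(0b101)]
rx = tx + 0.2 * (rng.standard_normal() + 1j * rng.standard_normal())
print(maxlog_llrs(rx, noise_var=0.08))                # reliabilities handed to the LDPC decoder
```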
NASA Technical Reports Server (NTRS)
Bidwell, Colin S.; Pinella, David; Garrison, Peter
1999-01-01
Collection efficiency and ice accretion calculations were made for a commercial transport using the NASA Lewis LEWICE3D ice accretion code, the ICEGRID3D grid code and the CMARC panel code. All of the calculations were made on a Windows 95 based personal computer. The ice accretion calculations were made for the nose, wing, horizontal tail and vertical tail surfaces. Ice shapes typifying those of a 30 minute hold were generated. Collection efficiencies were also generated for the entire aircraft using the newly developed unstructured collection efficiency method. The calculations highlight the flexibility and cost effectiveness of the LEWICE3D, ICEGRID3D, CMARC combination.
Cross-indexing of binary SIFT codes for large-scale image search.
Liu, Zhen; Li, Houqiang; Zhang, Liyan; Zhou, Wengang; Tian, Qi
2014-05-01
In recent years, there has been growing interest in mapping visual features into compact binary codes for applications on large-scale image collections. Encoding high-dimensional data as compact binary codes reduces the memory cost for storage. Besides, it benefits the computational efficiency since the similarity can be efficiently measured by Hamming distance. In this paper, we propose a novel flexible scale invariant feature transform (SIFT) binarization (FSB) algorithm for large-scale image search. The FSB algorithm explores the magnitude patterns of the SIFT descriptor. It is unsupervised and the generated binary codes are demonstrated to be distance-preserving. Besides, we propose a new searching strategy to find target features based on cross-indexing in the binary SIFT space and the original SIFT space. We evaluate our approach on two publicly released data sets. The experiments on a large-scale partial-duplicate image retrieval system demonstrate the effectiveness and efficiency of the proposed algorithm.
Efficient convolutional sparse coding
Wohlberg, Brendt
2017-06-20
Computationally efficient algorithms may be applied for fast dictionary learning solving the convolutional sparse coding problem in the Fourier domain. More specifically, efficient convolutional sparse coding may be derived within an alternating direction method of multipliers (ADMM) framework that utilizes fast Fourier transforms (FFT) to solve the main linear system in the frequency domain. Such algorithms may enable a significant reduction in computational cost over conventional approaches by implementing a linear solver for the most critical and computationally expensive component of the conventional iterative algorithm. The theoretical computational cost of the algorithm may be reduced from O(M^3 N) to O(MN log N), where N is the dimensionality of the data and M is the number of elements in the dictionary. This significant improvement in efficiency may greatly increase the range of problems that can practically be addressed via convolutional sparse representations.
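A sketch of the frequency-domain linear solve that gives the O(MN log N) cost, under simplifying assumptions (single-channel data, one ADMM subproblem only; the full iteration also updates the sparse coefficients and dual variables): after an FFT, the system (D^H D + rho I) x = b decouples per frequency into a rank-one-plus-identity system that the Sherman-Morrison formula solves in O(M) per bin.

```python
import numpy as np

def fft_admm_solve(Dhat, bhat, rho):
    """Solve (D_f^H D_f + rho I) x_f = b_f independently at each frequency bin f
    via Sherman-Morrison. Dhat, bhat: shape (N, M), per-bin dictionary spectra
    and right-hand side."""
    a = np.conj(Dhat)                                   # a = D_f^H as a column, per bin
    aHb = np.sum(Dhat * bhat, axis=1, keepdims=True)    # a^H b = sum_m conj(a_m) b_m
    aHa = np.sum(np.abs(Dhat) ** 2, axis=1, keepdims=True)
    return (bhat - a * aHb / (rho + aHa)) / rho

# Toy check against a dense per-frequency solve.
rng = np.random.default_rng(0)
N, M, rho = 16, 4, 0.5
d = rng.standard_normal((M, 8))                         # M short filters
Dhat = np.fft.fft(d, n=N, axis=1).T                     # (N, M), zero-padded to length N
bhat = rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M))
x = fft_admm_solve(Dhat, bhat, rho)
f = 3                                                   # verify one frequency bin
A = np.outer(np.conj(Dhat[f]), Dhat[f]) + rho * np.eye(M)
print(np.allclose(A @ x[f], bhat[f]))                   # True
```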
Darbani, Behrooz; Noeparvar, Shahin; Borg, Søren
2016-01-01
RNA circularization made by head-to-tail back-splicing events is involved in the regulation of gene expression from transcriptional to post-translational levels. By exploiting RNA-Seq data and downstream analysis, we shed light on the importance of circular RNAs in plants. The results introduce circular RNAs as novel interactors in the regulation of gene expression in plants and imply the comprehensiveness of this regulatory pathway by identifying circular RNAs for a diverse set of genes. These genes are involved in several aspects of cellular metabolism such as hormonal signaling, intracellular protein sorting, carbohydrate metabolism and cell-wall biogenesis, respiration, amino acid biosynthesis, transcription and translation, and protein ubiquitination. Additionally, these parental loci of circular RNAs, from both nuclear and mitochondrial genomes, encode different transcript classes including protein coding transcripts, microRNA, rRNA, and long non-coding/microprotein coding RNAs. The results shed light on the mitochondrial exonic circular RNAs and imply the importance of circular RNAs for the regulation of mitochondrial genes. Importantly, we introduce circular RNAs in barley and elucidate their cellular-level alterations across tissues and in response to the micronutrients iron and zinc. In further support of circular RNAs' functional roles in plants, we report several cases where fluctuations of circRNAs do not correlate with the levels of their parental-loci encoded linear transcripts. PMID:27375638
Dendritic Properties Control Energy Efficiency of Action Potentials in Cortical Pyramidal Cells
Yi, Guosheng; Wang, Jiang; Wei, Xile; Deng, Bin
2017-01-01
Neural computation is performed by transforming input signals into sequences of action potentials (APs), which is metabolically expensive and limited by the energy available to the brain. The metabolic efficiency of a single AP has important consequences for the computational power of the cell, which is determined by its biophysical properties and morphologies. Here we adopt biophysically-based two-compartment models to investigate how dendrites affect energy efficiency of APs in cortical pyramidal neurons. We measure the Na+ entry during the spike and examine how it is efficiently used for generating AP depolarization. We show that increasing the proportion of dendritic area or coupling conductance between two chambers decreases Na+ entry efficiency of the somatic AP. Activating inward Ca2+ current in dendrites results in a dendritic spike, which increases AP efficiency. Activating Ca2+-activated outward K+ current in dendrites, however, decreases Na+ entry efficiency. We demonstrate that the active and passive dendrites take effect by altering the overlap between Na+ influx and internal current flowing from soma to dendrite. We explain a fundamental link between dendritic properties and AP efficiency, which is essential to interpret how neural computation consumes metabolic energy and how biophysics and morphologies contribute to such consumption. PMID:28919852
Acidosis overrides oxygen deprivation to maintain mitochondrial function and cell survival
Khacho, Mireille; Tarabay, Michelle; Patten, David; Khacho, Pamela; MacLaurin, Jason G.; Guadagno, Jennifer; Bergeron, Richard; Cregan, Sean P.; Harper, Mary-Ellen; Park, David S.; Slack, Ruth S.
2014-01-01
Sustained cellular function and viability of high-energy demanding post-mitotic cells rely on the continuous supply of ATP. The utilization of mitochondrial oxidative phosphorylation for efficient ATP generation is a function of oxygen levels. As such, oxygen deprivation, in physiological or pathological settings, has profound effects on cell metabolism and survival. Here we show that mild extracellular acidosis, a physiological consequence of anaerobic metabolism, can reprogramme the mitochondrial metabolic pathway to preserve efficient ATP production regardless of oxygen levels. Acidosis initiates a rapid and reversible homeostatic programme that restructures mitochondria, by regulating mitochondrial dynamics and cristae architecture, to reconfigure mitochondrial efficiency, maintain mitochondrial function and cell survival. Preventing mitochondrial remodelling results in mitochondrial dysfunction, fragmentation and cell death. Our findings challenge the notion that oxygen availability is a key limiting factor in oxidative metabolism and bring forth the concept that mitochondrial morphology can dictate the bioenergetic status of post-mitotic cells. PMID:24686499
HEVC for high dynamic range services
NASA Astrophysics Data System (ADS)
Kim, Seung-Hwan; Zhao, Jie; Misra, Kiran; Segall, Andrew
2015-09-01
Displays capable of showing a greater range of luminance values can render content containing high dynamic range information in a way such that the viewers have a more immersive experience. This paper introduces the design aspects of a high dynamic range (HDR) system, and examines the performance of the HDR processing chain in terms of compression efficiency. Specifically, it examines the relation between the recently introduced Society of Motion Picture and Television Engineers (SMPTE) ST 2084 transfer function and the High Efficiency Video Coding (HEVC) standard. SMPTE ST 2084 is designed to cover the full range of an HDR signal from 0 to 10,000 nits; however, in many situations the valid signal range of actual video might be smaller than the range supported by SMPTE ST 2084. This restricted signal range results in a restricted range of code values for input video data and adversely impacts compression efficiency. In this paper, we propose a code value remapping method that extends the restricted range code values into the full range code values so that existing standards such as HEVC may better compress the video content. The paper also identifies related non-normative encoder-only changes that are required for the remapping method to allow a fair comparison with the anchor. Results are presented comparing the efficiency of the current approach versus the proposed remapping method for HM-16.2.
NASA Technical Reports Server (NTRS)
Lin, Shu; Fossorier, Marc
1998-01-01
The Viterbi algorithm is indeed a very simple and efficient method of implementing maximum likelihood decoding. However, if we take advantage of the structural properties in a trellis section, other efficient trellis-based decoding algorithms can be devised. Recently, an efficient trellis-based recursive maximum likelihood decoding (RMLD) algorithm for linear block codes has been proposed. This algorithm is more efficient than the conventional Viterbi algorithm in both computation and hardware requirements. Most importantly, the implementation of this algorithm does not require the construction of the entire code trellis; only some special one-section trellises of relatively small state and branch complexities are needed for constructing path (or branch) metric tables recursively. At the end, there is only one table which contains only the most likely codeword and its metric for a given received sequence r = (r_1, r_2, ..., r_n). This algorithm basically uses a divide-and-conquer strategy. Furthermore, it allows parallel/pipeline processing of received sequences to speed up decoding.
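For orientation, the sketch below shows the quantity that RMLD ultimately produces, computed here the brute-force way for a tiny assumed example (the (7,4) Hamming code in one common systematic form): soft-decision ML decoding selects the codeword whose antipodal image correlates best with the received sequence r = (r_1, ..., r_n). RMLD obtains the same decision by recursively combining metric tables for code sections rather than enumerating all codewords, which is where its computational savings come from.

```python
import numpy as np
from itertools import product

# Generator matrix of the (7,4) Hamming code (one common systematic form).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 0, 1, 1],
              [0, 0, 1, 0, 1, 1, 1],
              [0, 0, 0, 1, 1, 0, 1]])

CODEWORDS = [np.mod(np.array(m) @ G, 2) for m in product((0, 1), repeat=4)]

def ml_decode(r):
    """Soft-decision ML decoding over BPSK/AWGN: maximize the correlation of r
    with the antipodal image (1 - 2c) of each codeword c."""
    metrics = [np.dot(r, 1 - 2 * c) for c in CODEWORDS]
    return CODEWORDS[int(np.argmax(metrics))]

rng = np.random.default_rng(2)
c = CODEWORDS[5]
r = (1 - 2 * c) + 0.7 * rng.standard_normal(7)       # received sequence
print(c, ml_decode(r))
```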
NASA Technical Reports Server (NTRS)
Radhakrishnan, K.
1984-01-01
The efficiency and accuracy of several algorithms recently developed for the efficient numerical integration of stiff ordinary differential equations are compared. The methods examined include two general-purpose codes, EPISODE and LSODE, and three codes (CHEMEQ, CREK1D, and GCKP84) developed specifically to integrate chemical kinetic rate equations. The codes are applied to two test problems drawn from combustion kinetics. The comparisons show that LSODE is the fastest code currently available for the integration of combustion kinetic rate equations. An important finding is that an iterative solution of the algebraic energy conservation equation to compute the temperature does not result in significant errors. In addition, this method is more efficient than evaluating the temperature by integrating its time derivative. Significant reductions in computational work are realized by updating the rate constants (k = aT^N exp(-E/RT)) only when the temperature change exceeds an amount delta T that is problem dependent. An approximate expression for the automatic evaluation of delta T is derived and is shown to result in increased efficiency.
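A small sketch of the rate-constant update strategy described above, with hypothetical reaction parameters: rate constants of the modified Arrhenius form k = aT^N exp(-E/(RT)) are recomputed only when the temperature has drifted by more than a tolerance delta T since the last evaluation.

```python
import numpy as np

R = 8.314  # J/(mol*K)

def arrhenius(a, n, e, temp):
    """Modified Arrhenius rate constant k = a * T^n * exp(-E / (R*T))."""
    return a * temp**n * np.exp(-e / (R * temp))

class LazyRates:
    """Re-evaluate the rate constants only when |T - T_last| > dT."""
    def __init__(self, params, dT):
        self.params, self.dT = params, dT
        self.t_last, self.k = None, None

    def rates(self, temp):
        if self.t_last is None or abs(temp - self.t_last) > self.dT:
            self.k = np.array([arrhenius(a, n, e, temp) for a, n, e in self.params])
            self.t_last = temp
        return self.k

# Hypothetical reactions: (pre-exponential a, temperature exponent n, activation energy E).
params = [(1.0e9, 0.0, 1.2e5), (5.0e6, 1.5, 8.0e4)]
lazy = LazyRates(params, dT=10.0)                     # problem-dependent tolerance
for T in (1000.0, 1004.0, 1015.0):                    # only 1000 K and 1015 K trigger updates
    print(T, lazy.rates(T))
```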
A soft X-ray source based on a low divergence, high repetition rate ultraviolet laser
NASA Astrophysics Data System (ADS)
Crawford, E. A.; Hoffman, A. L.; Milroy, R. D.; Quimby, D. C.; Albrecht, G. F.
The CORK code is utilized to evaluate the applicability of low divergence ultraviolet lasers for efficient production of soft X-rays. The use of the axial hydrodynamic code with a one-zone radial expansion to estimate radial motion and laser energy is examined. The calculation of ionization levels of the plasma and radiation rates by employing the atomic physics and radiation model included in the CORK code is described. Computations using the hydrodynamic code to determine the effect of laser intensity, spot size, and wavelength on plasma electron temperature are provided. The X-ray conversion efficiencies of the lasers are analyzed. It is observed that for a 1 GW laser power the X-ray conversion efficiency is a function of spot size, only weakly dependent on pulse length for time scales exceeding 100 psec, and better conversion efficiencies are obtained at shorter wavelengths. It is concluded that these small lasers focused to 30 micron spot sizes and 10^14 W/sq cm intensities are useful sources of 1-2 keV radiation.
Chin, Young-Wook; Seo, Nari; Kim, Jae-Han; Seo, Jin-Ho
2016-11-01
2'-Fucosyllactose (2-FL) is one of the key oligosaccharides in human milk. In the present study, the salvage guanosine 5'-diphosphate (GDP)-l-fucose biosynthetic pathway from fucose was employed in engineered Escherichia coli BL21star(DE3) for efficient production of 2-FL. Introduction of the fkp gene coding for fucokinase/GDP-l-fucose pyrophosphorylase (Fkp) from Bacteroides fragilis and the fucT2 gene encoding α-1,2-fucosyltransferase from Helicobacter pylori allows the engineered E. coli to produce 2-FL from fucose, lactose and glycerol. To enhance the lactose flux to 2-FL production, attenuated and deletion mutants of β-galactosidase were employed. Moreover, the 2-FL yield and productivity were further improved by deletion of the fucI-fucK gene cluster coding for fucose isomerase (FucI) and fuculose kinase (FucK). Finally, fed-batch fermentation of engineered E. coli BL21star(DE3) deleting lacZ and fucI-fucK, and expressing fkp and fucT2, resulted in an extracellular 2-FL concentration of 23.1 g/L and a productivity of 0.39 g/L/h. Biotechnol. Bioeng. 2016;113: 2443-2452. © 2016 Wiley Periodicals, Inc.
Bonatto, Ana C; Couto, Gustavo H; Souza, Emanuel M; Araújo, Luiza M; Pedrosa, Fabio O; Noindorf, Lilian; Benelli, Elaine M
2007-10-01
GlnD is a bifunctional uridylyltransferase/uridylyl-removing enzyme that has a central role in the general nitrogen regulatory system NTR. In enterobacteria, GlnD uridylylates the PII proteins GlnB and GlnK under low levels of fixed nitrogen or ammonium. Under high ammonium levels, GlnD removes UMP from these proteins (deuridylylation). The PII proteins are signal transduction elements that integrate the signals of nitrogen, carbon and energy, and transduce this information to proteins involved in nitrogen metabolism. In Herbaspirillum seropedicae, an endophytic diazotroph isolated from grasses, several genes coding for proteins involved in nitrogen metabolism have been identified and cloned, including glnB, glnK and glnD. In this work, the GlnB, GlnK and GlnD proteins of H. seropedicae were overexpressed in their native forms, purified and used to reconstitute the uridylylation system in vitro. The results show that H. seropedicae GlnD uridylylates GlnB and GlnK trimers producing the forms PII(UMP)1, PII(UMP)2 and PII(UMP)3, in a reaction that requires 2-oxoglutarate and ATP, and is inhibited by glutamine. The quantification of these PII forms indicates that GlnB was more efficiently uridylylated than GlnK in the system used.
Montanholi, Yuri Regis; Haas, Livia Sadocco; Swanson, Kendall Carl; Coomber, Brenda Lynn; Yamashiro, Shigeto; Miller, Stephen Paul
2017-04-26
Feed costs are a major expense in the production of beef cattle. Individual variation in the efficiency of feed utilization may be evident through feed efficiency-related phenotypes such as those related to major energetic sinks. Our objectives were to assess the relationships between feed efficiency and liver morphometry and the metabolic blood profile in feedlot beef cattle. Two populations (A = 112 and B = 45) of steers were tested for feed efficiency. Blood from the 12 most feed efficient (efficient) and 12 least feed efficient (inefficient) steers from population A was sampled hourly over the circadian period. Blood plasma samples were submitted for analysis of albumin, aspartate aminotransferase, γ-glutamyl transpeptidase, urea, cholesterol, creatinine, alkaline phosphatase, creatine kinase, lipase, carbon dioxide, β-hydroxybutyrate, acetate and bile acids. Liver tissue was also harvested for microscopy at slaughter from the 24 blood-sampled steers of population A and from the 10 steers in each divergent feed efficiency tail of population B. Photomicroscopy images were taken using the portal triad and central vein as landmarks. Histological quantifications included cross-sectional hepatocyte perimeter and area, hepatocyte nuclear area, and nuclei area as a proportion of the hepatocyte area. Least squares means comparisons between efficient and inefficient steers for productive performance and liver morphometry, and for blood analyte data, were analyzed using the general linear model and mixed model procedures of SAS, respectively. No differences were observed for liver weight; however, efficient steers had larger hepatocyte (e.g. hepatocyte area at the portal triad, 323.31 vs. 286.37 µm²) and nuclei dimensions at the portal triad and central vein regions, compared with inefficient steers. The metabolic profile indicated efficient steers had lower albumin (36.18 vs. 37.65 g/l) and cholesterol (2.62 vs. 3.05 mmol/l) and higher creatinine (118.59 vs. 110.50 µmol/l) and carbon dioxide (24.36 vs. 23.65 mmol/l) than inefficient steers. Improved feed efficiency is associated with increased metabolism by the liver (enlarged hepatocytes and no difference in organ size), muscle (higher creatinine) and whole body (higher carbon dioxide); additionally, efficient steers had reduced bloodstream pools of albumin and cholesterol. These metabolic discrepancies between feed efficient and inefficient cattle may be determinants of productive performance.
Role of microRNAs involved in plant response to nitrogen and phosphorous limiting conditions
Nguyen, Giao N.; Rothstein, Steven J.; Spangenberg, German; Kant, Surya
2015-01-01
Plant microRNAs (miRNAs) are a class of small non-coding RNAs which target and regulate the expression of genes involved in several growth, development, and metabolism processes. Recent research has shown the involvement of miRNAs in the regulation of uptake and utilization of nitrogen (N) and phosphorus (P) and, more importantly, in plant adaptation to N and P limitation conditions through modifications in plant growth, phenology, and architecture and production of secondary metabolites. Developing strategies that allow for higher efficiency in the use of both N and P fertilizers in crop production is important for economic and environmental benefits. Improved crop varieties with better adaptation to N and P limiting conditions could be a key approach to achieve this effectively. Furthermore, understanding of the interactions between N and P uptake and use and their regulation is important for the maintenance of nutrient homeostasis in plants. This review describes the possible functions of different miRNAs and their cross-talk relevant to the plant adaptive responses to N and P limiting conditions. In addition, a comprehensive understanding of these processes at the molecular level and the importance of biological adaptation for improved N and P use efficiency are discussed. PMID:26322069
A novel type of pathogen defense-related cinnamyl alcohol dehydrogenase.
Logemann, E; Reinold, S; Somssich, I E; Hahlbrock, K
1997-08-01
We describe an aromatic alcohol dehydrogenase with properties indicating a novel type of function in the defense response of plants to pathogens. To obtain the enzyme free of contamination with possible isoforms, a parsley (Petroselinum crispum) cDNA comprising the entire coding region of the elicitor-responsive gene, ELI3, was expressed in Escherichia coli. In accord with large amino acid sequence similarities with established cinnamyl and benzyl alcohol dehydrogenases from other plants, the enzyme efficiently reduced various cinnamyl and benzyl aldehydes using NADPH as a co-substrate. Highest substrate affinities were observed for cinnamaldehyde, 4-coumaraldehyde and coniferaldehyde, whereas sinapaldehyde, one of the most efficient substrates of several previously analyzed cinnamyl alcohol dehydrogenases and a characteristic precursor molecule of angiosperm lignin, was not converted. A single form of ELI3 mRNA was strongly and rapidly induced in fungal elicitor-treated parsley cells. These results, together with earlier findings that the ELI3 gene is strongly activated both in elicitor-treated parsley cells and at fungal infection sites in parsley leaves, but not in lignifying tissue, suggest a specific role of this enzyme in pathogen defense-related phenylpropanoid metabolism.
Bagci, Enise; Heijlen, Marjolein; Vergauwen, Lucia; Hagenaars, An; Houbrechts, Anne M; Esguerra, Camila V; Blust, Ronny; Darras, Veerle M; Knapen, Dries
2015-01-01
Thyroid hormone (TH) balance is essential for vertebrate development. Deiodinase type 1 (D1) and type 2 (D2) increase and deiodinase type 3 (D3) decreases local intracellular levels of T3, the most important active TH. The role of deiodinase-mediated TH effects in early vertebrate development is only partially understood. Therefore, we investigated the role of deiodinases during early development of zebrafish until 96 hours post fertilization at the level of the transcriptome (microarray), biochemistry, morphology and physiology using morpholino (MO) knockdown. Knockdown of D1+D2 (D1D2MO) and knockdown of D3 (D3MO) both resulted in transcriptional regulation of energy metabolism and (muscle) development in abdomen and tail, together with reduced growth, impaired swim bladder inflation, reduced protein content and reduced motility. The reduced growth and impaired swim bladder inflation in D1D2MO could be due to lower levels of T3 which is known to drive growth and development. The pronounced upregulation of a large number of transcripts coding for key proteins in ATP-producing pathways in D1D2MO could reflect a compensatory response to a decreased metabolic rate, also typically linked to hypothyroidism. Compared to D1D2MO, the effects were more pronounced or more frequent in D3MO, in which hyperthyroidism is expected. More specifically, increased heart rate, delayed hatching and increased carbohydrate content were observed only in D3MO. An increase of the metabolic rate, a decrease of the metabolic efficiency and a stimulation of gluconeogenesis using amino acids as substrates may have been involved in the observed reduced protein content, growth and motility in D3MO larvae. Furthermore, expression of transcripts involved in purine metabolism coupled to vision was decreased in both knockdown conditions, suggesting that both may impair vision. This study provides new insights, not only into the role of deiodinases, but also into the importance of a correct TH balance during vertebrate embryonic development.
National Cost-effectiveness of ASHRAE Standard 90.1-2010 Compared to ASHRAE Standard 90.1-2007
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thornton, Brian; Halverson, Mark A.; Myer, Michael
Pacific Northwest National Laboratory (PNNL) completed this project for the U.S. Department of Energy’s (DOE’s) Building Energy Codes Program (BECP). DOE’s BECP supports upgrading building energy codes and standards, and the states’ adoption, implementation, and enforcement of upgraded codes and standards. Building energy codes and standards set minimum requirements for energy-efficient design and construction for new and renovated buildings, and impact energy use and greenhouse gas emissions for the life of buildings. Continuous improvement of building energy efficiency is achieved by periodically upgrading energy codes and standards. Ensuring that changes in the code that may alter costs (for building components, initial purchase and installation, replacement, maintenance and energy) are cost-effective encourages their acceptance and implementation. ANSI/ASHRAE/IESNA Standard 90.1 is the energy standard for commercial and multi-family residential buildings over three floors.
Cost-effectiveness of ASHRAE Standard 90.1-2010 Compared to ASHRAE Standard 90.1-2007
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thornton, Brian A.; Halverson, Mark A.; Myer, Michael
Pacific Northwest National Laboratory (PNNL) completed this project for the U.S. Department of Energy’s (DOE’s) Building Energy Codes Program (BECP). DOE’s BECP supports upgrading building energy codes and standards, and the states’ adoption, implementation, and enforcement of upgraded codes and standards. Building energy codes and standards set minimum requirements for energy-efficient design and construction for new and renovated buildings, and impact energy use and greenhouse gas emissions for the life of buildings. Continuous improvement of building energy efficiency is achieved by periodically upgrading energy codes and standards. Ensuring that changes in the code that may alter costs (for building components, initial purchase and installation, replacement, maintenance and energy) are cost-effective encourages their acceptance and implementation. ANSI/ASHRAE/IESNA Standard 90.1 is the energy standard for commercial and multi-family residential buildings over three floors.
Turbine Internal and Film Cooling Modeling For 3D Navier-Stokes Codes
NASA Technical Reports Server (NTRS)
DeWitt, Kenneth; Garg Vijay; Ameri, Ali
2005-01-01
The aim of this research project is to make use of NASA Glenn on-site computational facilities in order to develop, validate and apply aerodynamic, heat transfer, and turbine cooling models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes such as the Glenn-HT code. Specific areas of effort include: Application of the Glenn-HT code to specific configurations made available under the Turbine Based Combined Cycle (TBCC) and Ultra Efficient Engine Technology (UEET) projects. Validating the use of a multi-block code for the time accurate computation of the detailed flow and heat transfer of cooled turbine airfoils. The goal of the current research is to improve the predictive ability of the Glenn-HT code. This will enable one to design more efficient turbine components for both aviation and power generation. The models will be tested against specific configurations provided by NASA Glenn.
Code TESLA for Modeling and Design of High-Power High-Efficiency Klystrons
2011-03-01
The TESLA code is used for the modeling and design of high-power, high-efficiency klystrons, including multiple-beam klystrons as high-power RF sources. These sources are widely used, or proposed to be used, in future accelerators. Comparisons of TESLA modelling results with experimental data for a few multiple-beam klystrons are shown.
Audiovisual focus of attention and its application to Ultra High Definition video compression
NASA Astrophysics Data System (ADS)
Rerabek, Martin; Nemoto, Hiromi; Lee, Jong-Seok; Ebrahimi, Touradj
2014-02-01
Using Focus of Attention (FoA) as a perceptual process in image and video compression is a well-known approach to increase coding efficiency. It has been shown that foveated coding, in which compression quality varies across the image according to the region of interest, is more efficient than the alternative coding, in which all regions are compressed in a similar way. However, widespread use of such foveated compression has been prevented by two main conflicting causes, namely, the complexity and the efficiency of algorithms for FoA detection. One way to address both is to use as much information as possible from the scene. Since most video sequences have associated audio, and moreover, in many cases there is a correlation between the audio and the visual content, audiovisual FoA can improve the efficiency of the detection algorithm while remaining of low complexity. This paper discusses a simple yet efficient audiovisual FoA algorithm based on correlation of dynamics between audio and video signal components. Results of the audiovisual FoA detection algorithm are subsequently taken into account for foveated coding and compression. This approach is implemented in an H.265/HEVC encoder producing a bitstream which is fully compliant with any H.265/HEVC decoder. The influence of audiovisual FoA on the perceived quality of high and ultra-high definition audiovisual sequences is explored and the amount of gain in compression efficiency is analyzed.
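A toy sketch of the detection principle on synthetic data, with deliberately simplified features (per-block motion energy from frame differences and the magnitude of audio-envelope change; the paper's actual features, windowing, and fusion may differ): the spatial block whose motion dynamics correlate best with the audio activity is taken as the audiovisual focus of attention.

```python
import numpy as np

rng = np.random.default_rng(3)
T, H, W, B = 120, 64, 64, 4                      # frames, height, width, blocks per side
frames = rng.random((T, H, W))

# Synthetic audio envelope; one spatial block brightens in sync with it.
audio_env = np.abs(np.sin(np.linspace(0, 8 * np.pi, T)))
frames[:, 16:32, 32:48] += 2.0 * audio_env[:, None, None]

# Per-block motion energy: mean absolute frame difference inside each block.
diff = np.abs(np.diff(frames, axis=0))                                  # (T-1, H, W)
motion = diff.reshape(T - 1, B, H // B, B, W // B).mean(axis=(2, 4))    # (T-1, B, B)

# Correlate each block's motion dynamics with the audio activity.
audio_act = np.abs(np.diff(audio_env))
corr = np.array([[np.corrcoef(motion[:, i, j], audio_act)[0, 1] for j in range(B)]
                 for i in range(B)])
foa_block = np.unravel_index(np.argmax(corr), corr.shape)
print("audiovisual focus-of-attention block:", foa_block)               # expect (1, 2)
```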
An efficient interpolation filter VLSI architecture for HEVC standard
NASA Astrophysics Data System (ADS)
Zhou, Wei; Zhou, Xin; Lian, Xiaocong; Liu, Zhenyu; Liu, Xiaoxiang
2015-12-01
The next-generation video coding standard of High-Efficiency Video Coding (HEVC) is especially efficient for coding high-resolution video such as 8K-ultra-high-definition (UHD) video. Fractional motion estimation in HEVC presents a significant challenge in clock latency and area cost as it consumes more than 40 % of the total encoding time and thus results in high computational complexity. With the aim of supporting 8K-UHD video applications, an efficient interpolation filter VLSI architecture for HEVC is proposed in this paper. Firstly, a new interpolation filter algorithm based on an 8-pixel interpolation unit is proposed. It saves 19.7% of processing time on average with acceptable coding quality degradation. Based on the proposed algorithm, an efficient interpolation filter VLSI architecture, composed of a reused data path of interpolation, an efficient memory organization, and a reconfigurable pipeline interpolation filter engine, is presented to reduce the hardware implementation area and achieve high throughput. The final VLSI implementation only requires 37.2k gates in a standard 90-nm CMOS technology at an operating frequency of 240 MHz. The proposed architecture can be reused for either half-pixel interpolation or quarter-pixel interpolation, which reduces the area cost by about 131,040 bits of RAM. The processing latency of our proposed VLSI architecture can support the real-time processing of 4:2:0 format 7680 × 4320@78fps video sequences.
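For context, a sketch of the kind of fractional-sample interpolation this architecture accelerates, using the standard HEVC 8-tap half-sample luma filter applied along one dimension (the proposed 8-pixel interpolation unit, its data-path reuse, and its hardware scheduling are not reproduced here; border handling and bit depth are simplified).

```python
import numpy as np

# HEVC 8-tap luma interpolation filter for the half-sample position (sum = 64).
HALF_PEL_TAPS = np.array([-1, 4, -11, 40, 40, -11, 4, -1])

def half_pel_row(samples):
    """Interpolate half-sample positions along a row of integer-pel samples.
    Border handling (edge replication) and 8-bit clipping are simplified;
    a real 2-D interpolation keeps intermediate values at higher precision."""
    padded = np.pad(samples.astype(np.int32), (3, 4), mode="edge")
    out = np.empty(len(samples), dtype=np.int32)
    for i in range(len(samples)):
        acc = np.dot(HALF_PEL_TAPS, padded[i:i + 8])
        out[i] = np.clip((acc + 32) >> 6, 0, 255)   # round, shift by 6, clip to 8 bits
    return out

row = np.array([10, 12, 20, 60, 120, 180, 200, 205, 206], dtype=np.int32)
print(half_pel_row(row))   # half-pel samples between the integer positions
```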
Shielding from space radiations
NASA Technical Reports Server (NTRS)
Chang, C. Ken; Badavi, Forooz F.; Tripathi, Ram K.
1993-01-01
This progress report, covering the period of December 1, 1992 to June 1, 1993, presents the development of an analytical solution to the heavy ion transport equation in terms of a Green's function formalism. The mathematical development results are recast into a highly efficient computer code for space applications. The efficiency of this algorithm is accomplished by a nonperturbative technique of extending the Green's function over the solution domain. The code may also be applied to accelerator boundary conditions to allow code validation in laboratory experiments. Results from the isotopic version of the code, with 59 isotopes present, are presented for a single-layer target material for the case of an iron beam projectile at 600 MeV/nucleon in water. A listing of the single-layer isotopic version of the code is included.
Investigating the structure preserving encryption of high efficiency video coding (HEVC)
NASA Astrophysics Data System (ADS)
Shahid, Zafar; Puech, William
2013-02-01
This paper presents a novel method for the real-time protection of the new emerging High Efficiency Video Coding (HEVC) standard. Structure preserving selective encryption is performed in the CABAC entropy coding module of HEVC, which is significantly different from CABAC entropy coding of H.264/AVC. In CABAC of HEVC, exponential Golomb coding is replaced by truncated Rice (TR) coding up to a specific value for binarization of transform coefficients. Selective encryption is performed using the AES cipher in cipher feedback mode on a plaintext of binstrings in a context-aware manner. The encrypted bitstream has exactly the same bit-rate and is format compliant. Experimental evaluation and security analysis of the proposed algorithm are performed on several benchmark video sequences containing different combinations of motion, texture and objects.
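A condensed sketch of the encryption step only, under stated assumptions (the CABAC context modeling, the truncated-Rice binarization, and the rules selecting which bins are encryptable are outside its scope; the key, IV, and bin string here are hypothetical): the selected bins are gathered into a plaintext, encrypted with AES in cipher-feedback mode, and substituted back bit-for-bit, so the bitstream length is unchanged.

```python
import os
import numpy as np
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def encrypt_bins(bins, key, iv):
    """Encrypt a list of 'encryptable' bins (0/1 values) with AES-CFB.
    CFB behaves like a stream cipher here, so the ciphertext carries exactly
    as many bits as the plaintext and each bin can be replaced in place."""
    bits = np.array(bins, dtype=np.uint8)
    plaintext = np.packbits(bits).tobytes()
    enc = Cipher(algorithms.AES(key), modes.CFB(iv)).encryptor()
    ciphertext = enc.update(plaintext) + enc.finalize()
    out = np.unpackbits(np.frombuffer(ciphertext, dtype=np.uint8))
    return out[: len(bins)].tolist()

key, iv = os.urandom(16), os.urandom(16)
selected_bins = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0]   # hypothetical TR suffix bins
print(encrypt_bins(selected_bins, key, iv))             # same length, scrambled values
```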
Four Year-Olds Use Norm-Based Coding for Face Identity
ERIC Educational Resources Information Center
Jeffery, Linda; Read, Ainsley; Rhodes, Gillian
2013-01-01
Norm-based coding, in which faces are coded as deviations from an average face, is an efficient way of coding visual patterns that share a common structure and must be distinguished by subtle variations that define individuals. Adults and school-aged children use norm-based coding for face identity but it is not yet known if pre-school aged…
Perceptual scale expansion: an efficient angular coding strategy for locomotor space.
Durgin, Frank H; Li, Zhi
2011-08-01
Whereas most sensory information is coded on a logarithmic scale, linear expansion of a limited range may provide a more efficient coding for the angular variables important to precise motor control. In four experiments, we show that the perceived declination of gaze, like the perceived orientation of surfaces, is coded on a distorted scale. The distortion seems to arise from a nearly linear expansion of the angular range close to horizontal/straight ahead and is evident in explicit verbal and nonverbal measures (Experiments 1 and 2), as well as in implicit measures of perceived gaze direction (Experiment 4). The theory is advanced that this scale expansion (by a factor of about 1.5) may serve a functional goal of coding efficiency for angular perceptual variables. The scale expansion of perceived gaze declination is accompanied by a corresponding expansion of perceived optical slants in the same range (Experiments 3 and 4). These dual distortions can account for the explicit misperception of distance typically obtained by direct report and exocentric matching, while allowing for accurate spatial action to be understood as the result of calibration.
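A small numeric illustration of the proposed distortion, under the common assumption (not stated in the abstract) that egocentric distance to a point on the ground can be recovered from eye height and gaze declination as d = h / tan(declination): expanding the perceived declination by a factor of about 1.5 compresses the distance implied by direct report, while a matched expansion of perceived optical slant leaves visually guided action calibrated.

```python
import numpy as np

def implied_distance(eye_height, declination_deg):
    """Ground distance implied by gaze declination below the horizontal."""
    return eye_height / np.tan(np.radians(declination_deg))

h, gain = 1.6, 1.5                       # eye height (m); assumed expansion factor
for true_decl in (5.0, 10.0, 20.0):
    d_true = implied_distance(h, true_decl)
    d_reported = implied_distance(h, gain * true_decl)   # declination felt as 1.5x larger
    print(f"{true_decl:4.0f} deg: actual {d_true:5.1f} m, reported {d_reported:5.1f} m")
```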
NASA Technical Reports Server (NTRS)
Brown, Robert B.
1999-01-01
Previous experiments have shown that space flight stimulates bacterial growth and metabolism. An explanation for these results is proposed, which may eventually lead to improved terrestrial pharmaceutical production efficiency. It is hypothesized that inertial acceleration affects bacterial growth and metabolism by altering the transport phenomena in the cells' external fluid environment. It is believed that this occurs indirectly through changes in the sedimentation rate acting on the bacteria and buoyancy-driven convection acting on their excreted by-products. Experiments over a broad range of accelerations consistently supported this theory. Experiments at 1 g indicated that higher concentrations of excreted by-products surrounding bacterial cells result in a shorter lag phase. Nineteen additional experiments simulated 0 g and 0.5 g using a clinostat, and achieved 50 g, 180 g, and 400 g using a centrifuge. These experiments showed that final cell density is inversely related to the level of acceleration. The experiments also consistently showed that acceleration affects the length of the lag phase in a non-monotonic, yet predictable, manner. Additional data indicated that E. coli metabolize glucose less efficiently at hypergravity, and more efficiently at hypogravity. A space-flight experiment was also performed. Samples on orbit had a statistically significantly higher final cell density and more efficient metabolism than did ground controls. These results, which were similar to simulations of 0 g using a clinostat, support the theory that gravity only affects bacterial growth and metabolism indirectly, through changes in the bacteria's fluid environment.
NASA Astrophysics Data System (ADS)
Watanabe, Junpei; Ishikawa, Hiroaki; Arouette, Xavier; Matsumoto, Yasuaki; Miki, Norihisa
2012-06-01
In this paper, we present a vibrational Braille code display with large-displacement micro-electro-mechanical systems (MEMS) actuator arrays. Tactile receptors are more sensitive to vibrational stimuli than to static ones. Therefore, when each cell of the Braille code vibrates at optimal frequencies, subjects can recognize the codes more efficiently. We fabricated a vibrational Braille code display that used actuators consisting of piezoelectric actuators and a hydraulic displacement amplification mechanism (HDAM) as cells. The HDAM that encapsulated incompressible liquids in microchambers with two flexible polymer membranes could amplify the displacement of the MEMS actuator. We investigated the voltage required for subjects to recognize Braille codes when each cell, i.e., the large-displacement MEMS actuator, vibrated at various frequencies. Lower voltages were required at vibration frequencies higher than 50 Hz than at vibration frequencies lower than 50 Hz, which verified that the proposed vibrational Braille code display is efficient by successfully exploiting the characteristics of human tactile receptors.
Advancing metabolic engineering through systems biology of industrial microorganisms.
Dai, Zongjie; Nielsen, Jens
2015-12-01
Development of sustainable processes to produce bio-based compounds is necessary due to the severe environmental problems caused by the use of fossil resources. Metabolic engineering can facilitate the development of highly efficient cell factories to produce these compounds from renewable resources. The objective of systems biology is to gain a comprehensive and quantitative understanding of living cells, which can thereby enhance our ability to characterize and predict cellular behavior. Systems biology of industrial microorganisms is therefore valuable for metabolic engineering. Here we review the application of systems biology tools for the identification of metabolic engineering targets which may lead to reduced development time for efficient cell factories. Finally, we present some perspectives of systems biology for advancing metabolic engineering further. Copyright © 2015 Elsevier Ltd. All rights reserved.
Liu, Zhongliang; Hui, Yi; Shi, Lei; Chen, Zhenyu; Xu, Xiangjie; Chi, Liankai; Fan, Beibei; Fang, Yujiang; Liu, Yang; Ma, Lin; Wang, Yiran; Xiao, Lei; Zhang, Quanbin; Jin, Guohua; Liu, Ling; Zhang, Xiaoqing
2016-09-13
Loss-of-function studies in human pluripotent stem cells (hPSCs) require efficient methodologies for lesion of genes of interest. Here, we introduce a donor-free paired gRNA-guided CRISPR/Cas9 knockout strategy (paired-KO) for efficient and rapid gene ablation in hPSCs. Through paired-KO, we succeeded in targeting all genes of interest with high biallelic targeting efficiencies. More importantly, during paired-KO, the cleaved DNA was repaired mostly through direct end joining without insertions/deletions (precise ligation), thus making the lesion product predictable. The paired-KO strategy remained highly efficient for one-step targeting of multiple genes and was also efficient for targeting of microRNAs, while for long non-coding RNAs over 8 kb, cleavage of a short fragment of the core promoter region was sufficient to eradicate downstream gene transcription. This work suggests that the paired-KO strategy is a simple and robust system for loss-of-function studies for both coding and non-coding genes in hPSCs. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.
Li, Yang; Zhu, Junge; Wang, Jianjun; Xia, Huanzhang; Wu, Sheng
2016-01-01
The phenylacetone monooxygenase isolated from Thermobifida fusca mainly catalyzes the Baeyer-Villiger oxidation reaction towards aromatic compounds. Based on the structure and function of phenylacetone monooxygenase, Met446 plays a vital role in its catalytic promiscuity. Mutation at the Met446 locus can confer a new catalytic feature on the enzyme, activating C-H bonds and oxidizing indole to generate indigo and indirubin, but the yield was only 1.89 mg/L. In order to further improve the biosynthesis efficiency of the whole-cell catalyst, metabolic engineering was applied to change the glucose metabolism pathway of Escherichia coli. Blocking the phosphoglucose isomerase gene pgi led the pentose phosphate pathway, instead of the glycolytic pathway, to become the major metabolic pathway of glucose, which provided more of the cofactor NADPH needed in the enzymatic oxidation of indole. Engineering the host E. coli further increased the synthesis of indigo and indirubin to 25 mg/L. Combining protein and metabolic engineering to design efficient whole-cell catalysts not only improves the synthesis of indigo and indirubin, but also provides a novel strategy for whole-cell catalyst development.
Overall Traveling-Wave-Tube Efficiency Improved By Optimized Multistage Depressed Collector Design
NASA Technical Reports Server (NTRS)
Vaden, Karl R.
2002-01-01
The microwave traveling wave tube (TWT) is used widely for space communications and high-power airborne transmitting sources. One of the most important features in designing a TWT is overall efficiency. Yet, overall TWT efficiency is strongly dependent on the efficiency of the electron beam collector, particularly for high values of collector efficiency. For these reasons, the NASA Glenn Research Center developed an optimization algorithm based on simulated annealing to quickly design highly efficient multistage depressed collectors (MDC's). Simulated annealing is a strategy for solving highly nonlinear combinatorial optimization problems. Its major advantage over other methods is its ability to avoid becoming trapped in local minima. Simulated annealing is based on an analogy to statistical thermodynamics, specifically the physical process of annealing: heating a material to a temperature that permits many atomic rearrangements and then cooling it carefully and slowly, until it freezes into a strong, minimum-energy crystalline structure. This minimum energy crystal corresponds to the optimal solution of a mathematical optimization problem. The TWT used as a baseline for optimization was the 32-GHz, 10-W, helical TWT developed for the Cassini mission to Saturn. The method of collector analysis and design used was a 2-1/2-dimensional computational procedure that employs two types of codes, a large signal analysis code and an electron trajectory code. The large signal analysis code produces the spatial, energetic, and temporal distributions of the spent beam entering the MDC. An electron trajectory code uses the resultant data to perform the actual collector analysis. The MDC was optimized for maximum MDC efficiency and minimum final kinetic energy of all collected electrons (to reduce heat transfer). An optimized collector geometric and electrical configuration achieved an efficiency of 93.8 percent. The results show the improvement in collector efficiency from 89.7 to 93.8 percent, resulting in an increase of three overall efficiency points. In addition, the time to design a highly efficient MDC was reduced from a month to a few days. All work was done in-house at Glenn for the High Rate Data Delivery Program. Future plans include optimizing the MDC and TWT interaction circuit in tandem to further improve overall TWT efficiency.
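A generic sketch of the optimization strategy described above (the actual MDC design variables, electrode geometry parameterization, and the efficiency/heat objective evaluated by the electron trajectory code are not reproduced; the quadratic-plus-cosine objective below is a stand-in): candidate designs are randomly perturbed, worse designs are occasionally accepted with a probability that falls as the "temperature" is lowered, and this occasional uphill acceptance is what lets the search escape local minima.

```python
import math
import random

def simulated_annealing(cost, x0, step, t0=1.0, cooling=0.95, iters=2000):
    """Minimize cost(x) by simulated annealing with a geometric cooling schedule."""
    x, fx, t = list(x0), cost(x0), t0
    best_x, best_f = list(x), fx
    for _ in range(iters):
        cand = [xi + random.uniform(-step, step) for xi in x]
        fc = cost(cand)
        # Accept improvements always; accept worse designs with Boltzmann probability.
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = list(x), fx
        t *= cooling                      # slow cooling "freezes" a near-optimal design
    return best_x, best_f

# Stand-in objective with several local minima (a real run would score collector
# efficiency and electron kinetic energy using the trajectory code).
random.seed(0)
cost = lambda x: sum((xi - 2.0) ** 2 + 0.5 * math.cos(5 * xi) for xi in x)
print(simulated_annealing(cost, x0=[0.0, 0.0], step=0.3))
```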
Wang, Jianfeng; Nan, Zhibiao; Christensen, Michael J; Zhang, Xingxu; Tian, Pei; Zhang, Zhixin; Niu, Xueli; Gao, Peng; Chen, Tao; Ma, Lixia
2018-04-25
The systemic fungal endophyte of the grass Achnatherum inebrians, Epichloë gansuensis, has important roles in enhancing resistance to biotic and abiotic stresses. In this work, we first evaluated the effects of E. gansuensis on nitrogen metabolism, nitrogen use efficiency, and stoichiometry of A. inebrians under varying nitrogen concentrations. The results demonstrated that E. gansuensis significantly improved the growth of A. inebrians under low nitrogen conditions. The fresh and dry weights, nitrate reductase, nitrite reductase, and glutamine synthetase activity, NO3-, NH4+, N, and P content, and also the total N accumulation, N utilization efficiency, and N uptake efficiency were all higher in leaves of A. inebrians plants with E. gansuensis (E+) than in A. inebrians plants without this endophyte (E-) under low nitrogen availability. In conclusion, E. gansuensis has positive effects on improving the growth of A. inebrians under low-nitrogen conditions by modulating the enzymes of nitrogen metabolism and enhancing nitrogen use efficiency.
Chroma sampling and modulation techniques in high dynamic range video coding
NASA Astrophysics Data System (ADS)
Dai, Wei; Krishnan, Madhu; Topiwala, Pankaj
2015-09-01
High Dynamic Range and Wide Color Gamut (HDR/WCG) Video Coding is an area of intense research interest in the engineering community, for potential near-term deployment in the marketplace. HDR greatly enhances the dynamic range of video content (up to 10,000 nits), as well as broadens the chroma representation (BT.2020). The resulting content offers new challenges in its coding and transmission. The Moving Picture Experts Group (MPEG) of the International Standards Organization (ISO) is currently exploring coding efficiency and/or the functionality enhancements of the recently developed HEVC video standard for HDR and WCG content. FastVDO has developed an advanced approach to coding HDR video, based on splitting the HDR signal into a smoothed luminance (SL) signal, and an associated base signal (B). Both signals are then chroma downsampled to YFbFr 4:2:0 signals, using advanced resampling filters, and coded using the Main10 High Efficiency Video Coding (HEVC) standard, which has been developed jointly by ISO/IEC MPEG and ITU-T WP3/16 (VCEG). Our proposal offers both efficient coding, and backwards compatibility with the existing HEVC Main10 Profile. That is, an existing Main10 decoder can produce a viewable standard dynamic range video, suitable for existing screens. Subjective tests show visible improvement over the anchors. Objective tests show a sizable gain of over 25% in PSNR (RGB domain) on average, for a key set of test clips selected by the ISO/MPEG committee.
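A compact sketch of two pieces discussed above, under assumptions noted in the comments (the smoothed-luminance/base-signal split, the chroma downsampling, and the encoder-side metadata needed to invert the remapping at the decoder are omitted): the SMPTE ST 2084 (PQ) inverse EOTF that maps linear light to nonlinear code values, followed by a simple linear remapping that stretches a restricted code-value range onto the full 10-bit range before HEVC Main10 encoding.

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_inverse_eotf(luminance_nits):
    """Map absolute luminance (0..10,000 nits) to a normalized PQ code value."""
    y = np.clip(luminance_nits / 10000.0, 0.0, 1.0)
    return ((C1 + C2 * y**M1) / (1.0 + C3 * y**M1)) ** M2

def remap_to_full_range(code, lo, hi, bit_depth=10):
    """Linearly stretch restricted code values [lo, hi] onto the full integer range,
    so the encoder's rate is not wasted on unused code values."""
    peak = (1 << bit_depth) - 1
    return np.round(np.clip((code - lo) / (hi - lo), 0.0, 1.0) * peak).astype(int)

# Example: content mastered up to 1,000 nits occupies only part of the PQ range.
lum = np.array([0.1, 1.0, 100.0, 500.0, 1000.0])
pq = pq_inverse_eotf(lum)
lo, hi = pq_inverse_eotf(0.0), pq_inverse_eotf(1000.0)   # valid signal range of this clip
print(pq, remap_to_full_range(pq, lo, hi))
```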
Zhang, Hong; Liu, Quanhai; Fan, Tingting; Fang, Yu; Li, Ying; Wang, Guoping
2012-03-01
The metabolism and catabolism of a novel antineoplastic (ID code JS-38), Benzamide, N-[4-(2,4-dimethoxyphenyl)-4,5-dihydro-5-oxo-1,2-dithiolo[4,3-b]pyrrol-6-yl]-3,5-bis (trifluoromethyl)-(9Cl), were investigated in Wistar rats (3 female, 3 male). LC/UV, LC/MS, LC/MS/MS, NMR and acid hydrolysis methods showed that the metabolic process of JS-38 consists of a series of acetylation and glucuronidation reactions that form a metabolic product with a unique pharmacologic property of accelerating bone-marrow cell formation, and also revealed a novel metabolic pathway of sequential acetylation and glucuronidation.
Corcoran, Callan C.; Grady, Cameron R.; Pisitkun, Trairak; Parulekar, Jaya
2017-01-01
The organization of the mammalian genome into gene subsets corresponding to specific functional classes has provided key tools for systems biology research. Here, we have created a web-accessible resource called the Mammalian Metabolic Enzyme Database (https://hpcwebapps.cit.nih.gov/ESBL/Database/MetabolicEnzymes/MetabolicEnzymeDatabase.html) keyed to the biochemical reactions represented on iconic metabolic pathway wall charts created in the previous century. Overall, we have mapped 1,647 genes to these pathways, representing ~7 percent of the protein-coding genome. To illustrate the use of the database, we apply it to the area of kidney physiology. In so doing, we have created an additional database (Database of Metabolic Enzymes in Kidney Tubule Segments: https://hpcwebapps.cit.nih.gov/ESBL/Database/MetabolicEnzymes/), mapping mRNA abundance measurements (mined from RNA-Seq studies) for all metabolic enzymes to each of 14 renal tubule segments. We carry out bioinformatics analysis of the enzyme expression pattern among renal tubule segments and mine various data sources to identify vasopressin-regulated metabolic enzymes in the renal collecting duct. PMID:27974320
Jang, Yu-Sin; Park, Jong Myoung; Choi, Sol; Choi, Yong Jun; Seung, Do Young; Cho, Jung Hee; Lee, Sang Yup
2012-01-01
The increasing oil price and environmental concerns caused by the use of fossil fuel have renewed our interest in utilizing biomass as a sustainable resource for the production of biofuel. It is, however, essential to develop high performance microbes that are capable of producing biofuels with very high efficiency in order to compete with fossil fuels. Recently, strategies for developing microbial strains by systems metabolic engineering, which can be considered as metabolic engineering integrated with systems biology and synthetic biology, have been developed. Systems metabolic engineering allows successful development of microbes that are capable of producing several different biofuels including bioethanol, biobutanol, alkanes, biodiesel, and even hydrogen. In this review, the approaches employed to develop efficient biofuel producers by metabolic engineering and systems metabolic engineering are reviewed with relevant example cases. It is expected that systems metabolic engineering will be employed as an essential strategy for the development of microbial strains for industrial applications. Copyright © 2011 Elsevier Inc. All rights reserved.
Wang, Yunxiang; Gao, Lipu; Zhu, Benzhong; Zhu, Hongliang; Luo, Yunbo; Wang, Qing; Zuo, Jinhua
2018-08-15
Long non-coding RNAs (lncRNAs) are endogenous non-coding RNAs that play essential roles in diverse biological processes and various stress responses. To identify and elucidate the intricate regulatory roles of lncRNAs in chilling injury in tomato fruit, deep sequencing and bioinformatic analyses were performed here. After strict screening, a total of 1411 lncRNAs were identified. Among these lncRNAs, 239 were significantly differentially expressed. A large number of target genes were identified, and many of them were found to encode chilling stress-related proteins, including redox reaction enzymes, important cell wall degradation enzymes, membrane lipid peroxidation enzymes, heat and cold shock proteins, energy metabolism enzymes, and salicylic acid and abscisic acid metabolism genes. Interestingly, 41 lncRNAs were found to be precursors of 33 miRNAs, and 186 lncRNAs were targets of 45 miRNAs. These lncRNAs targeted by miRNAs might be potential ceRNAs. In particular, a sophisticated regulatory model including miRNAs, lncRNAs, and their targets was set up. This model revealed that some miRNAs and lncRNAs may be involved in chilling injury, providing a new perspective on the role of lncRNAs. Copyright © 2018 Elsevier B.V. All rights reserved.
Tran, Lee; Hanavan, Paul D; Campbell, Latoya E; De Filippis, Elena; Lake, Douglas F; Coletta, Dawn K; Roust, Lori R; Mandarino, Lawrence J; Carroll, Chad C; Katsanos, Christos S
2016-01-01
Our previous studies show reduced abundance of the β-subunit of mitochondrial H+-ATP synthase (β-F1-ATPase) in skeletal muscle of obese individuals. The β-F1-ATPase forms the catalytic core of the ATP synthase, and it is critical for ATP production in muscle. The mechanism(s) impairing β-F1-ATPase metabolism in obesity, however, are not completely understood. First, we studied total muscle protein synthesis and the translation efficiency of β-F1-ATPase in obese (BMI, 36±1 kg/m2) and lean (BMI, 22±1 kg/m2) subjects. Both total protein synthesis (0.044±0.006 vs 0.066±0.006%·h-1) and translation efficiency of β-F1-ATPase (0.0031±0.0007 vs 0.0073±0.0004) were lower in muscle from the obese subjects when compared to the lean controls (P<0.05). We then evaluated these same responses in a primary cell culture model, and tested the specific hypothesis that circulating non-esterified fatty acids (NEFA) in obesity play a role in the responses observed in humans. The findings on total protein synthesis and translation efficiency of β-F1-ATPase in primary myotubes cultured from a lean subject, after exposure to NEFA extracted from serum of an obese subject, were similar to those obtained in humans. Among candidate microRNAs (i.e., non-coding RNAs regulating gene expression), we identified miR-127-5p as a suppressor of β-F1-ATPase production. Muscle expression of miR-127-5p negatively correlated with β-F1-ATPase protein translation efficiency in humans (r = -0.6744; P<0.01), and this effect could be modeled in vitro by prolonged exposure of primary myotubes derived from the lean subject to NEFA extracted from the obese subject. On the other hand, a locked nucleic acid inhibitor synthesized to target miR-127-5p significantly increased β-F1-ATPase translation efficiency in myotubes (0.6±0.1 vs 1.3±0.3, in control vs exposure to 50 nM inhibitor; P<0.05). Our experiments implicate circulating NEFA in obesity in suppressing muscle protein metabolism, and establish impaired β-F1-ATPase translation as an important consequence of obesity.
egs_brachy: a versatile and fast Monte Carlo code for brachytherapy
NASA Astrophysics Data System (ADS)
Chamberland, Marc J. P.; Taylor, Randle E. P.; Rogers, D. W. O.; Thomson, Rowan M.
2016-12-01
egs_brachy is a versatile and fast Monte Carlo (MC) code for brachytherapy applications. It is based on the EGSnrc code system, enabling simulation of photons and electrons. Complex geometries are modelled using the EGSnrc C++ class library and egs_brachy includes a library of geometry models for many brachytherapy sources, in addition to eye plaques and applicators. Several simulation efficiency enhancing features are implemented in the code. egs_brachy is benchmarked by comparing TG-43 source parameters of three source models to previously published values. 3D dose distributions calculated with egs_brachy are also compared to ones obtained with the BrachyDose code. Well-defined simulations are used to characterize the effectiveness of many efficiency improving techniques, both as an indication of the usefulness of each technique and to find optimal strategies. Efficiencies and calculation times are characterized through single source simulations and simulations of idealized and typical treatments using various efficiency improving techniques. In general, egs_brachy shows agreement within uncertainties with previously published TG-43 source parameter values. 3D dose distributions from egs_brachy and BrachyDose agree at the sub-percent level. Efficiencies vary with radionuclide and source type, number of sources, phantom media, and voxel size. The combined effects of efficiency-improving techniques in egs_brachy lead to short calculation times: simulations approximating prostate and breast permanent implant (both with (2 mm)³ voxels) and eye plaque (with (1 mm)³ voxels) treatments take between 13 and 39 s, on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core, to achieve 2% average statistical uncertainty on doses within the PTV. egs_brachy will be released as free and open source software to the research community.
Calvello, Simone; Piccardo, Matteo; Rao, Shashank Vittal; Soncini, Alessandro
2018-03-05
We have developed and implemented a new ab initio code, Ceres (Computational Emulator of Rare Earth Systems), completely written in C++11, which is dedicated to the efficient calculation of the electronic structure and magnetic properties of the crystal field states arising from the splitting of the ground state spin-orbit multiplet in lanthanide complexes. The new code gains efficiency via an optimized implementation of a direct configurational averaged Hartree-Fock (CAHF) algorithm for the determination of 4f quasi-atomic active orbitals common to all multi-electron spin manifolds contributing to the ground spin-orbit multiplet of the lanthanide ion. The new CAHF implementation is based on quasi-Newton convergence acceleration techniques coupled to an efficient library for the direct evaluation of molecular integrals, and problem-specific density matrix guess strategies. After describing the main features of the new code, we compare its efficiency with the current state-of-the-art ab initio strategy to determine crystal field levels and properties, and show that our methodology, as implemented in Ceres, represents a more time-efficient computational strategy for the evaluation of the magnetic properties of lanthanide complexes, also allowing a full representation of non-perturbative spin-orbit coupling effects. © 2017 Wiley Periodicals, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Meredydd; Yu, Sha; Staniszewski, Aaron
Building energy efficiency is an important strategy for reducing greenhouse gas emissions globally. In fact, 55 countries have included building energy efficiency in their Nationally Determined Contributions (NDCs) under the Paris Agreement. This research uses building energy code implementation in six cities across different continents as case studies to assess what it may take for countries to implement the ambitions of their energy efficiency goals. Specifically, we look at the cases of Bogota, Colombia; Da Nang, Vietnam; Eskisehir, Turkey; Mexico City, Mexico; Rajkot, India; and Tshwane, South Africa, all of which are “deep dive” cities under the Sustainable Energy for All's Building Efficiency Accelerator. The research focuses on understanding the baseline and existing gaps in implementation and coordination. The methodology used a combination of surveys on code status and interviews with stakeholders at the local and national level, as well as review of published documents. We looked at code development, implementation, and evaluation. The cities are all working to improve implementation; however, the challenges they currently face include gaps in resources, capacity, tools, and institutions to check for compliance. Better coordination between national and local governments could help improve implementation, but that coordination is not yet well established. For example, all six of the cities reported that there was little to no involvement of local stakeholders in development of the national code; only one city reported that it had access to national funding to support code implementation. More robust coordination could better link cities with capacity building and funding for compliance, and ensure that the code reflects local priorities. Understanding gaps in implementation can also help in designing more targeted interventions to scale up energy savings.
Information theoretical assessment of image gathering and coding for digital restoration
NASA Technical Reports Server (NTRS)
Huck, Friedrich O.; John, Sarah; Reichenbach, Stephen E.
1990-01-01
The process of image-gathering, coding, and restoration is presently treated in its entirety rather than as a catenation of isolated tasks, on the basis of the relationship between the spectral information density of a transmitted signal and the restorability of images from the signal. This 'information-theoretic' assessment accounts for the information density and efficiency of the acquired signal as a function of the image-gathering system's design and radiance-field statistics, as well as for the information efficiency and data compression that are obtainable through the combination of image gathering with coding to reduce signal redundancy. It is found that high information efficiency is achievable only through minimization of image-gathering degradation as well as signal redundancy.
Advanced GF(32) nonbinary LDPC coded modulation with non-uniform 9-QAM outperforming star 8-QAM.
Liu, Tao; Lin, Changyu; Djordjevic, Ivan B
2016-06-27
In this paper, we first describe a 9-symbol non-uniform signaling scheme based on a Huffman code, in which different symbols are transmitted with different probabilities. By using the Huffman procedure, a prefix code is designed to approach the optimal performance. Then, we introduce an algorithm to determine the optimal signal constellation sets for our proposed non-uniform scheme with the criterion of maximizing the constellation figure of merit (CFM). The proposed non-uniform polarization-multiplexed 9-QAM signaling scheme has the same spectral efficiency as conventional 8-QAM. Additionally, we propose a specially designed GF(32) nonbinary quasi-cyclic LDPC code for the coded modulation system based on the 9-QAM non-uniform scheme. Further, we study the efficiency of our proposed non-uniform 9-QAM, combined with nonbinary LDPC coding, and demonstrate by Monte Carlo simulation that the proposed GF(32) nonbinary LDPC coded 9-QAM scheme outperforms nonbinary LDPC coded uniform 8-QAM by at least 0.8 dB.
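The core trick can be illustrated with a toy prefix code: when uniformly distributed source bits are parsed by a Huffman-style prefix code, symbol k appears with probability 2^-(codeword length of k), so choosing codeword lengths shapes the signaling distribution. The sketch below uses a hypothetical 9-symbol codeword table, not the constellation or code from the paper.

```python
# A minimal sketch (not the authors' design): a prefix (Huffman-style) code over
# 9 symbols. Feeding uniform i.i.d. bits through the prefix parser makes symbol k
# appear with probability 2**(-len(codeword_k)), i.e., non-uniform signaling.
import random
from collections import Counter

# Hypothetical codeword table: 7 symbols of length 3, 2 of length 4 (Kraft sum = 1).
code = {
    "000": 0, "001": 1, "010": 2, "011": 3,
    "100": 4, "101": 5, "110": 6,
    "1110": 7, "1111": 8,
}

def parse_symbols(bits, code, max_len=4):
    """Parse a uniform bit string into symbols using the prefix code."""
    symbols, buf = [], ""
    for b in bits:
        buf += b
        if buf in code:
            symbols.append(code[buf])
            buf = ""
        elif len(buf) >= max_len:
            raise ValueError("not a prefix code")
    return symbols

random.seed(0)
bits = "".join(random.choice("01") for _ in range(200_000))
freq = Counter(parse_symbols(bits, code))
total = sum(freq.values())
for sym in sorted(freq):
    print(sym, round(freq[sym] / total, 4))  # ~0.125 for symbols 0-6, ~0.0625 for 7-8
```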
CFD code evaluation for internal flow modeling
NASA Technical Reports Server (NTRS)
Chung, T. J.
1990-01-01
Research on computational fluid dynamics (CFD) code evaluation, with emphasis on supercomputing in reacting flows, is discussed. Advantages of unstructured grids, multigrids, adaptive methods, improved flow solvers, vector processing, parallel processing, and reduction of memory requirements are discussed. As examples, researchers include applications of supercomputing to reacting flow Navier-Stokes equations including shock waves and turbulence and combustion instability problems associated with solid and liquid propellants. Evaluation of codes developed by other organizations is not included. Instead, the basic criteria for accuracy and efficiency have been established, and some applications on rocket combustion have been made. Research toward an ultimate goal, the most accurate and efficient CFD code, is in progress and will continue for years to come.
Aydin, Busra; Ozer, Tugba; Oner, Ebru Toksoy; Arga, Kazim Yalcin
2018-03-01
Metabolic systems engineering is being used to redirect microbial metabolism for the overproduction of chemicals of interest, with the aim of transforming microbial hosts into cellular factories. In this study, a genome-based metabolic systems engineering approach was designed and performed to improve the biopolymer biosynthesis capability of a moderately halophilic bacterium, Halomonas smyrnensis AAD6ᵀ, which produces levan, a fructose homopolymer with many potential uses in various industries and medicine. For this purpose, the genome-scale metabolic model for AAD6ᵀ was used to characterize the metabolic resource allocation, specifically to design metabolic engineering strategies for engineered bacteria with enhanced levan production capability. Simulations were performed in silico to determine optimal gene knockout strategies to develop new strains with enhanced levan production capability. The majority of the gene knockout strategies emphasized the vital role of the fructose uptake mechanism, and pointed out the fructose-specific phosphotransferase system (PTSfru) as the most promising target for further metabolic engineering studies. Therefore, the PTSfru of AAD6ᵀ was restructured with insertional mutagenesis and triparental mating techniques to construct a novel, engineered H. smyrnensis strain, BMA14. Fermentation experiments were carried out to demonstrate the high efficiency of the mutant strain BMA14 in terms of final levan concentration, sucrose consumption rate, and sucrose conversion efficiency, when compared to AAD6ᵀ. The genome-based metabolic systems engineering approach presented in this study might be considered an efficient framework to redirect microbial metabolism for the overproduction of chemicals of interest, and the novel strain BMA14 might be considered a potential microbial cell factory for further studies aimed at designing levan production processes with lower production costs.
Experience with a vectorized general circulation weather model on Star-100
NASA Technical Reports Server (NTRS)
Soll, D. B.; Habra, N. R.; Russell, G. L.
1977-01-01
A version of an atmospheric general circulation model was vectorized to run on a CDC STAR 100. The numerical model was coded and run in two different vector languages, CDC and LRLTRAN. A factor of 10 speed improvement over an IBM 360/95 was realized. Efficient use of the STAR machine required some redesigning of algorithms and logic. This precludes the application of vectorizing compilers on the original scalar code to achieve the same results. Vector languages permit a more natural and efficient formulation for such numerical codes.
DSP code optimization based on cache
NASA Astrophysics Data System (ADS)
Xu, Chengfa; Li, Chengcheng; Tang, Bin
2013-03-01
A DSP program often runs less efficiently on the target board than it does in software simulation during program development, mainly because of the user's improper use and incomplete understanding of the cache-based memory system. This paper takes the TI TMS320C6455 DSP as an example, analyzes its two-level internal cache, and summarizes methods of code optimization. The processor can achieve its best performance when these code optimization methods are used. Finally, a specific application to a radar signal processing algorithm is presented. Experimental results show that these optimizations are effective.
Coevolution Theory of the Genetic Code at Age Forty: Pathway to Translation and Synthetic Life
Wong, J. Tze-Fei; Ng, Siu-Kin; Mat, Wai-Kin; Hu, Taobo; Xue, Hong
2016-01-01
The origins of the components of genetic coding are examined in the present study. Genetic information arose from replicator induction by metabolite in accordance with the metabolic expansion law. Messenger RNA and transfer RNA stemmed from a template for binding the aminoacyl-RNA synthetase ribozymes employed to synthesize peptide prosthetic groups on RNAs in the Peptidated RNA World. Coevolution of the genetic code with amino acid biosynthesis generated tRNA paralogs that identify a last universal common ancestor (LUCA) of extant life close to Methanopyrus, which in turn points to archaeal tRNA introns as the most primitive introns and the anticodon usage of Methanopyrus as an ancient mode of wobble. The prediction of the coevolution theory of the genetic code that the code should be a mutable code has led to the isolation of optional and mandatory synthetic life forms with altered protein alphabets. PMID:26999216
Probability Quantization for Multiplication-Free Binary Arithmetic Coding
NASA Technical Reports Server (NTRS)
Cheung, K. -M.
1995-01-01
A method has been developed to improve on Witten's binary arithmetic coding procedure of tracking a high value and a low value. The new method approximates the probability of the less probable symbol, which improves the worst-case coding efficiency.
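As a rough illustration of the idea behind multiplication-free arithmetic coding, the toy sketch below (not the procedure from the abstract) quantizes the probability of the less probable symbol (LPS) to a power of two, so each interval update needs only a shift and an addition; Python's arbitrary-precision integers stand in for the renormalization logic a real coder would use.

```python
# A toy sketch, not the exact scheme from the abstract: binary arithmetic coding in
# which the less-probable-symbol (LPS) probability is quantized to 2**-k, so the
# interval split needs only a right shift instead of a multiplication.
def encode(bits, k, width=64):
    low, rng = 0, 1 << width
    for b in bits:
        r_lps = rng >> k              # quantized LPS share: rng * 2**-k
        if b == 1:                    # treat "1" as the LPS
            low += rng - r_lps
            rng = r_lps
        else:
            rng -= r_lps
    return low                        # any value in [low, low + rng) identifies the sequence

def decode(code_value, n, k, width=64):
    low, rng, out = 0, 1 << width, []
    for _ in range(n):
        r_lps = rng >> k
        if code_value >= low + rng - r_lps:
            out.append(1)
            low += rng - r_lps
            rng = r_lps
        else:
            out.append(0)
            rng -= r_lps
    return out

msg = [0, 0, 1, 0, 0, 0, 0, 1, 0, 0]   # mostly-zero source, LPS probability ~ 2**-3
value = encode(msg, k=3)
assert decode(value, len(msg), k=3) == msg
```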
Cox, Christian L; Secor, Stephen M
2007-12-01
We explored meal size and clutch (i.e., genetic) effects on the relative proportion of ingested energy that is absorbed by the gut (apparent digestive efficiency), becomes available for metabolism and growth (apparent assimilation efficiency), and is used for growth (production efficiency) for juvenile Burmese pythons (Python molurus). Sibling pythons were fed rodent meals equaling 15%, 25%, and 35% of their body mass, and individuals from five different clutches were fed rodent meals equaling 25% of their body mass. For each of 11-12 consecutive feeding trials, python body mass was recorded and feces and urate of each snake were collected, dried, and weighed. Energy contents of meals (mice and rats), feces, urate, and pythons were determined using bomb calorimetry. For siblings fed three different meal sizes, growth rate increased with larger meals, but there was no significant variation among the meal sizes for any of the calculated energy efficiencies. Among the three meal sizes, apparent digestive efficiency, apparent assimilation efficiency, and production efficiency averaged 91.0%, 84.7%, and 40.7%, respectively. In contrast, each of these energy efficiencies varied significantly among the five different clutches. Among these clutches, production efficiency was negatively correlated with standard metabolic rate (SMR). Clutches containing individuals with low SMR were therefore able to allocate more of ingested energy into growth.
Efficient Modeling of Laser-Plasma Accelerators with INF&RNO
NASA Astrophysics Data System (ADS)
Benedetti, C.; Schroeder, C. B.; Esarey, E.; Geddes, C. G. R.; Leemans, W. P.
2010-11-01
The numerical modeling code INF&RNO (INtegrated Fluid & paRticle simulatioN cOde, pronounced "inferno") is presented. INF&RNO is an efficient 2D cylindrical code to model the interaction of a short laser pulse with an underdense plasma. The code is based on an envelope model for the laser while either a PIC or a fluid description can be used for the plasma. The effect of the laser pulse on the plasma is modeled with the time-averaged ponderomotive force. These and other features allow for a speedup of 2-4 orders of magnitude compared to standard full PIC simulations while still retaining physical fidelity. The code has been benchmarked against analytical solutions and 3D PIC simulations, and here a set of validation tests, together with a discussion of performance, is presented.
NASA Astrophysics Data System (ADS)
Dao, Thanh Hai
2018-01-01
Network coding techniques are seen as a new dimension for improving network performance thanks to their capability of utilizing network resources more efficiently. Indeed, the application of network coding to the realm of failure recovery in optical networks has marked a major departure from traditional protection schemes, as it could potentially achieve both rapid recovery and capacity improvement, challenging the prevailing wisdom of trading capacity efficiency for recovery speed and vice versa. In this context, the maturing of all-optical XOR technologies appears as a good match to the necessity of more efficient protection in transparent optical networks. In addressing this opportunity, we propose to use practical all-optical XOR network coding to leverage the conventional 1 + 1 optical path protection in transparent WDM optical networks. The network coding-assisted protection solution combines protection flows of two demands sharing the same destination node in supportive conditions, paving the way for reducing the backup capacity. A novel mathematical model taking into account the operation of the new protection scheme for optimal network design is formulated as an integer linear program. Numerical results based on extensive simulations on realistic topologies, the COST239 and NSFNET networks, are presented to highlight the benefits of our proposal compared to the conventional approach in terms of wavelength resource efficiency and network throughput.
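The capacity saving can be pictured with a schematic example (hypothetical byte payloads, not an optical implementation): two demands terminating at the same destination share a single XOR-coded protection flow, and the destination recovers a lost working signal by XOR-ing the coded flow with the surviving one.

```python
# Schematic sketch of XOR-based shared protection (hypothetical payloads): instead of
# carrying two separate backup copies, a single coded flow p = a XOR b is carried, and
# the destination recovers whichever working flow is lost.
def xor_bytes(x: bytes, y: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(x, y))

flow_a = b"payload of demand A"
flow_b = b"payload of demand B"          # framed/padded to equal length in practice
flow_b = flow_b.ljust(len(flow_a), b"\x00")

protection = xor_bytes(flow_a, flow_b)   # single backup flow on the shared segment

# Failure on demand A's working path: recover A from the coded flow and flow B.
recovered_a = xor_bytes(protection, flow_b)
assert recovered_a == flow_a
```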
A new code for the design and analysis of the heliostat field layout for power tower system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei, Xiudong; Lu, Zhenwu; Yu, Weixing
2010-04-15
A new code for the design and analysis of the heliostat field layout for a power tower system is developed. In the new code, a new method for the heliostat field layout is proposed based on the edge ray principle of nonimaging optics. The heliostat field boundary is constrained by the tower height, the receiver tilt angle and size, and the heliostat efficiency factor, which is the product of the annual cosine efficiency and the annual atmospheric transmission efficiency. With the new method, heliostats can be placed with higher efficiency, and a faster response of the design and optimization process can be obtained. A new module for the analysis of aspherical heliostats is created in the new code. A new toroidal heliostat field is designed and analyzed using the new code. Compared with the spherical heliostat, the solar image radius of the field is reduced by about 30% by using the toroidal heliostat if the mirror shape and the tracking are ideal. In addition, to maximize the utilization of land, suitable crops can be considered for planting under the heliostats. To evaluate the feasibility of crop growth, a method for calculating the annual distribution of sunshine duration on the land surface is developed as well. (author)
CardioNet: a human metabolic network suited for the study of cardiomyocyte metabolism.
Karlstädt, Anja; Fliegner, Daniela; Kararigas, Georgios; Ruderisch, Hugo Sanchez; Regitz-Zagrosek, Vera; Holzhütter, Hermann-Georg
2012-08-29
Availability of oxygen and nutrients in the coronary circulation is a crucial determinant of cardiac performance. Nutrient composition of coronary blood may vary significantly in specific physiological and pathological conditions, for example, administration of special diets, long-term starvation, physical exercise or diabetes. Quantitative analysis of cardiac metabolism from a systems biology perspective may help to achieve a better understanding of the relationship between nutrient supply and the efficiency of metabolic processes required for an adequate cardiac output. Here we present CardioNet, the first large-scale reconstruction of the metabolic network of the human cardiomyocyte comprising 1793 metabolic reactions, including 560 transport processes in six compartments. We use flux-balance analysis to demonstrate the capability of the network to accomplish a set of 368 metabolic functions required for maintaining the structural and functional integrity of the cell. Taking the maintenance of ATP and the biosynthesis of ceramide, cardiolipin and further important phospholipids as examples, we analyse how a changed supply of glucose, lactate, fatty acids and ketone bodies may influence the efficiency of these essential processes. CardioNet is a functionally validated metabolic network of the human cardiomyocyte that enables theoretical studies of cellular metabolic processes crucial for the accomplishment of an adequate cardiac output.
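A flux-balance calculation of this kind reduces to a linear program: maximize an objective flux subject to steady-state mass balance S·v = 0 and flux bounds. The minimal sketch below uses a hypothetical three-reaction toy network (not CardioNet) to show the mechanics.

```python
# Minimal flux-balance analysis sketch on a hypothetical toy network (not CardioNet):
# maximize the objective flux v_obj subject to S v = 0 and lower/upper bounds on v.
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> M, M -> objective (e.g., ATP demand), M -> byproduct.
# Columns: v_uptake, v_obj, v_byprod ; single internal metabolite M (rows of S).
S = np.array([[1.0, -1.0, -1.0]])          # stoichiometric matrix (1 metabolite x 3 reactions)
bounds = [(0, 10.0), (0, None), (0, 2.0)]  # uptake capped at 10, byproduct at 2

# linprog minimizes, so minimize -v_obj to maximize the objective flux.
res = linprog(c=[0.0, -1.0, 0.0], A_eq=S, b_eq=[0.0], bounds=bounds, method="highs")
print("optimal fluxes:", res.x)            # expect uptake 10, objective 10, byproduct 0
```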
Lourdais, Olivier; Guillon, Michaël; Denardo, Dale; Blouin-Demers, Gabriel
2013-07-02
We compared thermoregulatory strategies during pregnancy in two congeneric viperid snakes (Vipera berus and Vipera aspis) with parapatric geographic ranges. V. berus is a boreal specialist with the largest known distribution among terrestrial snakes while V. aspis is a south-European species. Despite contrasted climatic affinities, the two species displayed identical thermal preferences (Tset) in a laboratory thermal gradient. Under identical natural conditions, however, V. berus was capable of maintaining Tset for longer periods, especially when the weather was constraining. Consistent with the metabolic cold adaptation hypothesis, V. berus displayed higher standard metabolic rate at all temperatures considered. We used the thermal dependence of metabolic rate to calculate daily metabolic profiles from body temperature under natural conditions. The boreal specialist experienced higher daily metabolic rate and minimized gestation duration chiefly because of differences in the metabolic reaction norms, but also superior thermoregulatory efficiency. Under cold climates, thermal constraints should make precise thermoregulation costly. However, a shift in the metabolic reaction norm may compensate for thermal constraints and modify the cost-benefit balance of thermoregulation. Covariation between metabolic rate and thermoregulation efficiency is likely an important adaptation to cold climates. Copyright © 2013 Elsevier Inc. All rights reserved.
Vongsangnak, Wanwipa; Raethong, Nachon; Mujchariyakul, Warasinee; Nguyen, Nam Ninh; Leong, Hon Wai; Laoteng, Kobkul
2017-08-30
The first genome-scale metabolic network of Cordyceps militaris (iWV1170) was constructed, representing its whole metabolism and consisting of 894 metabolites and 1,267 metabolic reactions across five compartments, including the plasma membrane, cytoplasm, mitochondria, peroxisome and extracellular space. The iWV1170 model could be exploited to explain phenotypes of growth ability and of cordycepin and other metabolite production on various substrates. A high number of genes encoding extracellular enzymes for the degradation of complex carbohydrates, lipids and proteins exist in the C. militaris genome. By comparative genome-scale analysis, the adenine metabolic pathway towards putative cordycepin biosynthesis was reconstructed, indicating evolutionary relationships across eleven species of entomopathogenic fungi. The overall metabolic routes involved in the putative cordycepin biosynthesis were also identified in C. militaris, including central carbon metabolism, amino acid metabolism (glycine, l-glutamine and l-aspartate) and nucleotide metabolism (adenosine and adenine). Interestingly, C. militaris lacks the sequence coding for a ribonucleotide reductase inhibitor, which might contribute to its over-production of cordycepin. Copyright © 2017. Published by Elsevier B.V.
Efficient burst image compression using H.265/HEVC
NASA Astrophysics Data System (ADS)
Roodaki-Lavasani, Hoda; Lainema, Jani
2014-02-01
New imaging use cases are emerging as more powerful camera hardware is entering consumer markets. One family of such use cases is based on capturing multiple pictures instead of just one when taking a photograph. That kind of a camera operation allows e.g. selecting the most successful shot from a sequence of images, showing what happened right before or after the shot was taken or combining the shots by computational means to improve either visible characteristics of the picture (such as dynamic range or focus) or the artistic aspects of the photo (e.g. by superimposing pictures on top of each other). Considering that photographic images are typically of high resolution and quality and the fact that these kind of image bursts can consist of at least tens of individual pictures, an efficient compression algorithm is desired. However, traditional video coding approaches fail to provide the random access properties these use cases require to achieve near-instantaneous access to the pictures in the coded sequence. That feature is critical to allow users to browse the pictures in an arbitrary order or imaging algorithms to extract desired pictures from the sequence quickly. This paper proposes coding structures that provide such random access properties while achieving coding efficiency superior to existing image coders. The results indicate that using HEVC video codec with a single reference picture fixed for the whole sequence can achieve nearly as good compression as traditional IPPP coding structures. It is also shown that the selection of the reference frame can further improve the coding efficiency.
Optimization of Particle-in-Cell Codes on RISC Processors
NASA Technical Reports Server (NTRS)
Decyk, Viktor K.; Karmesin, Steve Roy; Boer, Aeint de; Liewer, Paulette C.
1996-01-01
General strategies are developed to optimize particle-in-cell codes written in Fortran for RISC processors, which are commonly used on massively parallel computers. These strategies include data reorganization to improve cache utilization and code reorganization to improve the efficiency of arithmetic pipelines.
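The data-reorganization idea can be sketched in a few lines (assumed details, NumPy rather than the original Fortran): particles are reordered so that members of the same grid cell sit contiguously in memory, which makes the deposition and gather loops walk memory sequentially.

```python
# Sketch of the data-reorganization idea (assumed details, not the original Fortran):
# reorder particle arrays so particles in the same grid cell are contiguous in memory,
# improving cache reuse in the charge-deposition/field-gather loops.
import numpy as np

rng = np.random.default_rng(0)
nparticles, ncellx = 100_000, 64
dx = 1.0 / ncellx

x = rng.random(nparticles)            # particle positions in [0, 1)
vx = rng.standard_normal(nparticles)  # particle velocities

cell = np.minimum((x / dx).astype(np.int64), ncellx - 1)  # cell index of each particle
order = np.argsort(cell, kind="stable")                    # sort keys once, permute once

x_sorted, vx_sorted, cell_sorted = x[order], vx[order], cell[order]
# After sorting, per-cell loops access x_sorted/vx_sorted in contiguous chunks.
```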
NASA Technical Reports Server (NTRS)
Weed, Richard Allen; Sankar, L. N.
1994-01-01
An increasing amount of research activity in computational fluid dynamics has been devoted to the development of efficient algorithms for parallel computing systems. The increasing performance-to-price ratio of engineering workstations has led to research into developing procedures for implementing a parallel computing system composed of distributed workstations. This thesis proposal outlines an ongoing research program to develop efficient strategies for performing three-dimensional flow analysis on distributed computing systems. The PVM parallel programming interface was used to modify an existing three-dimensional flow solver, the TEAM code developed by Lockheed for the Air Force, to function as a parallel flow solver on clusters of workstations. Steady flow solutions were generated for three different wing and body geometries to validate the code and evaluate code performance. The proposed research will extend the parallel code development to determine the most efficient strategies for unsteady flow simulations.
Developing Discontinuous Galerkin Methods for Solving Multiphysics Problems in General Relativity
NASA Astrophysics Data System (ADS)
Kidder, Lawrence; Field, Scott; Teukolsky, Saul; Foucart, Francois; SXS Collaboration
2016-03-01
Multi-messenger observations of the merger of black hole-neutron star and neutron star-neutron star binaries, and of supernova explosions will probe fundamental physics inaccessible to terrestrial experiments. Modeling these systems requires a relativistic treatment of hydrodynamics, including magnetic fields, as well as neutrino transport and nuclear reactions. The accuracy, efficiency, and robustness of current codes that treat all of these problems is not sufficient to keep up with the observational needs. We are building a new numerical code that uses the Discontinuous Galerkin method with a task-based parallelization strategy, a promising combination that will allow multiphysics applications to be treated both accurately and efficiently on petascale and exascale machines. The code will scale to more than 100,000 cores for efficient exploration of the parameter space of potential sources and allowed physics, and the high-fidelity predictions needed to realize the promise of multi-messenger astronomy. I will discuss the current status of the development of this new code.
Krause, Sophia; Goss, Kai-Uwe
2018-05-23
Until now, the question of whether slow desorption of compounds from transport proteins like the plasma protein albumin can affect hepatic uptake, and thereby hepatic metabolism of these compounds, has not been answered conclusively. This work combines recently published experimental desorption rate constants with a liver model to address this question. To do so, the liver model treats the bound compound in blood, the unbound compound in blood and the compound within the hepatocytes as three well-stirred compartments. Our calculations show that slow desorption kinetics from albumin can indeed limit hepatic metabolism of a compound by decreasing hepatic extraction efficiency and hepatic clearance. The extent of this decrease, however, depends not only on the value of the desorption rate constant but also on how much of the compound is bound to albumin in blood and how fast the intrinsic metabolism of the compound in the hepatocytes is. For strongly sorbing and sufficiently fast metabolized compounds, our calculations revealed a twentyfold lower hepatic extraction efficiency and hepatic clearance for the slowest known desorption rate constant compared to the case when instantaneous equilibrium between bound and unbound compound is assumed. The same desorption rate constant, however, has nearly no effect on the hepatic extraction efficiency and hepatic clearance of weakly sorbing and slowly metabolized compounds. This work examines the relevance of desorption kinetics in various example scenarios and provides the general approach needed to quantify the effect of flow limitation, membrane permeability and desorption kinetics on hepatic metabolism at the same time.
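The structure of such a model can be sketched as three coupled ordinary differential equations; the version below uses hypothetical rate constants (not the published parameterization) simply to show how a finite desorption rate can throttle delivery of the albumin-bound fraction to the hepatocytes.

```python
# Schematic sketch with hypothetical rate constants (not the published parameterization):
# three well-stirred compartments - albumin-bound in blood (Cb), unbound in blood (Cu),
# and inside hepatocytes (Ch) - with finite desorption from albumin and intrinsic metabolism.
from scipy.integrate import solve_ivp

k_off, k_on = 0.05, 5.0     # slow desorption vs fast rebinding (1/min), assumed values
k_up, k_met = 2.0, 1.0      # hepatocyte uptake and intrinsic metabolism (1/min), assumed

def rhs(t, y):
    cb, cu, ch = y
    desorb = k_off * cb - k_on * cu        # net release from albumin
    return [-desorb,
            desorb - k_up * cu,
            k_up * cu - k_met * ch]

sol = solve_ivp(rhs, (0.0, 120.0), [1.0, 0.0, 0.0])
metabolized = 1.0 - sol.y.sum(axis=0)[-1]  # fraction cleared by t = 120 min (mass balance)
print(f"fraction metabolized after 2 h: {metabolized:.2f}")
```

Lowering k_off in this sketch lowers the metabolized fraction, mirroring the desorption-limited regime described in the abstract.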
A Flexible and Non-intrusive Approach for Computing Complex Structural Coverage Metrics
NASA Technical Reports Server (NTRS)
Whalen, Michael W.; Person, Suzette J.; Rungta, Neha; Staats, Matt; Grijincu, Daniela
2015-01-01
Software analysis tools and techniques often leverage structural code coverage information to reason about the dynamic behavior of software. Existing techniques instrument the code with the required structural obligations and then monitor the execution of the compiled code to report coverage. Instrumentation-based approaches often incur considerable runtime overhead for complex structural coverage metrics such as Modified Condition/Decision Coverage (MC/DC). Code instrumentation, in general, has to be approached with great care to ensure it does not modify the behavior of the original code. Furthermore, instrumented code cannot be used in conjunction with other analyses that reason about the structure and semantics of the code under test. In this work, we introduce a non-intrusive preprocessing approach for computing structural coverage information. It uses a static partial evaluation of the decisions in the source code and a source-to-bytecode mapping to generate the information necessary to efficiently track structural coverage metrics during execution. Our technique is flexible; the results of the preprocessing can be used by a variety of coverage-driven software analysis tasks, including automated analyses that are not possible for instrumented code. Experimental results in the context of symbolic execution show the efficiency and flexibility of our non-intrusive approach for computing code coverage information.
A domain specific language for performance portable molecular dynamics algorithms
NASA Astrophysics Data System (ADS)
Saunders, William Robert; Grant, James; Müller, Eike Hermann
2018-03-01
Developers of Molecular Dynamics (MD) codes face significant challenges when adapting existing simulation packages to new hardware. In a continuously diversifying hardware landscape it becomes increasingly difficult for scientists to be experts both in their own domain (physics/chemistry/biology) and specialists in the low level parallelisation and optimisation of their codes. To address this challenge, we describe a "Separation of Concerns" approach for the development of parallel and optimised MD codes: the science specialist writes code at a high abstraction level in a domain specific language (DSL), which is then translated into efficient computer code by a scientific programmer. In a related context, an abstraction for the solution of partial differential equations with grid based methods has recently been implemented in the (Py)OP2 library. Inspired by this approach, we develop a Python code generation system for molecular dynamics simulations on different parallel architectures, including massively parallel distributed memory systems and GPUs. We demonstrate the efficiency of the auto-generated code by studying its performance and scalability on different hardware and compare it to other state-of-the-art simulation packages. With growing data volumes the extraction of physically meaningful information from the simulation becomes increasingly challenging and requires equally efficient implementations. A particular advantage of our approach is the easy expression of such analysis algorithms. We consider two popular methods for deducing the crystalline structure of a material from the local environment of each atom, show how they can be expressed in our abstraction and implement them in the code generation framework.
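The "Separation of Concerns" idea can be caricatured in a few lines of Python (an illustration of the concept, not the authors' framework or DSL): the domain scientist writes only a per-pair kernel, while a generic driver owns the loop structure and could later be replaced by generated, hardware-specific code without touching the kernel.

```python
# Toy illustration of the separation-of-concerns idea (not the authors' framework):
# the domain scientist supplies only a per-pair kernel; the driver owns the loops and
# could be swapped for an optimized/parallel implementation without touching the kernel.
import numpy as np

def lj_kernel(r_ij, eps=1.0, sigma=1.0):
    """Per-pair 'science code': Lennard-Jones potential energy for one pair."""
    sr6 = (sigma / r_ij) ** 6
    return 4.0 * eps * (sr6 * sr6 - sr6)

def pairwise_driver(positions, kernel):
    """Generic 'framework code': applies the kernel over all unique pairs."""
    n, total = len(positions), 0.0
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(positions[i] - positions[j])
            total += kernel(r)
    return total

pos = np.random.default_rng(1).random((50, 3)) * 5.0
print("total pair energy:", pairwise_driver(pos, lj_kernel))
```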
Efficiency turns the table on neural encoding, decoding and noise.
Deneve, Sophie; Chalk, Matthew
2016-04-01
Sensory neurons are usually described with an encoding model, for example, a function that predicts their response from the sensory stimulus using a receptive field (RF) or a tuning curve. However, central to theories of sensory processing is the notion of 'efficient coding'. We argue here that efficient coding implies a completely different neural coding strategy. Instead of a fixed encoding model, neural populations would be described by a fixed decoding model (i.e. a model reconstructing the stimulus from the neural responses). Because the population solves a global optimization problem, individual neurons are variable, but not noisy, and have no truly invariant tuning curve or receptive field. We review recent experimental evidence and implications for neural noise correlations, robustness and adaptation. Copyright © 2016. Published by Elsevier Ltd.
Development of the 3DHZETRN code for space radiation protection
NASA Astrophysics Data System (ADS)
Wilson, John; Badavi, Francis; Slaba, Tony; Reddell, Brandon; Bahadori, Amir; Singleterry, Robert
Space radiation protection requires computationally efficient shield assessment methods that have been verified and validated. The HZETRN code is the engineering design code used for low Earth orbit dosimetric analysis and astronaut record keeping with end-to-end validation to twenty percent in Space Shuttle and International Space Station operations. HZETRN treated diffusive leakage only at the distal surface limiting its application to systems with a large radius of curvature. A revision of HZETRN that included forward and backward diffusion allowed neutron leakage to be evaluated at both the near and distal surfaces. That revision provided a deterministic code of high computational efficiency that was in substantial agreement with Monte Carlo (MC) codes in flat plates (at least to the degree that MC codes agree among themselves). In the present paper, the 3DHZETRN formalism capable of evaluation in general geometry is described. Benchmarking will help quantify uncertainty with MC codes (Geant4, FLUKA, MCNP6, and PHITS) in simple shapes such as spheres within spherical shells and boxes. Connection of the 3DHZETRN to general geometry will be discussed.
A Fast Optimization Method for General Binary Code Learning.
Shen, Fumin; Zhou, Xiang; Yang, Yang; Song, Jingkuan; Shen, Heng; Tao, Dacheng
2016-09-22
Hashing or binary code learning has been recognized to accomplish efficient near neighbor search, and has thus attracted broad interest in recent retrieval, vision and learning studies. One main challenge of learning to hash arises from the involvement of discrete variables in binary code optimization. While the widely used continuous relaxation may achieve high learning efficiency, the pursued codes are typically less effective due to accumulated quantization error. In this work, we propose a novel binary code optimization method, dubbed Discrete Proximal Linearized Minimization (DPLM), which directly handles the discrete constraints during the learning process. Specifically, the discrete (thus nonsmooth nonconvex) problem is reformulated as minimizing the sum of a smooth loss term with a nonsmooth indicator function. The obtained problem is then efficiently solved by an iterative procedure with each iteration admitting an analytical discrete solution, which is thus shown to converge very fast. In addition, the proposed method supports a large family of empirical loss functions, which is particularly instantiated in this work by both a supervised and an unsupervised hashing loss, together with the bit uncorrelation and balance constraints. In particular, the proposed DPLM with a supervised ℓ2 loss encodes the whole NUS-WIDE database into 64-bit binary codes within 10 seconds on a standard desktop computer. The proposed approach is extensively evaluated on several large-scale datasets and the generated binary codes are shown to achieve very promising results on both retrieval and classification tasks.
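The flavor of such a discrete update, where each iteration's solution is an elementwise sign operation, can be sketched as follows; this is a simplified toy objective, not the paper's exact DPLM formulation or losses.

```python
# Simplified sketch of the sign-based discrete update idea (not the paper's exact DPLM):
# binary codes B in {-1,+1}^(n x r) are refined directly, each step being a gradient
# move on a smooth loss followed by an elementwise sign (the analytical discrete step).
import numpy as np

rng = np.random.default_rng(0)
n, r = 200, 16
labels = rng.integers(0, 4, size=n)
S = np.where(labels[:, None] == labels[None, :], 1.0, -1.0)  # pairwise similarity targets

B = np.sign(rng.standard_normal((n, r)))  # random initial codes
step = 1e-3
for _ in range(50):
    residual = (B @ B.T) / r - S          # smooth loss: ||B B^T / r - S||_F^2
    grad = 4.0 / r * residual @ B         # gradient of the smooth term w.r.t. B
    B = np.sign(B - step * grad)          # discrete step: project onto {-1, +1}
    B[B == 0] = 1.0                        # break exact ties deterministically

loss = np.linalg.norm((B @ B.T) / r - S) ** 2
print("final loss:", round(loss, 2))
```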
Automated Diagnosis Coding with Combined Text Representations.
Berndorfer, Stefan; Henriksson, Aron
2017-01-01
Automated diagnosis coding can be provided efficiently by learning predictive models from historical data; however, discriminating between thousands of codes while allowing a variable number of codes to be assigned is extremely difficult. Here, we explore various text representations and classification models for assigning ICD-9 codes to discharge summaries in MIMIC-III. It is shown that the relative effectiveness of the investigated representations depends on the frequency of the diagnosis code under consideration and that the best performance is obtained by combining models built using different representations.
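One of the simpler baselines such a comparison might include is a bag-of-words/TF-IDF representation feeding a one-vs-rest linear classifier for multi-label code assignment; the sketch below uses a few made-up discharge summaries and hypothetical ICD-9 labels rather than MIMIC-III.

```python
# Minimal sketch of a baseline setup (tiny made-up summaries, not MIMIC-III): TF-IDF
# text representation plus a one-vs-rest linear classifier for multi-label ICD coding.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.preprocessing import MultiLabelBinarizer

summaries = [
    "acute myocardial infarction treated with PCI",
    "community acquired pneumonia, started antibiotics",
    "type 2 diabetes mellitus with hyperglycemia",
    "chest pain ruled in for myocardial infarction",
]
codes = [["410.91"], ["486"], ["250.00"], ["410.91"]]   # hypothetical ICD-9 labels

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(codes)                            # multi-label indicator matrix
X = TfidfVectorizer(ngram_range=(1, 2)).fit_transform(summaries)

clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)
pred = clf.predict(X[:1])
print(mlb.inverse_transform(pred))                      # codes predicted for the first summary
```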
City Reach Code Technical Support Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Athalye, Rahul A.; Chen, Yan; Zhang, Jian
This report describes and analyzes a set of energy efficiency measures that will save 20% energy over ASHRAE Standard 90.1-2013. The measures will be used to formulate a Reach Code for cities aiming to go beyond national model energy codes. A coalition of U.S. cities together with other stakeholders wanted to facilitate the development of voluntary guidelines and standards that can be implemented in stages at the city level to improve building energy efficiency. The coalition's efforts are being supported by the U.S. Department of Energy via Pacific Northwest National Laboratory (PNNL) and in collaboration with the New Buildings Institute.
Statistical physics inspired energy-efficient coded-modulation for optical communications.
Djordjevic, Ivan B; Xu, Lei; Wang, Ting
2012-04-15
Because Shannon's entropy can be obtained by Stirling's approximation of thermodynamics entropy, the statistical physics energy minimization methods are directly applicable to the signal constellation design. We demonstrate that statistical physics inspired energy-efficient (EE) signal constellation designs, in combination with large-girth low-density parity-check (LDPC) codes, significantly outperform conventional LDPC-coded polarization-division multiplexed quadrature amplitude modulation schemes. We also describe an EE signal constellation design algorithm. Finally, we propose the discrete-time implementation of D-dimensional transceiver and corresponding EE polarization-division multiplexed system. © 2012 Optical Society of America
Efficient Type Representation in TAL
NASA Technical Reports Server (NTRS)
Chen, Juan
2009-01-01
Certifying compilers generate proofs for low-level code that guarantee safety properties of the code. Type information is an essential part of safety proofs. But the size of type information remains a concern for certifying compilers in practice. This paper demonstrates type representation techniques in a large-scale compiler that achieves both concise type information and efficient type checking. In our 200,000-line certifying compiler, the size of type information is about 36% of the size of pure code and data for our benchmarks, the best result to the best of our knowledge. The type checking time is about 2% of the compilation time.
Network Coding on Heterogeneous Multi-Core Processors for Wireless Sensor Networks
Kim, Deokho; Park, Karam; Ro, Won W.
2011-01-01
While network coding is well known for its efficiency and usefulness in wireless sensor networks, the excessive costs associated with decoding computation and complexity still hinder its adoption into practical use. On the other hand, high-performance microprocessors with heterogeneous multi-cores would be used as processing nodes of wireless sensor networks in the near future. To this end, this paper introduces an efficient network coding algorithm developed for heterogeneous multi-core processors. The proposed idea is fully tested on one of the currently available heterogeneous multi-core processors, referred to as the Cell Broadband Engine. PMID:22164053
Efficient biprediction decision scheme for fast high efficiency video coding encoding
NASA Astrophysics Data System (ADS)
Park, Sang-hyo; Lee, Seung-ho; Jang, Euee S.; Jun, Dongsan; Kang, Jung-Won
2016-11-01
An efficient biprediction decision scheme for high efficiency video coding (HEVC) is proposed for fast-encoding applications. For low-delay video applications, bidirectional prediction can be used to increase compression performance efficiently with previous reference frames. However, at the same time, the computational complexity of the HEVC encoder is significantly increased due to the additional biprediction search. Although some research has attempted to reduce this complexity, whether the prediction is strongly related to both motion complexity and prediction modes in a coding unit has not yet been investigated. A method that avoids most compression-inefficient search points is proposed so that the computational complexity of the motion estimation process can be dramatically decreased. To determine whether biprediction is critical, the proposed method exploits the stochastic correlation of the context of prediction units (PUs): the direction of a PU and the accuracy of a motion vector. Experimental results show that the proposed method can reduce the time complexity of biprediction to 30% on average, outperforming existing methods in terms of encoding time, number of function calls, and memory access.
Phenotypic constraints promote latent versatility and carbon efficiency in metabolic networks.
Bardoscia, Marco; Marsili, Matteo; Samal, Areejit
2015-07-01
System-level properties of metabolic networks may be the direct product of natural selection or arise as a by-product of selection on other properties. Here we study the effect of direct selective pressure for growth or viability in particular environments on two properties of metabolic networks: latent versatility to function in additional environments and carbon usage efficiency. Using a Markov chain Monte Carlo (MCMC) sampling based on flux balance analysis (FBA), we sample from a known biochemical universe random viable metabolic networks that differ in the number of directly constrained environments. We find that the latent versatility of sampled metabolic networks increases with the number of directly constrained environments and with the size of the networks. We then show that the average carbon wastage of sampled metabolic networks across the constrained environments decreases with the number of directly constrained environments and with the size of the networks. Our work expands the growing body of evidence about nonadaptive origins of key functional properties of biological networks.
Fast Sparse Coding for Range Data Denoising with Sparse Ridges Constraint.
Gao, Zhi; Lao, Mingjie; Sang, Yongsheng; Wen, Fei; Ramesh, Bharath; Zhai, Ruifang
2018-05-06
Light detection and ranging (LiDAR) sensors have been widely deployed on intelligent systems such as unmanned ground vehicles (UGVs) and unmanned aerial vehicles (UAVs) to perform localization, obstacle detection, and navigation tasks. Thus, research into range data processing with competitive performance in terms of both accuracy and efficiency has attracted increasing attention. Sparse coding has revolutionized signal processing and led to state-of-the-art performance in a variety of applications. However, dictionary learning, which plays the central role in sparse coding techniques, is computationally demanding, resulting in its limited applicability in real-time systems. In this study, we propose sparse coding algorithms with a fixed pre-learned ridge dictionary to realize range data denoising via leveraging the regularity of laser range measurements in man-made environments. Experiments on both synthesized data and real data demonstrate that our method obtains accuracy comparable to that of sophisticated sparse coding methods, but with much higher computational efficiency.
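Using a fixed, pre-specified dictionary moves all the cost into the sparse decomposition step. The sketch below illustrates this on a 1-D toy signal with a hand-built dictionary of constant, ramp and low-frequency atoms (a simple stand-in for the paper's ridge dictionary, not its actual construction), using orthogonal matching pursuit to select a few atoms per window.

```python
# Small sketch of denoising with a fixed, pre-specified dictionary (a stand-in for the
# paper's ridge dictionary), using OMP to pick a few atoms per noisy signal window.
import numpy as np
from sklearn.decomposition import SparseCoder

n = 64
t = np.linspace(0.0, 1.0, n)
# Fixed dictionary: constant, ramps, and a few low-frequency cosines (rows = atoms).
atoms = [np.ones(n), t, 1.0 - t] + [np.cos(np.pi * k * t) for k in range(1, 6)]
D = np.array([a / np.linalg.norm(a) for a in atoms])

clean = 2.0 * t + 0.5 * np.cos(2 * np.pi * t)       # piecewise-smooth "range profile"
noisy = clean + 0.1 * np.random.default_rng(0).standard_normal(n)

coder = SparseCoder(dictionary=D, transform_algorithm="omp", transform_n_nonzero_coefs=3)
codes = coder.transform(noisy.reshape(1, -1))        # sparse coefficients w.r.t. fixed D
denoised = (codes @ D).ravel()

# Ratio > 1 means the sparse reconstruction is closer to the clean signal than the input.
print("error ratio (noisy/denoised):",
      np.linalg.norm(noisy - clean) / np.linalg.norm(denoised - clean))
```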
From Physics Model to Results: An Optimizing Framework for Cross-Architecture Code Generation
Blazewicz, Marek; Hinder, Ian; Koppelman, David M.; ...
2013-01-01
Starting from a high-level problem description in terms of partial differential equations using abstract tensor notation, the Chemora framework discretizes, optimizes, and generates complete high performance codes for a wide range of compute architectures. Chemora extends the capabilities of Cactus, facilitating the usage of large-scale CPU/GPU systems in an efficient manner for complex applications, without low-level code tuning. Chemora achieves parallelism through MPI and multi-threading, combining OpenMP and CUDA. Optimizations include high-level code transformations, efficient loop traversal strategies, dynamically selected data and instruction cache usage strategies, and JIT compilation of GPU code tailored to the problem characteristics. The discretization is based on higher-order finite differences on multi-block domains. Chemora's capabilities are demonstrated by simulations of black hole collisions. This problem provides an acid test of the framework, as the Einstein equations contain hundreds of variables and thousands of terms.
Efficient Modeling of Laser-Plasma Accelerators with INF and RNO
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benedetti, C.; Schroeder, C. B.; Esarey, E.
2010-11-04
The numerical modeling code INF and RNO (INtegrated Fluid and paRticle simulatioN cOde, pronounced 'inferno') is presented. INF and RNO is an efficient 2D cylindrical code to model the interaction of a short laser pulse with an underdense plasma. The code is based on an envelope model for the laser while either a PIC or a fluid description can be used for the plasma. The effect of the laser pulse on the plasma is modeled with the time-averaged ponderomotive force. These and other features allow for a speedup of 2-4 orders of magnitude compared to standard full PIC simulations while still retaining physical fidelity. The code has been benchmarked against analytical solutions and 3D PIC simulations, and here a set of validation tests, together with a discussion of performance, is presented.
Steffensen, Jon Lund; Dufault-Thompson, Keith; Zhang, Ying
2018-01-01
The metabolism of individual organisms and biological communities can be viewed as a network of metabolites connected to each other through chemical reactions. In metabolic networks, chemical reactions transform reactants into products, thereby transferring elements between these metabolites. Knowledge of how elements are transferred through reactant/product pairs allows for the identification of primary compound connections through a metabolic network. However, such information is not readily available and is often challenging to obtain for large reaction databases or genome-scale metabolic models. In this study, a new algorithm was developed for automatically predicting the element-transferring reactant/product pairs using the limited information available in the standard representation of metabolic networks. The algorithm demonstrated high efficiency in analyzing large datasets and provided accurate predictions when benchmarked with manually curated data. Applying the algorithm to the visualization of metabolic networks highlighted pathways of primary reactant/product connections and provided an organized view of element-transferring biochemical transformations. The algorithm was implemented as a new function in the open source software package PSAMM in the release v0.30 (https://zhanglab.github.io/psamm/).
Metabolic Shift of Escherichia coli under Salt Stress in the Presence of Glycine Betaine
Metris, A.; George, S. M.; Mulholland, F.; Carter, A. T.
2014-01-01
An important area of food safety focuses on bacterial survival and growth in unfavorable environments. In order to understand how bacteria adapt to stresses other than nutrient limitation in batch cultures, we need to develop mechanistic models of intracellular regulation and metabolism under stress. We studied the growth of Escherichia coli in minimal medium with added salt and different osmoprotectants. To characterize the metabolic efficiency with a robust parameter, we identified the optical density (OD) values at the inflection points of measured “OD versus time” growth curves and described them as a function of glucose concentration. We found that the metabolic efficiency parameter did not necessarily follow the trend of decreasing specific growth rate as the salt concentration increased. In the absence of osmoprotectant, or in the presence of proline, the metabolic efficiency decreased with increasing NaCl concentration. However, in the presence of choline or glycine betaine, it increased between 2 and 4.5% NaCl before declining at 5% NaCl and above. Microarray analysis of the transcriptional network and proteomics analysis with glycine betaine in the medium indicated that between 4.5 and 5% NaCl, the metabolism switched from aerobic to fermentative pathways and that the response to osmotic stress is similar to that for oxidative stress. We conclude that, although the growth rate appeared to decrease smoothly with increasing NaCl, the metabolic strategy of cells changed abruptly at a threshold concentration of NaCl. PMID:24858086
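As a hedged illustration of the growth-curve analysis described above, the sketch below fits a logistic curve to a synthetic OD-versus-time series and reports the OD at its inflection point; the functional form, parameters, and data are assumptions for demonstration, not the authors' fitting procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, od_max, rate, t_mid):
    """Symmetric logistic growth curve; its inflection lies at t = t_mid,
    where OD equals od_max / 2."""
    return od_max / (1.0 + np.exp(-rate * (t - t_mid)))

# Hypothetical OD-versus-time readings (hours, OD600)
t = np.linspace(0, 24, 25)
od = logistic(t, 1.2, 0.5, 10.0) + np.random.default_rng(0).normal(0, 0.01, t.size)

(od_max, rate, t_mid), _ = curve_fit(logistic, t, od, p0=[1.0, 0.3, 8.0])
print(f"OD at inflection point: {od_max / 2:.3f} (t = {t_mid:.1f} h)")
```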
Martínez, R; Juncal, J; Zaldívar, C; Arenal, A; Guillén, I; Morera, V; Carrillo, O; Estrada, M; Morales, A; Estrada, M P
2000-01-07
Growth hormone (GH) has been shown to have a profound impact on fish physiology and metabolism. However, detailed studies in transgenic fish have not been conducted. We have characterized the food conversion efficiency, protein profile, and biochemical correlates of growth rate in transgenic tilapia expressing the tilapia GH cDNA under the control of human cytomegalovirus regulatory sequences. Transgenic tilapia exhibited about 3.6-fold less food consumption than nontransgenic controls (P < 0.001). The food conversion efficiency was significantly (P < 0.05) higher (290%) in transgenic tilapia (2.3 +/- 0.4) than in the control group (0.8 +/- 0.2). Efficiency of growth, synthesis retention, anabolic stimulation, and average protein synthesis were higher in transgenic than in nontransgenic tilapia. Distinctive metabolic differences were found in transgenic juvenile tilapia. We found differences in hepatic glucose and, in agreement with previous results, differences in the levels of enzymatic activities in target organs. We conclude that GH-transgenic juvenile tilapia show altered physiological and metabolic conditions and are biologically more efficient. Copyright 2000 Academic Press.
Improvements in the MGA Code Provide Flexibility and Better Error Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruhter, W D; Kerr, J
2005-05-26
The Multi-Group Analysis (MGA) code is widely used to determine nondestructively the relative isotopic abundances of plutonium by gamma-ray spectrometry. MGA users have expressed concern about the lack of flexibility and transparency in the code. Users often have to ask the code developers for modifications to the code to accommodate new measurement situations, such as additional peaks being present in the plutonium spectrum or expected peaks being absent. We are testing several new improvements to a prototype, general gamma-ray isotopic analysis tool with the intent of either revising or replacing the MGA code. These improvements will give the user the ability to modify, add, or delete the gamma- and x-ray energies and branching intensities used by the code in determining a more precise gain and in the determination of the relative detection efficiency. We have also fully integrated the determination of the relative isotopic abundances with the determination of the relative detection efficiency to provide a more accurate determination of the errors in the relative isotopic abundances. We provide details in this paper on these improvements and a comparison of results obtained with current versions of the MGA code.
Amino acid fermentation at the origin of the genetic code.
de Vladar, Harold P
2012-02-10
There is evidence that the genetic code was established prior to the existence of proteins, when metabolism was powered by ribozymes. Also, early proto-organisms had to rely on simple anaerobic bioenergetic processes. In this work I propose that amino acid fermentation powered metabolism in the RNA world, and that this was facilitated by proto-adapters, the precursors of the tRNAs. Amino acids were used as carbon sources rather than as catalytic or structural elements. In modern bacteria, amino acid fermentation is known as the Stickland reaction. This pathway involves two amino acids: the first undergoes oxidative deamination, and the second acts as an electron acceptor through reductive deamination. This redox reaction results in two keto acids that are employed to synthesise ATP via substrate-level phosphorylation. The Stickland reaction is the basic bioenergetic pathway of some bacteria of the genus Clostridium. Two other facts support Stickland fermentation in the RNA world. First, several Stickland amino acid pairs are synthesised in abiotic amino acid synthesis. This suggests that amino acids that could be used as an energy substrate were freely available. Second, anticodons that have complementary sequences often correspond to amino acids that form Stickland pairs. The main hypothesis of this paper is that pairs of complementary proto-adapters were assigned to Stickland amino acid pairs. There are signatures of this hypothesis in the genetic code. Furthermore, it is argued that the proto-adapters formed double strands that brought amino acid pairs into proximity to facilitate their mutual redox reaction, structurally constraining the anticodon pairs that are assigned to these amino acid pairs. Significance tests which randomise the code are performed to study the extent of the variability of the energetic (ATP) yield. Random assignments can lead to a substantial yield of ATP and maintain enough variability, so that selection can act and refine the assignments into a proto-code that optimises the energetic yield. Monte Carlo simulations are performed to evaluate the establishment of these simple proto-codes, based on amino acid substitutions and codon swapping. In all cases, donor amino acids are assigned to anticodons composed of U+G, and have low redundancy (1-2 codons), whereas acceptor amino acids are assigned to the remaining codons. These bioenergetic and structural constraints allow for a metabolic role for amino acids before their co-option as catalyst cofactors.
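A minimal sketch of the kind of randomization test described (assignments permuted, ATP yield recomputed); the yield table, anticodon pairs, and canonical-like assignment below are invented placeholders, not values from the paper.

```python
import random

# Toy energetic yields (ATP per fermentation event) for hypothetical
# donor/acceptor amino-acid pairs; values are illustrative only.
stickland_yield = {("Ala", "Gly"): 1.0, ("Leu", "Pro"): 0.5, ("Val", "Gly"): 1.0}
amino_acids = ["Ala", "Gly", "Leu", "Pro", "Val"]
anticodon_pairs = [("UGC", "GCA"), ("UAG", "CUA"), ("UAC", "GUA")]  # complementary pairs

def yield_of(assignment):
    """Total ATP yield when complementary anticodons carry a Stickland pair."""
    total = 0.0
    for a1, a2 in anticodon_pairs:
        pair = (assignment[a1], assignment[a2])
        total += stickland_yield.get(pair, 0.0) + stickland_yield.get(pair[::-1], 0.0)
    return total

def random_assignment(rng):
    """Randomly reassign amino acids to the anticodons (the null model)."""
    codons = [c for pair in anticodon_pairs for c in pair]
    return dict(zip(codons, rng.choices(amino_acids, k=len(codons))))

rng = random.Random(42)
observed = yield_of({"UGC": "Ala", "GCA": "Gly", "UAG": "Leu",
                     "CUA": "Pro", "UAC": "Val", "GUA": "Gly"})
null = [yield_of(random_assignment(rng)) for _ in range(10_000)]
p_value = sum(y >= observed for y in null) / len(null)
print(f"observed yield = {observed}, randomization p = {p_value:.3f}")
```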
Corcoran, Callan C; Grady, Cameron R; Pisitkun, Trairak; Parulekar, Jaya; Knepper, Mark A
2017-03-01
The organization of the mammalian genome into gene subsets corresponding to specific functional classes has provided key tools for systems biology research. Here, we have created a web-accessible resource called the Mammalian Metabolic Enzyme Database ( https://hpcwebapps.cit.nih.gov/ESBL/Database/MetabolicEnzymes/MetabolicEnzymeDatabase.html) keyed to the biochemical reactions represented on iconic metabolic pathway wall charts created in the previous century. Overall, we have mapped 1,647 genes to these pathways, representing ~7 percent of the protein-coding genome. To illustrate the use of the database, we apply it to the area of kidney physiology. In so doing, we have created an additional database ( Database of Metabolic Enzymes in Kidney Tubule Segments: https://hpcwebapps.cit.nih.gov/ESBL/Database/MetabolicEnzymes/), mapping mRNA abundance measurements (mined from RNA-Seq studies) for all metabolic enzymes to each of 14 renal tubule segments. We carry out bioinformatics analysis of the enzyme expression pattern among renal tubule segments and mine various data sources to identify vasopressin-regulated metabolic enzymes in the renal collecting duct. Copyright © 2017 the American Physiological Society.
Computer-assisted coding and clinical documentation: first things first.
Tully, Melinda; Carmichael, Angela
2012-10-01
Computer-assisted coding tools have the potential to drive improvements in seven areas: transparency of coding; productivity (generally by 20 to 25 percent for inpatient claims); accuracy (by improving specificity of documentation); cost containment (by reducing overtime expenses, audit fees, and denials); compliance; efficiency; and consistency.
Efficiency transfer using the GEANT4 code of CERN for HPGe gamma spectrometry.
Chagren, S; Tekaya, M Ben; Reguigui, N; Gharbi, F
2016-01-01
In this work we apply the GEANT4 code of CERN to calculate the peak efficiency in High Purity Germanium (HPGe) gamma spectrometry using three different procedures. The first is a direct calculation. The second corresponds to the usual case of efficiency transfer between two different configurations at constant emission energy, assuming a reference point detection configuration. The third, a new procedure, consists of transferring the peak efficiency between two detection configurations emitting the gamma ray at different energies, assuming a "virtual" reference point detection configuration. No pre-optimization of the detector's geometrical characteristics was performed before the transfer, in order to test the ability of the efficiency transfer to reduce the effect of uncertainty in their real magnitude on the quality of the transferred efficiency. The calculated and measured efficiencies were found to be in good agreement for the two investigated methods of efficiency transfer. The agreement obtained shows that the Monte Carlo method, and especially the GEANT4 code, constitutes an efficient tool for obtaining accurate detection efficiency values. The second investigated efficiency transfer procedure is useful for calibrating the HPGe gamma detector at any emission energy for a voluminous source, using a point-source detection efficiency at a different energy as the reference efficiency. The calculations performed in this work were applied to the measurement exercise of the EUROMET428 project, in which the full energy peak efficiencies in the energy range 60-2000 keV were evaluated for a typical coaxial p-type HPGe detector and several types of source configuration: point sources located at various distances from the detector and a cylindrical box containing three matrices. Copyright © 2015 Elsevier Ltd. All rights reserved.
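The efficiency-transfer relation underlying both procedures can be written as eff_target ≈ eff_ref,measured × (eff_target,MC / eff_ref,MC); the sketch below applies it with hypothetical numbers (the function name and values are illustrative, not from the EUROMET428 exercise).

```python
def transfer_efficiency(eff_ref_measured, eff_ref_simulated, eff_target_simulated):
    """Classic efficiency-transfer relation:
        eff_target ~ eff_ref_measured * (eff_target_simulated / eff_ref_simulated)
    The two simulated efficiencies (e.g. from GEANT4) may correspond to
    different geometries and, in the extended scheme described above,
    even to different emission energies."""
    return eff_ref_measured * eff_target_simulated / eff_ref_simulated

# Hypothetical numbers: point source at 661.7 keV as reference,
# cylindrical box source at 1332.5 keV as the target geometry/energy.
eff_point_measured = 2.1e-2     # measured reference efficiency
eff_point_mc = 2.0e-2           # simulated reference efficiency
eff_box_mc = 7.5e-3             # simulated target efficiency
result = transfer_efficiency(eff_point_measured, eff_point_mc, eff_box_mc)
print(f"transferred efficiency: {result:.3e}")
```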
Molecular cancer classification using a meta-sample-based regularized robust coding method.
Wang, Shu-Lin; Sun, Liuchao; Fang, Jianwen
2014-01-01
Previous studies have demonstrated that machine learning based molecular cancer classification using gene expression profiling (GEP) data is promising for the clinical diagnosis and treatment of cancer. Novel classification methods with high efficiency and prediction accuracy are still needed to deal with the high dimensionality and small sample size of typical GEP data. Recently the sparse representation (SR) method has been successfully applied to cancer classification. Nevertheless, its efficiency needs to be improved when analyzing large-scale GEP data. In this paper we present the meta-sample-based regularized robust coding classification (MRRCC), a novel effective cancer classification technique that combines the idea of the meta-sample-based clustering method with the regularized robust coding (RRC) method. It assumes that the coding residual and the coding coefficient are respectively independent and identically distributed. Similar to meta-sample-based SR classification (MSRC), MRRCC extracts a set of meta-samples from the training samples, and then encodes a testing sample as the sparse linear combination of these meta-samples. The representation fidelity is measured by the l2-norm or l1-norm of the coding residual. Extensive experiments on publicly available GEP datasets demonstrate that the proposed method is more efficient while its prediction accuracy is equivalent to existing MSRC-based methods and better than other state-of-the-art dimension reduction based methods.
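A simplified sketch of residual-based classification over per-class meta-samples, assuming the meta-samples are taken as truncated SVD components and using only an l2 residual with ridge-regularized coding; this illustrates the general idea, not the authors' MRRCC implementation.

```python
import numpy as np

def meta_samples(X_class, k=3):
    """Per-class meta-samples: the top-k left singular vectors of the
    class's training matrix (genes x samples)."""
    u, _, _ = np.linalg.svd(X_class, full_matrices=False)
    return u[:, :k]

def classify(x, class_metas, lam=0.1):
    """Assign x to the class whose meta-samples reconstruct it with the
    smallest l2 residual, using ridge-regularized coding coefficients."""
    residuals = {}
    for label, M in class_metas.items():
        coef = np.linalg.solve(M.T @ M + lam * np.eye(M.shape[1]), M.T @ x)
        residuals[label] = np.linalg.norm(x - M @ coef)
    return min(residuals, key=residuals.get)

# Hypothetical expression data: 200 genes, two classes of 20 samples each
rng = np.random.default_rng(1)
X_a = rng.normal(0.0, 1.0, (200, 20))
X_b = rng.normal(0.5, 1.0, (200, 20))
metas = {"class_A": meta_samples(X_a), "class_B": meta_samples(X_b)}
test = rng.normal(0.5, 1.0, 200)   # drawn from the class-B distribution
print(classify(test, metas))
```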
The feasibility of QR-code prescription in Taiwan.
Lin, C-H; Tsai, F-Y; Tsai, W-L; Wen, H-W; Hu, M-L
2012-12-01
An ideal health care service is a service system that focuses on patients. Patients in Taiwan have the freedom to fill their prescriptions at any pharmacy contracted with National Health Insurance. Each of these pharmacies uses its own computer system; so far, there are at least ten different systems on the market in Taiwan. Transmitting prescription information from the hospital to the pharmacy accurately and efficiently is therefore a major issue. This study consisted of a two-dimensional barcode application using a QR-code to capture the patient's identification and prescription information from the hospitals, together with a webcam to read the QR-code and transfer all data to the pharmacy computer system. Two hospitals and 85 community pharmacies participated in the study. During the trial, all participating pharmacies praised the accurate transmission of the prescription information. The contents of QR-code prescriptions from the Taipei area were read efficiently and accurately by pharmacies in the Taichung area (central Taiwan), regardless of the software system used or the location. The QR-code device received a patent (No. M376844, March 2010) from the Intellectual Property Office, Ministry of Economic Affairs. Our trial has shown that QR-code prescriptions can provide community pharmacists with an efficient, accurate and inexpensive means of digitalizing prescription contents. Consequently, pharmacists can offer a better quality of pharmacy service to patients. © 2012 Blackwell Publishing Ltd.
Winslow, Luke; Zwart, Jacob A.; Batt, Ryan D.; Dugan, Hilary; Woolway, R. Iestyn; Corman, Jessica; Hanson, Paul C.; Read, Jordan S.
2016-01-01
Metabolism is a fundamental process in ecosystems that crosses multiple scales of organization from individual organisms to whole ecosystems. To improve sharing and reuse of published metabolism models, we developed LakeMetabolizer, an R package for estimating lake metabolism from in situ time series of dissolved oxygen, water temperature, and, optionally, additional environmental variables. LakeMetabolizer implements 5 different metabolism models with diverse statistical underpinnings: bookkeeping, ordinary least squares, maximum likelihood, Kalman filter, and Bayesian. Each of these 5 metabolism models can be combined with 1 of 7 models for computing the coefficient of gas exchange across the air–water interface (k). LakeMetabolizer also features a variety of supporting functions that compute conversions and implement calculations commonly applied to raw data prior to estimating metabolism (e.g., oxygen saturation and optical conversion models). These tools have been organized into an R package that contains example data, example use-cases, and function documentation. The release package version is available on the Comprehensive R Archive Network (CRAN), and the full open-source GPL-licensed code is freely available for examination and extension online. With this unified, open-source, and freely available package, we hope to improve access and facilitate the application of metabolism in studies and management of lentic ecosystems.
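LakeMetabolizer itself is an R package; as a hedged illustration of the simplest of the five approaches (the bookkeeping method), the Python sketch below attributes night-time dissolved-oxygen decline to respiration and daytime change to net production, ignoring gas exchange. The data and simplifications are hypothetical and not the package's API.

```python
import numpy as np

def bookkeeping_metabolism(do_mg_l, is_day, dt_hours=1.0):
    """Crude bookkeeping estimate of daily lake metabolism from a dissolved
    oxygen time series. Night-time DO changes are attributed to respiration
    (R); daytime changes are NEP, so GPP = NEP + R. Gas exchange across the
    air-water interface is ignored in this illustration."""
    d_do = np.diff(do_mg_l) / dt_hours              # mg O2 L^-1 h^-1
    day = np.asarray(is_day[1:], dtype=bool)
    r_hourly = -d_do[~day].mean()                   # positive consumption rate
    nep_day = d_do[day].mean()
    hours_light = day.sum() * dt_hours
    resp_daily = r_hourly * 24.0
    gpp_daily = (nep_day + r_hourly) * hours_light
    return gpp_daily, resp_daily, gpp_daily - resp_daily

# Hypothetical hourly DO record for one day (12 h light, 12 h dark)
hours = np.arange(24)
is_day = (hours >= 6) & (hours < 18)
do = 8.0 + np.where(is_day, 0.05, -0.04).cumsum()
gpp, r, nep = bookkeeping_metabolism(do, is_day)
print(f"GPP={gpp:.2f}, R={r:.2f}, NEP={nep:.2f}  (mg O2 L^-1 d^-1)")
```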
Topics in quantum cryptography, quantum error correction, and channel simulation
NASA Astrophysics Data System (ADS)
Luo, Zhicheng
In this thesis, we mainly investigate four different topics: efficiently implementable codes for quantum key expansion [51], quantum error-correcting codes based on privacy amplification [48], private classical capacity of quantum channels [44], and classical channel simulation with quantum side information [49, 50]. For the first topic, we propose an efficiently implementable quantum key expansion protocol, capable of increasing the size of a pre-shared secret key by a constant factor. Previously, the Shor-Preskill proof [64] of the security of the Bennett-Brassard 1984 (BB84) [6] quantum key distribution protocol relied on the theoretical existence of good classical error-correcting codes with the "dual-containing" property. But the explicit and efficiently decodable construction of such codes is unknown. We show that we can lift the dual-containing constraint by employing the non-dual-containing codes with excellent performance and efficient decoding algorithms. For the second topic, we propose a construction of Calderbank-Shor-Steane (CSS) [19, 68] quantum error-correcting codes, which are originally based on pairs of mutually dual-containing classical codes, by combining a classical code with a two-universal hash function. We show, using the results of Renner and Koenig [57], that the communication rates of such codes approach the hashing bound on tensor powers of Pauli channels in the limit of large block-length. For the third topic, we prove a regularized formula for the secret key assisted capacity region of a quantum channel for transmitting private classical information. This result parallels the work of Devetak on entanglement assisted quantum communication capacity. This formula provides a new family protocol, the private father protocol, under the resource inequality framework that includes the private classical communication without the assisted secret keys as a child protocol. For the fourth topic, we study and solve the problem of classical channel simulation with quantum side information at the receiver. Our main theorem has two important corollaries: rate-distortion theory with quantum side information and common randomness distillation. Simple proofs of achievability of classical multi-terminal source coding problems can be made via a unified approach using the channel simulation theorem as building blocks. The fully quantum generalization of the problem is also conjectured with outer and inner bounds on the achievable rate pairs.
Metabolic enzyme cost explains variable trade-offs between microbial growth rate and yield
Ferris, Michael; Bruggeman, Frank J.
2018-01-01
Microbes may maximize the number of daughter cells per time or per amount of nutrients consumed. These two strategies correspond, respectively, to the use of enzyme-efficient or substrate-efficient metabolic pathways. In reality, fast growth is often associated with wasteful, yield-inefficient metabolism, and a general thermodynamic trade-off between growth rate and biomass yield has been proposed to explain this. We studied growth rate/yield trade-offs by using a novel modeling framework, Enzyme-Flux Cost Minimization (EFCM) and by assuming that the growth rate depends directly on the enzyme investment per rate of biomass production. In a comprehensive mathematical model of core metabolism in E. coli, we screened all elementary flux modes leading to cell synthesis, characterized them by the growth rates and yields they provide, and studied the shape of the resulting rate/yield Pareto front. By varying the model parameters, we found that the rate/yield trade-off is not universal, but depends on metabolic kinetics and environmental conditions. A prominent trade-off emerges under oxygen-limited growth, where yield-inefficient pathways support a 2-to-3 times higher growth rate than yield-efficient pathways. EFCM can be widely used to predict optimal metabolic states and growth rates under varying nutrient levels, perturbations of enzyme parameters, and single or multiple gene knockouts. PMID:29451895
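A small sketch of how a rate/yield Pareto front can be extracted from a set of flux modes characterized by (growth rate, yield) pairs; the mode values are invented and the routine illustrates the concept only, it is not part of EFCM.

```python
def pareto_front(points):
    """Return the subset of (rate, yield) points not dominated by any other
    point (a point dominates another if it is at least as good in both
    coordinates and strictly better in one)."""
    front = []
    for i, (r1, y1) in enumerate(points):
        dominated = any(r2 >= r1 and y2 >= y1 and (r2 > r1 or y2 > y1)
                        for j, (r2, y2) in enumerate(points) if j != i)
        if not dominated:
            front.append((r1, y1))
    return sorted(front)

# Hypothetical elementary flux modes: (growth rate h^-1, biomass yield g/g)
modes = [(0.9, 0.25), (0.7, 0.40), (0.5, 0.50), (0.6, 0.35), (0.3, 0.45)]
print(pareto_front(modes))   # -> [(0.5, 0.5), (0.7, 0.4), (0.9, 0.25)]
```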
Video streaming with SHVC to HEVC transcoding
NASA Astrophysics Data System (ADS)
Gudumasu, Srinivas; He, Yuwen; Ye, Yan; Xiu, Xiaoyu
2015-09-01
This paper proposes an efficient Scalable High Efficiency Video Coding (SHVC) to High Efficiency Video Coding (HEVC) transcoder, which can reduce the transcoding complexity significantly and provide a desired trade-off between the transcoding complexity and the transcoded video quality. To reduce the transcoding complexity, some of the coding information in the SHVC bitstream, such as coding unit (CU) depth, prediction mode, merge mode, motion vector information, intra direction information and transform unit (TU) depth information, is mapped and transcoded to the single-layer HEVC bitstream. One major difficulty in transcoding arises when trying to reuse the motion information from the SHVC bitstream, since motion vectors referring to inter-layer reference (ILR) pictures cannot be reused directly in transcoding. Reusing motion information obtained from ILR pictures for those prediction units (PUs) will reduce the complexity of the SHVC transcoder greatly, but a significant reduction in picture quality is observed. Pictures corresponding to the intra refresh pictures in the base layer (BL) will be coded as P pictures in the enhancement layer (EL) in the SHVC bitstream, and directly reusing the intra information from the BL for transcoding will not achieve good coding efficiency. To solve these problems, various transcoding technologies are proposed. The proposed technologies offer different trade-offs between transcoding speed and transcoding quality. They are implemented on the basis of the reference software SHM-6.0 and HM-14.0 for the two-layer spatial scalability configuration. Simulations show that the proposed SHVC software transcoder reduces the transcoding complexity by up to 98-99% using the low-complexity transcoding mode when compared with the cascaded re-encoding method. The transcoder performance at various bitrates with different transcoding modes is compared in terms of transcoding speed and transcoded video quality.
Bridging the gap between fluxomics and industrial biotechnology.
Feng, Xueyang; Page, Lawrence; Rubens, Jacob; Chircus, Lauren; Colletti, Peter; Pakrasi, Himadri B; Tang, Yinjie J
2010-01-01
Metabolic flux analysis is a vital tool used to determine the ultimate output of cellular metabolism and thus detect biotechnologically relevant bottlenecks in productivity. ¹³C-based metabolic flux analysis (¹³C-MFA) and flux balance analysis (FBA) have many potential applications in biotechnology. However, noteworthy hurdles in fluxomics studies are still present. First, several technical difficulties in both ¹³C-MFA and FBA severely limit the scope of fluxomics findings and the applicability of obtained metabolic information. Second, the complexity of metabolic regulation poses a great challenge for precise prediction and analysis of metabolic networks, as there are gaps between fluxomics results and other omics studies. Third, despite identified metabolic bottlenecks or sources of host stress from product synthesis, it remains difficult to overcome inherent metabolic robustness or to efficiently import and express nonnative pathways. Fourth, product yields often decrease as the number of enzymatic steps increases. Such a decrease in yield may not be caused by rate-limiting enzymes, but rather is accumulated through each enzymatic reaction. Fifth, a high-throughput fluxomics tool has not yet been developed for characterizing nonmodel microorganisms and maximizing their application in industrial biotechnology. Refining fluxomics tools and understanding these obstacles will improve our ability to engineer highly efficient metabolic pathways in microbial hosts.
Santos, José; Monteagudo, Ángel
2017-03-27
The canonical code, although prevailing in complex genomes, is not universal. The canonical genetic code has been shown to be more robust than random codes, but it is not clearly determined how it evolved towards its current form. The error minimization theory considers the minimization of the adverse effects of point mutations as the main selection factor in the evolution of the code. We have used simulated evolution in a computer to search for optimized codes, which helps to obtain information about the optimization level of the canonical code in its evolution. A genetic algorithm searches for efficient codes in a fitness landscape that corresponds to the adaptability of possible hypothetical genetic codes. The lower the effects of errors or mutations in the codon bases of a hypothetical code, the more efficient or optimal is that code. The inclusion of the fitness sharing technique in the evolutionary algorithm allows the extent to which the canonical genetic code is in an area corresponding to a deep local minimum to be easily determined, even in the high dimensional spaces considered. The analyses show that the canonical code is not in a deep local minimum and that the fitness landscape is not a multimodal fitness landscape with deep and separated peaks. Moreover, the canonical code is clearly far away from the areas of higher fitness in the landscape. Given the absence of deep local minima in the landscape, although the code could evolve and different forces could shape its structure, the fitness landscape nature considered in the error minimization theory does not explain why the canonical code ended its evolution in a location which is not an area of a localized deep minimum of the huge fitness landscape.
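A toy genetic algorithm with fitness sharing, included only to illustrate the niching technique mentioned above on a simple one-dimensional landscape; the fitness function, sharing radius, and other parameters are arbitrary and unrelated to the genetic-code landscape studied in the paper.

```python
import math
import random

def raw_fitness(x):
    """Multimodal toy landscape standing in for the adaptability of a
    hypothetical genetic code."""
    return math.sin(5 * x) ** 2 * math.exp(-x)

def shared_fitness(pop, sigma=0.1):
    """Fitness sharing: an individual's fitness is divided by its niche
    count, so crowded peaks are penalised and several optima can coexist."""
    out = []
    for x in pop:
        niche = sum(max(0.0, 1.0 - abs(x - y) / sigma) for y in pop)
        out.append(raw_fitness(x) / niche)
    return out

def evolve(generations=200, pop_size=60, seed=0):
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 2.0) for _ in range(pop_size)]
    for _ in range(generations):
        fit = shared_fitness(pop)
        # fitness-proportional selection followed by Gaussian mutation
        parents = rng.choices(pop, weights=fit, k=pop_size)
        pop = [min(2.0, max(0.0, p + rng.gauss(0.0, 0.02))) for p in parents]
    return pop

final = evolve()
print(sorted(round(x, 2) for x in final)[::10])  # individuals spread over several peaks
```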
Sandbakk, Øyvind; Hegge, Ann Magdalen; Ettema, Gertjan
2013-01-01
The ability to efficiently utilize metabolic energy to produce work is a key factor for endurance performance. The present study investigated the effects of incline, performance level, and gender on the gross mechanical efficiency during roller ski skating. Thirty-one male and nineteen female elite cross-country skiers performed a 5-min submaximal session at approximately 75% of VO2peak on a 5% inclined treadmill using the G3 skating technique. Thereafter, a 5-min session on a 12% incline using the G2 skating technique was performed at a similar work rate. Gross efficiency was calculated as the external work rate against rolling friction and gravity divided by the metabolic rate using gas exchange. Performance level was determined by the amount of skating FIS points [the Federation of International Skiing (FIS) approved scoring system for ski racing] where fewer points indicate a higher performance level. Strong significant correlations between work rate and metabolic rate within both inclines and gender were revealed (r = −0.89 to 0.98 and P < 0.05 in all cases). Gross efficiency was higher at the steeper incline, both for men (17.1 ± 0.4 vs. 15.8 ± 0.5%, P < 0.05) and women (16.9 ± 0.5 vs. 15.7 ± 0.4%, P < 0.05), but without any gender differences being apparent. Significant correlations between gross efficiency and performance level were found for both inclines and genders (r = −0.65 to 0.81 and P < 0.05 in all cases). The current study demonstrated that cross-country skiers of both genders used less metabolic energy to perform the same amount of work at steeper inclines, and that the better ranked elite male and female skiers skied more efficiently. PMID:24155722
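A back-of-the-envelope sketch of the gross-efficiency computation described (external work rate against gravity and rolling friction divided by the aerobic metabolic rate from gas exchange). The rolling-friction coefficient, the Weir-type oxygen energy equivalent, and the skier's numbers are assumed values, not data from the study.

```python
def work_rate_watts(mass_kg, speed_ms, incline_fraction, mu_rolling=0.02, g=9.81):
    """External work rate against gravity and rolling friction on a treadmill."""
    return mass_kg * g * speed_ms * (incline_fraction + mu_rolling)

def metabolic_rate_watts(vo2_l_min, rer):
    """Aerobic metabolic rate from gas exchange, using a Weir-type energy
    equivalent of oxygen (~3.941 + 1.106 * RER kcal per litre O2)."""
    kcal_per_l = 3.941 + 1.106 * rer
    return vo2_l_min * kcal_per_l * 4184.0 / 60.0

# Hypothetical skier: 75 kg, 3.0 m/s on a 5% incline, VO2 = 3.5 L/min, RER = 0.90
wr = work_rate_watts(75, 3.0, 0.05)
mr = metabolic_rate_watts(3.5, 0.90)
print(f"gross efficiency ~ {100 * wr / mr:.1f} %")
```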
Information theoretical assessment of digital imaging systems
NASA Technical Reports Server (NTRS)
John, Sarah; Rahman, Zia-Ur; Huck, Friedrich O.; Reichenbach, Stephen E.
1990-01-01
The end-to-end performance of image gathering, coding, and restoration as a whole is considered. This approach is based on the pivotal relationship that exists between the spectral information density of the transmitted signal and the restorability of images from this signal. The information-theoretical assessment accounts for (1) the information density and efficiency of the acquired signal as a function of the image-gathering system design and the radiance-field statistics, and (2) the improvement in information efficiency and data compression that can be gained by combining image gathering with coding to reduce the signal redundancy and irrelevancy. It is concluded that images can be restored with better quality and from fewer data as the information efficiency of the data is increased. The restoration correctly explains the image gathering and coding processes and effectively suppresses the image-display degradations.
NASA Astrophysics Data System (ADS)
Yang, YuGuang; Zhang, YuChen; Xu, Gang; Chen, XiuBo; Zhou, Yi-Hua; Shi, WeiMin
2018-03-01
Li et al. first proposed a quantum hash function (QHF) in a quantum-walk architecture. In their scheme, two two-particle interactions, i.e., the I interaction and the π-phase interaction, are introduced, and the choice of the I or π-phase interaction at each iteration depends on a message bit. In this paper, we propose an efficient QHF by dense coding of coin operators in a discrete-time quantum walk. Compared with existing QHFs, our protocol has the following advantages: the efficiency of the QHF can be doubled or increased even further; only one particle is needed and two-particle interactions are unnecessary, so that quantum resources are saved. This points the way to applying the dense coding technique to quantum cryptographic protocols, especially to applications with restricted quantum resources.
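A toy numerical illustration of the underlying idea, with message bits selecting coin unitaries in a discrete-time coined quantum walk and two bits consumed per step standing in for the "dense coding" of the coin; the coin set, cycle length, and digest format are arbitrary choices, not the authors' construction.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard coin
X = np.array([[0, 1], [1, 0]], dtype=float)    # flip coin
P = np.array([[1, 0], [0, 1j]])                # phase coin
COINS = {(0, 0): H, (0, 1): X @ H, (1, 0): P @ H, (1, 1): P @ X @ H}

def qw_hash(bits, n_positions=17):
    """Toy quantum-walk hash: every pair of message bits selects one of four
    unitary coin operators; the final position distribution of a coined walk
    on a cycle is used as the digest."""
    if len(bits) % 2:
        bits = bits + [0]                      # pad to an even length
    psi = np.zeros((n_positions, 2), dtype=complex)
    psi[0, 0] = 1.0                            # walker starts at position 0
    for b1, b2 in zip(bits[::2], bits[1::2]):
        psi = psi @ COINS[(b1, b2)].T          # coin operation
        psi = np.stack([np.roll(psi[:, 0], -1),      # coin 0 shifts left
                        np.roll(psi[:, 1], +1)],     # coin 1 shifts right
                       axis=1)
    probs = (np.abs(psi) ** 2).sum(axis=1)
    return bytes(int(round(p * 255)) for p in probs)

msg = [1, 0, 1, 1, 0, 0, 1, 0]
print(qw_hash(msg).hex())
```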
Thermodynamics of weight loss diets.
Fine, Eugene J; Feinman, Richard D
2004-12-08
BACKGROUND: It is commonly held that "a calorie is a calorie", i.e. that diets of equal caloric content will result in identical weight change independent of macronutrient composition, and appeal is frequently made to the laws of thermodynamics. We have previously shown that thermodynamics does not support such a view and that diets of different macronutrient content may be expected to induce different changes in body mass. Low carbohydrate diets in particular have claimed a "metabolic advantage" meaning more weight loss than in isocaloric diets of higher carbohydrate content. In this review, for pedagogic clarity, we reframe the theoretical discussion to directly link thermodynamic inefficiency to weight change. The problem in outline: Is metabolic advantage theoretically possible? If so, what biochemical mechanisms might plausibly explain it? Finally, what experimental evidence exists to determine whether it does or does not occur? RESULTS: Reduced thermodynamic efficiency will result in increased weight loss. The laws of thermodynamics are silent on the existence of variable thermodynamic efficiency in metabolic processes. Therefore such variability is permitted and can be related to differences in weight lost. The existence of variable efficiency and metabolic advantage is therefore an empiric question rather than a theoretical one, confirmed by many experimental isocaloric studies, pending a properly performed meta-analysis. Mechanisms are as yet unknown, but plausible mechanisms at the metabolic level are proposed. CONCLUSIONS: Variable thermodynamic efficiency due to dietary manipulation is permitted by physical laws, is supported by much experimental data, and may be reasonably explained by plausible mechanisms.
Energy metabolism in Desulfovibrio vulgaris Hildenborough: insights from transcriptome analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pereira, Patricia M.; He, Qiang; Valente, Filipa M.A.
2007-11-01
Sulphate-reducing bacteria are important players in the global sulphur and carbon cycles, with considerable economic and ecological impact. However, the process of sulphate respiration is still incompletely understood. Several mechanisms of energy conservation have been proposed, but it is unclear how the different strategies contribute to the overall process. In order to obtain a deeper insight into the energy metabolism of sulphate-reducers, whole-genome microarrays were used to compare the transcriptional response of Desulfovibrio vulgaris Hildenborough grown with hydrogen/sulphate, pyruvate/sulphate, pyruvate with limiting sulphate, and lactate/thiosulphate, relative to growth in lactate/sulphate. Growth with hydrogen/sulphate showed the largest number of differentially expressed genes and the largest changes in transcript levels. In this condition the most up-regulated energy metabolism genes were those coding for the periplasmic [NiFeSe] hydrogenase, followed by the Ech hydrogenase. The results also provide evidence for the involvement of formate cycling and the recently proposed ethanol pathway during growth in hydrogen. The pathway involving CO cycling is relevant during growth on lactate and pyruvate, but not during growth in hydrogen, as the most down-regulated genes were those coding for the CO-induced hydrogenase. Growth on lactate/thiosulphate reveals a down-regulation of several energy metabolism genes similar to what was observed in the presence of nitrite. This study identifies the role of several proteins involved in the energy metabolism of D. vulgaris and highlights several novel genes related to this process, revealing a more complex bioenergetic metabolism than previously considered.
Gruenke, L D; Konopka, K; Cadieu, M; Waskell, L
1995-10-20
The complete stoichiometry of the metabolism of the cytochrome b5 (cyt b5)-requiring substrate, methoxyflurane, by purified cytochrome P-450 2B4 was compared to that of another substrate, benzphetamine, which does not require cyt b5 for its metabolism. Cyt b5 invariably improved the efficiency of product formation. That is, in the presence of cyt b5 a greater percentage of the reducing equivalents from NADPH were utilized to generate substrate metabolites, primarily at the expense of the side product, superoxide. With methoxyflurane, cyt b5 addition always resulted in an increased rate of product formation, while with benzphetamine the rate of product formation remained unchanged, increased or decreased. The apparently contradictory observations of increased reaction efficiency but decrease in total product formation for benzphetamine can be explained by a second effect of cyt b5. Under some experimental conditions cyt b5 inhibits total NADPH consumption. Whether stimulation, inhibition, or no change in product formation is observed in the presence of cyt b5 depends on the net effect of the stimulatory and inhibitory effects of cyt b5. When total NADPH consumption is inhibited by cyt b5, the rapidly metabolized, highly coupled (approximately equal to 50%) substrate, benzphetamine, undergoes a net decrease in metabolism not counterbalanced by the increase in the efficiency (2-20%) of the reaction. In contrast, in the presence of the slowly metabolized, poorly coupled (approximately equal to 0.5-3%) substrate, methoxyflurane, inhibition of total NADPH consumption by cyt b5 was never sufficient to overcome the stimulation of product formation due to an increase in efficiency of the reaction.
NASA Technical Reports Server (NTRS)
Degaudenzi, R.; Elia, C.; Viola, R.
1990-01-01
Discussed here is a new approach to code division multiple access applied to a mobile system for voice (and data) services based on Band Limited Quasi Synchronous Code Division Multiple Access (BLQS-CDMA). The system requires users to be chip synchronized to reduce the contribution of self-interference and to make use of voice activation in order to increase the satellite power efficiency. In order to achieve spectral efficiency, Nyquist chip pulse shaping is used with no detection performance impairment. The synchronization problems are solved in the forward link by distributing a master code, whereas carrier forced activation and closed loop control techniques have been adopted in the return link. System performance sensitivity to nonlinear amplification and timing/frequency synchronization errors are analyzed.
Vladimirov, N V; Likhoshvaĭ, V A; Matushkin, Iu G
2007-01-01
Gene expression is known to correlate with the degree of codon bias in many unicellular organisms. However, such a correlation is absent in some organisms. Recently we demonstrated that inverted complementary repeats within coding DNA sequences must be considered for proper estimation of translation efficiency, since they may form secondary structures that obstruct ribosome movement. We have developed a program for estimating the potential expression of a coding DNA sequence in a defined unicellular organism using its genome sequence. The program computes an elongation efficiency index. The computation is based on estimation of coding DNA sequence elongation efficiency, taking into account three key factors: codon bias, the average number of inverted complementary repeats, and the free energy of potential stem-loop structures formed by the repeats. The influence of these factors on translation is numerically estimated, and an optimal proportion of these factors is computed for each organism individually. Quantitative translational characteristics of 384 unicellular organisms (351 bacteria, 28 archaea, 5 eukaryota) have been computed using their annotated genomes from NCBI GenBank. Five potential evolutionary strategies of translational optimization have been determined among the studied organisms. A considerable difference in preferred translational strategies between Bacteria and Archaea has been revealed. Significant correlations between the elongation efficiency index and gene expression levels have been shown for two organisms (S. cerevisiae and H. pylori) using available microarray data. The proposed method allows numerical estimation of coding DNA sequence translation efficiency and optimization of the nucleotide composition of heterologous genes in unicellular organisms. http://www.mgs.bionet.nsc.ru/mgs/programs/eei-calculator/.
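A hedged toy version of such an index: a codon-preference score divided by a penalty for inverted complementary repeats (potential stem-loops). The codon weights, stem length, and weighting constant are invented for illustration and are not the authors' elongation efficiency index.

```python
import re

# Hypothetical per-codon "preference" weights for the host organism (0..1);
# a real index would be derived from highly expressed genes.
CODON_WEIGHT = {"GCU": 1.0, "GCC": 0.4, "AAA": 0.9, "AAG": 0.6, "GAA": 1.0,
                "GAG": 0.5, "UUU": 0.3, "UUC": 0.8}

COMPLEMENT = str.maketrans("AUGC", "UACG")

def hairpin_penalty(seq, stem=6):
    """Count positions where a `stem`-long word reappears downstream as its
    reverse complement -- a crude proxy for stem-loop-forming repeats."""
    hits = 0
    for i in range(len(seq) - stem):
        word = seq[i:i + stem]
        rc = word.translate(COMPLEMENT)[::-1]
        if rc in seq[i + stem:]:
            hits += 1
    return hits

def elongation_index(mrna, alpha=0.05):
    """Toy index: mean codon weight, damped by the hairpin penalty."""
    codons = re.findall("...", mrna)
    weights = [CODON_WEIGHT.get(c, 0.5) for c in codons]
    codon_score = sum(weights) / len(weights)
    return codon_score / (1.0 + alpha * hairpin_penalty(mrna))

print(elongation_index("GCUAAAGAAUUCGCUGAAAAG"))
```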
Building Energy Codes: Policy Overview and Good Practices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cox, Sadie
2016-02-19
Globally, 32% of total final energy consumption is attributed to the building sector. To reduce energy consumption, energy codes set minimum energy efficiency standards for the building sector. With effective implementation, building energy codes can support energy cost savings and complementary benefits associated with electricity reliability, air quality improvement, greenhouse gas emission reduction, increased comfort, and economic and social development. This policy brief seeks to support building code policymakers and implementers in designing effective building code programs.
An efficient code for the simulation of nonhydrostatic stratified flow over obstacles
NASA Technical Reports Server (NTRS)
Pihos, G. G.; Wurtele, M. G.
1981-01-01
The physical model and computational procedure of the code are described in detail. The code is validated in tests against a variety of known analytical solutions from the literature and is also compared against actual mountain wave observations. The code accepts as initial input either mathematically idealized or discrete observational data. The form of the obstacle or mountain is arbitrary.
Hu, Lipan; Xiang, Lixia; Li, Shuting; Zou, Zhirong; Hu, Xiao-Hui
2016-04-01
Polyamines are important in protecting plants against various environmental stresses, including protection against photodamage to the photosynthetic apparatus. The molecular mechanism of this latter effect is not completely understood. Here, we have investigated the effects of salinity-alkalinity stress and spermidine (Spd) on tomato seedlings at both physiological and transcriptional levels. Salinity-alkalinity stress decreased leaf area, net photosynthetic rate, maximum net photosynthetic rate, light saturation point, apparent quantum efficiency, total chlorophyll, chlorophyll a and chlorophyll a:chlorophyll b relative to the control. The amount of D1 protein, an important component of photosystem II, was reduced compared with the control, as was the expression of psbA, which codes for D1. Expression of the chlorophyll biosynthesis gene porphobilinogen deaminase (PBGD) was reduced following salinity-alkalinity stress, whereas the expression of Chlase, which codes for chlorophyllase, was increased. These negative physiological effects of salinity-alkalinity stress were alleviated by exogenous Spd. Expression of PBGD and psbA were enhanced, whereas the expression of Chlase was reduced, when exogenous Spd was included in the stress treatment compared with when it was not. The protective effect of Spd on chlorophyll and D1 protein content during stress may maintain the photosynthetic apparatus, permitting continued photosynthesis and growth of tomato seedlings (Solanum lycopersicum cv. Jinpengchaoguan) under salinity-alkalinity stress. © 2015 Scandinavian Plant Physiology Society.
New inducible genetic method reveals critical roles of GABA in the control of feeding and metabolism
Meng, Fantao; Han, Yong; Srisai, Dollada; Belakhov, Valery; Farias, Monica; Xu, Yong; Palmiter, Richard D.; Baasov, Timor; Wu, Qi
2016-01-01
Currently available inducible Cre/loxP systems, despite their considerable utility in gene manipulation, have pitfalls in certain scenarios, such as unsatisfactory recombination rates and deleterious effects on physiology and behavior. To overcome these limitations, we designed a new, inducible gene-targeting system by introducing an in-frame nonsense mutation into the coding sequence of Cre recombinase (nsCre). Mutant mRNAs transcribed from nsCre transgene can be efficiently translated into full-length, functional Cre recombinase in the presence of nonsense suppressors such as aminoglycosides. In a proof-of-concept model, GABA signaling from hypothalamic neurons expressing agouti-related peptide (AgRP) was genetically inactivated within 4 d after treatment with a synthetic aminoglycoside. Disruption of GABA synthesis in AgRP neurons in young adult mice led to a dramatic loss of body weight due to reduced food intake and elevated energy expenditure; they also manifested glucose intolerance. In contrast, older mice with genetic inactivation of GABA signaling by AgRP neurons had only transient reduction of feeding and body weight; their energy expenditure and glucose tolerance were unaffected. These results indicate that GABAergic signaling from AgRP neurons plays a key role in the control of feeding and metabolism through an age-dependent mechanism. This new genetic technique will augment current tools used to elucidate mechanisms underlying many physiological and neurological processes. PMID:26976589
Illeghems, Koen; Pelicaen, Rudy; De Vuyst, Luc; Weckx, Stefan
2016-09-01
Acetobacter ghanensis LMG 23848(T) and Acetobacter senegalensis 108B are acetic acid bacteria that originate from a spontaneous cocoa bean heap fermentation process and that have been characterised as strains with interesting functionalities through metabolic and kinetic studies. As there is currently little genetic information available for these species, whole-genome sequencing of A. ghanensis LMG 23848(T) and A. senegalensis 108B and subsequent data analysis were performed. This approach not only revealed characteristics such as the metabolic potential and genomic architecture, but also allowed the genetic adaptations related to the cocoa bean fermentation process to be identified. Indeed, evidence was found that both species possessed the genetic ability to be involved in citrate assimilation and displayed adaptations in their respiratory chain that might improve their competitiveness during the cocoa bean fermentation process. In contrast, other properties such as the dependence on glycerol or mannitol and lactate as energy sources or a less efficient acid stress response may explain their low competitiveness. The presence of a gene coding for a proton-translocating transhydrogenase in A. ghanensis LMG 23848(T) and the genes involved in two aromatic compound degradation pathways in A. senegalensis 108B indicate that these strains have an extended functionality compared to Acetobacter species isolated from other ecosystems. Copyright © 2016 Elsevier Ltd. All rights reserved.
Singh, Balraj; Kinne, Hannah E.; Milligan, Ryan D.; Washburn, Laura J.; Olsen, Mark; Lucci, Anthony
2016-01-01
We have previously shown that only 0.01% cells survive a metabolic challenge involving lack of glutamine in culture medium of SUM149 triple-negative Inflammatory Breast Cancer cell line. These cells, designated as SUM149-MA for metabolic adaptability, are resistant to chemotherapeutic drugs, and they efficiently metastasize to multiple organs in nude mice. We hypothesized that obesity-related molecular networks, which normally help in cellular and organismal survival under metabolic challenges, may help in the survival of MA cells. The fat mass and obesity-associated protein FTO is overexpressed in MA cells. Obesity-associated cis-acting elements in non-coding region of FTO regulate the expression of IRX3 gene, thus activating obesity networks. Here we found that IRX3 protein is significantly overexpressed in MA cells (5 to 6-fold) as compared to the parental SUM149 cell line, supporting our hypothesis. We also obtained evidence that additional key regulators of energy balance such as ARID5B, IRX5, and CUX1 P200 repressor could potentially help progenitor-like TNBC cells survive in glutamine-free medium. MO-I-500, a pharmacological inhibitor of FTO, significantly (>90%) inhibited survival and/or colony formation of SUM149-MA cells as compared to untreated cells or those treated with a control compound MO-I-100. Curiously, MO-I-500 treatment also led to decreased levels of FTO and IRX3 proteins in the SUM149 cells initially surviving in glutamine-free medium as compared to MO-I-100 treatment. Interestingly, MO-I-500 treatment had a relatively little effect on cell growth of either the SUM149 or SUM149-MA cell line when added to a complete medium containing glutamine that does not pose a metabolic challenge. Importantly, once selected and cultured in glutamine-free medium, SUM149-MA cells were no longer affected by MO-I-500 even in Gln-free medium. We conclude that panresistant MA cells contain interconnected molecular networks that govern developmental status and energy balance, and genetic and epigenetic alterations that are selected during cancer evolution. PMID:27390851
The 2.5 bit/detected photon demonstration program: Phase 2 and 3 experimental results
NASA Technical Reports Server (NTRS)
Katz, J.
1982-01-01
The experimental program for laboratory demonstration of an energy-efficient optical communication channel operating at a rate of 2.5 bits/detected photon is described. Results of the uncoded PPM channel performance are presented. It is indicated that this throughput efficiency can be achieved not only with a Reed-Solomon code as originally predicted, but with a less complex code as well.
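A back-of-the-envelope check of why such figures are achievable with pulse-position modulation: each detected photon identifies one of M time slots, so the ideal photon information efficiency is the code rate times log2(M). The 256-ary PPM and rate-0.31 code below are hypothetical numbers chosen only to land near 2.5 bits per detected photon.

```python
import math

def bits_per_detected_photon(m_slots, code_rate=1.0):
    """Ideal photon information efficiency of M-ary PPM: each detected photon
    identifies one of M time slots (log2 M bits), scaled by the code rate."""
    return code_rate * math.log2(m_slots)

# Hypothetical operating point: 256-ary PPM with a rate-0.31 outer code
print(bits_per_detected_photon(256, 0.31))   # ~ 2.5 bits per detected photon
```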
NASA Technical Reports Server (NTRS)
Rice, R. F.
1974-01-01
End-to-end system considerations involving channel coding and data compression are reported which could drastically improve the efficiency in communicating pictorial information from future planetary spacecraft. In addition to presenting new and potentially significant system considerations, this report attempts to fill a need for a comprehensive tutorial which makes much of this very subject accessible to readers whose disciplines lie outside of communication theory.
Automated Discovery of Machine-Specific Code Improvements
1984-12-01
...operation of the source language. Additional analysis may reveal special features of the target architecture that may be exploited to generate efficient code. Such analysis is optional... The early phases incorporate knowledge of the source language, but do not refer to features of the target machine. These early phases are sometimes referred to as the...
Bolger, Conor M; Bessone, Veronica; Federolf, Peter; Ettema, Gertjan; Sandbakk, Øyvind
2018-01-01
The purpose of the present study was to examine the influence of increased loading of the roller ski on metabolic cost, gross efficiency, and kinematics of roller ski skating in steep and moderate terrain, while employing two incline-specific techniques. Ten nationally ranked male cross-country skiers were subjected to four 7-minute submaximal intervals, with 0, 0.5, 1.0, and 1.5 kg added beneath the roller-ski in a randomized order. This was done on two separate days, with the G2 skating at 12% incline and 7 km/h speed and G3 skating at 5% incline and 14 km/h speed, respectively. At 12% incline, there was a significant increase in metabolic rate and a decrease in gross efficiency with added weight (P<0.001 and P = 0.002). At 5% incline, no change in metabolic rate or gross efficiency was found (P = 0.89 and P = 0.11). Rating of perceived exertion (RPE) increased gradually with added weight at both inclines (P>0.05). No changes in cycle characteristics were observed between the different ski loadings at either incline, although the lateral and vertical displacements of the foot/skis were slightly altered at 12% incline with added weight. In conclusion, the present study demonstrates that increased loading of the ski increases the metabolic cost and reduces gross efficiency during steep uphill roller skiing in G2 skating, whereas no significant effect was revealed when skating on relatively flat terrain in G3. Cycle characteristics remained unchanged across conditions at both inclines, whereas small adjustments in the displacement of the foot coincided with the efficiency changes in uphill terrain. The increased RPE values with added ski-weight at both inclines indicates that other factors than those measured here could have influenced effort and/or fatigue when lifting a heavier ski.
Correcting quantum errors with entanglement.
Brun, Todd; Devetak, Igor; Hsieh, Min-Hsiu
2006-10-20
We show how entanglement shared between encoder and decoder can simplify the theory of quantum error correction. The entanglement-assisted quantum codes we describe do not require the dual-containing constraint necessary for standard quantum error-correcting codes, thus allowing us to "quantize" all of classical linear coding theory. In particular, efficient modern classical codes that attain the Shannon capacity can be made into entanglement-assisted quantum codes attaining the hashing bound (closely related to the quantum capacity). For systems without large amounts of shared entanglement, these codes can also be used as catalytic codes, in which a small amount of initial entanglement enables quantum communication.
Country Report on Building Energy Codes in Australia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shui, Bin; Evans, Meredydd; Somasundaram, Sriram
2009-04-02
This report is part of a series of reports on building energy efficiency codes in countries associated with the Asian Pacific Partnership (APP) - Australia, South Korea, Japan, China, India, and the United States of America (U.S.). This report gives an overview of the development of building energy codes in Australia, including national energy policies related to building energy codes, the history of building energy codes, and recent national projects and activities to promote building energy codes. The report also provides a review of current building energy codes (such as building envelope, HVAC, and lighting) for commercial and residential buildings in Australia.
Metabolic basis for the self-referential genetic code.
Guimarães, Romeu Cardoso
2011-08-01
An investigation of the biosynthesis pathways producing glycine and serine was necessary to clarify an apparent inconsistency between the self-referential model (SRM) for the formation of the genetic code and the model of coevolution of encodings and of amino acid biosynthesis routes. According to the SRM proposal, glycine was the first amino acid encoded, followed by serine. The coevolution model does not state precisely which encodings came first, presenting only a list of about ten early assignments, including the derivation of glycine from serine, the latter being derived from the glycolysis intermediate glycerate, which reverses the order proposed by the self-referential model. Our search identified the glycine-serine pathway of syntheses based on one-carbon sources, involving activities of the glycine decarboxylase complex and its associated serine hydroxymethyltransferase, which is consistent with the order proposed by the self-referential model and supports its rationale for the origin of the genetic code: protein synthesis was developed inside an early metabolic system, serving the function of a sink of amino acids; the first peptides were glycine-rich and fit for the function of building the early ribonucleoproteins; glycine consumption in proteins drove the fixation of the glycine-serine pathway.
An efficient decoding for low density parity check codes
NASA Astrophysics Data System (ADS)
Zhao, Ling; Zhang, Xiaolin; Zhu, Manjie
2009-12-01
Low-density parity-check (LDPC) codes are a class of forward-error-correction codes. They are among the best-known codes capable of achieving low bit error rates (BER) approaching Shannon's capacity limit. Recently, LDPC codes have been adopted by the European Digital Video Broadcasting (DVB-S2) standard and have also been proposed for the emerging IEEE 802.16 fixed and mobile broadband wireless-access standard. The Consultative Committee for Space Data Systems (CCSDS) has also recommended LDPC codes for deep-space and near-Earth communications. LDPC codes are therefore expected to be widely used in wired and wireless communication, magnetic recording, optical networking, DVB, and other fields in the near future, so efficient hardware implementation is of great interest. This paper presents an efficient partially parallel decoder architecture suited for quasi-cyclic (QC) LDPC codes, using the belief propagation algorithm for decoding. Algorithmic transformation and architectural-level optimization are incorporated to reduce the critical path. First, the parity-check matrix of the LDPC code is analyzed to find the relationship between the row weight and the column weight. The sharing level of the check node updating units (CNU) and the variable node updating units (VNU) is then determined according to this relationship. Next, the CNUs and VNUs are rearranged and divided into several smaller parts; with the help of some assistant logic, these smaller parts can be grouped into CNUs during check node updating and into VNUs during variable node updating. The smaller parts are called node update kernel units (NKU) and the assistant logic is called the node update auxiliary unit (NAU). With the help of the NAUs, the two steps of the iteration are completed by the NKUs, which greatly reduces hardware resources. Meanwhile, efficient techniques have been developed to reduce the computation delay of the node processing units and to minimize hardware overhead for parallel processing. The method applies not only to regular LDPC codes but also to irregular ones. Based on the proposed architecture, a (7493, 6096) irregular QC-LDPC decoder is described in the Verilog hardware description language and implemented on an Altera Stratix II EP2S130 field-programmable gate array (FPGA). The implementation results show that over 20% of the logic core size can be saved compared with conventional partially parallel decoder architectures without any performance degradation. At a decoding clock of 100 MHz, the proposed decoder achieves a maximum (source data) decoding throughput of 133 Mb/s at 18 iterations.
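For readers unfamiliar with the node-update flow that such decoder architectures parallelize, the following is a minimal software sketch of min-sum belief propagation for a small parity-check matrix. It illustrates the algorithm only; the partially parallel NKU/NAU hardware scheduling described above is not reproduced, and the function name, iteration limit, and the assumption that every check row has weight at least two are our own.

    import numpy as np

    def min_sum_decode(H, llr, max_iters=20):
        """Sketch of min-sum LDPC decoding.
        H   : (m, n) binary parity-check matrix (rows assumed to have weight >= 2)
        llr : length-n channel log-likelihood ratios (positive means bit 0 more likely)
        """
        m, n = H.shape
        c2v = np.zeros((m, n))                      # check-to-variable messages
        bits = (llr < 0).astype(int)
        for _ in range(max_iters):
            total = llr + c2v.sum(axis=0)           # posterior belief per variable node
            v2c = np.where(H == 1, total - c2v, 0)  # variable-to-check messages (extrinsic)
            for i in range(m):                      # check-node update (min-sum approximation)
                idx = np.flatnonzero(H[i])
                msgs = v2c[i, idx]
                signs = np.sign(msgs)
                signs[signs == 0] = 1.0
                mags = np.abs(msgs)
                for k, j in enumerate(idx):
                    c2v[i, j] = np.prod(np.delete(signs, k)) * np.delete(mags, k).min()
            posterior = llr + c2v.sum(axis=0)
            bits = (posterior < 0).astype(int)
            if not np.any(H @ bits % 2):            # all parity checks satisfied: stop early
                return bits
        return bits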
New double-byte error-correcting codes for memory systems
NASA Technical Reports Server (NTRS)
Feng, Gui-Liang; Wu, Xinen; Rao, T. R. N.
1996-01-01
Error-correcting or error-detecting codes have been used in the computer industry to increase reliability, reduce service costs, and maintain data integrity. The single-byte error-correcting and double-byte error-detecting (SbEC-DbED) codes have been successfully used in computer memory subsystems. There are many methods to construct double-byte error-correcting (DBEC) codes. In the present paper we construct a class of double-byte error-correcting codes, which are more efficient than those known to be optimum, and a decoding procedure for our codes is also considered.
Abd_Allah, Elsayed Fathi; Nauman, Mohd; Asif, Ambreen; Hashem, Abeer; Alqarawi, Abdulaziz A.
2017-01-01
Productivity of wheat (Triticum aestivum) is markedly affected by high temperature and nitrogen deficiency. Identifying the functional proteins produced in response to these multiple stresses acting in a coordinated manner can help in developing tolerance in the crop. In this study, two wheat cultivars with contrasting nitrogen efficiencies (N-efficient VL616 and N-inefficient UP2382) were grown in control conditions, and under a combined stress of high temperature (32 °C) and low nitrogen (4 mM), and their leaf proteins were analysed in order to identify the responsive proteins. Two-dimensional electrophoresis unravelled sixty-one proteins, which varied in their expression in wheat, and were homologous to known functional proteins involved in biosynthesis, carbohydrate metabolism, energy metabolism, photosynthesis, protein folding, transcription, signalling, oxidative stress, water stress, lipid metabolism, heat stress tolerance, nitrogen metabolism, and protein synthesis. When exposed to high temperature in combination with low nitrogen, wheat plants altered their protein expression as an adaptive means to maintain growth. This response varied with cultivars. Nitrogen-efficient cultivars showed a higher potential of redox homeostasis, protein stability, osmoprotection, and regulation of nitrogen levels. The identified stress-responsive proteins can pave the way for enhancing the multiple-stress tolerance in wheat and developing a better understanding of its mechanism. PMID:29186028
Industrial metabolism of chlorine: a case study of a chlor-alkali industrial chain.
Han, Feng; Li, Wenfeng; Yu, Fei; Cui, Zhaojie
2014-05-01
Substance flow analysis (SFA) is applied to a case study of chlorine metabolism in a chlor-alkali industrial chain. A chain-level SFA model is constructed, and eight indices are proposed to analyze and evaluate the metabolic status of elemental chlorine. The primary objectives of this study are to identify low-efficiency links in production processes and to find ways to improve the operational performance of the industrial chain. Five years of in-depth data collection and analysis revealed that system production efficiency and source efficiency have increased continuously since the chain was first formed in 2008, at average annual growth rates of 21.01% and 1.01%, respectively. In 2011, 64.15% of the total chlorine input was transformed into final products; as much as 98.50% of the chlorine inputs were utilized when other by-products were counted. Chlorine loss occurred mostly in the form of chloride ions in wastewater, and the system loss rate was 0.54%. The metabolic efficiency of chlorine in this case was high, and the chain system had minimal impact on the environment. However, from the perspectives of processing depth and economic output, the chlor-alkali industrial chain still requires expansion.
Theron, Chrispian W; Berrios, Julio; Delvigne, Frank; Fickers, Patrick
2018-01-01
The methylotrophic yeast Komagataella (Pichia) pastoris has become one of the most utilized cell factories for the production of recombinant proteins over the last three decades. This success story is linked to its specific physiological traits, i.e., the ability to grow at high cell density in inexpensive culture medium and to secrete proteins at high yield. Exploiting methanol metabolism is at the core of most P. pastoris-based processes but comes with its own challenges. Co-feeding cultures with glycerol/sorbitol and methanol is a promising approach, which can benefit from improved understanding and prediction of metabolic response. The development of profitable processes relies on the construction and selection of efficient producing strains from less efficient ones, but it also depends on the ability to master the bioreactor process itself; more specifically, on how the bioreactor process can be monitored and controlled to obtain high production yields. In this review, new perspectives are detailed regarding a multi-faceted approach to recombinant protein production processes by P. pastoris, including gaining improved understanding of the metabolic pathways involved, accounting for variations in transcriptional and translational efficiency at the single-cell level, and efficient monitoring and control of methanol levels at the bioreactor level.
Visual pattern image sequence coding
NASA Technical Reports Server (NTRS)
Silsbee, Peter; Bovik, Alan C.; Chen, Dapang
1990-01-01
The visual pattern image coding (VPIC) configurable digital image-coding process is capable of coding with visual fidelity comparable to the best available techniques, at compressions which (at 30-40:1) exceed all other technologies. These capabilities are associated with unprecedented coding efficiencies; coding and decoding operations are entirely linear with respect to image size and entail a complexity that is 1-2 orders of magnitude faster than any previous high-compression technique. The visual pattern image sequence coding to which attention is presently given exploits all the advantages of the static VPIC in the reduction of information from an additional, temporal dimension, to achieve unprecedented image sequence coding performance.
LDPC coded OFDM over the atmospheric turbulence channel.
Djordjevic, Ivan B; Vasic, Bane; Neifeld, Mark A
2007-05-14
Low-density parity-check (LDPC) coded optical orthogonal frequency division multiplexing (OFDM) is shown to significantly outperform LDPC coded on-off keying (OOK) over the atmospheric turbulence channel in terms of both coding gain and spectral efficiency. In the regime of strong turbulence at a bit-error rate of 10^-5, the coding gain improvement of the LDPC coded single-sideband unclipped-OFDM system with 64 sub-carriers is larger than the coding gain of the LDPC coded OOK system by 20.2 dB for quadrature-phase-shift keying (QPSK) and by 23.4 dB for binary-phase-shift keying (BPSK).
Strand, Janne M; Scheffler, Katja; Bjørås, Magnar; Eide, Lars
2014-06-01
The cellular genomes are continuously damaged by reactive oxygen species (ROS) from aerobic processes. The impact of DNA damage depends on the specific site as well as the cellular state. The steady-state level of DNA damage is the net result of continuous formation and subsequent repair, but it is unknown to what extent heterogeneous damage distribution is caused by variations in formation or repair of DNA damage. Here, we used a restriction enzyme/qPCR-based method to analyze DNA damage in promoter and coding regions of four nuclear genes: the two house-keeping genes Gapdh and Tbp, and the Ndufa9 and Ndufs2 genes encoding mitochondrial complex I subunits, as well as mt-Rnr1 encoded by mitochondrial DNA (mtDNA). The distribution of steady-state levels of damage varied in a site-specific manner. Oxidative stress induced damage in nDNA to a similar extent in promoter and coding regions, and more so in mtDNA. The subsequent removal of damage from nDNA was efficient and comparable, with recovery times depending on the initial damage load, while repair of mtDNA was delayed, with a subsequently slower repair rate. The repair was furthermore found to be independent of transcription or the transcription-coupled repair factor CSB, but dependent on cellular ATP. Our results demonstrate that the capacity to repair DNA is sufficient to remove exogenously induced damage. Thus, we conclude that the heterogeneous steady-state level of DNA damage in promoters and coding regions is caused by site-specific DNA damage/modifications that take place under normal metabolism. Copyright © 2014 Elsevier B.V. All rights reserved.
Beauty is in the efficient coding of the beholder.
Renoult, Julien P; Bovet, Jeanne; Raymond, Michel
2016-03-01
Sexual ornaments are often assumed to be indicators of mate quality. Yet it remains poorly known how certain ornaments are chosen before any coevolutionary race makes them indicative. Perceptual biases have been proposed to play this role, but known biases are mostly restricted to a specific taxon, which precludes evaluating their general importance in sexual selection. Here we identify a potentially universal perceptual bias in mate choice. We used an algorithm that models the sparseness of the activity of simple cells in the primary visual cortex (or V1) of humans when coding images of female faces. Sparseness was found positively correlated with attractiveness as rated by men and explained up to 17% of variance in attractiveness. Because V1 is adapted to process signals from natural scenes, in general, not faces specifically, our results indicate that attractiveness for female faces is influenced by a visual bias. Sparseness and more generally efficient neural coding are ubiquitous, occurring in various animals and sensory modalities, suggesting that the influence of efficient coding on mate choice can be widespread in animals.
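As a rough illustration of the kind of measurement involved, the sketch below convolves a grayscale image (assumed to be a 2-D float array larger than the filter kernel) with a small Gabor filter bank, a common stand-in for V1 simple-cell receptive fields, and computes a Treves-Rolls-style sparseness index over the rectified responses. The filter parameters and the particular index are our own assumptions, not the authors' exact model.

    import numpy as np
    from scipy.signal import fftconvolve

    def gabor_kernel(freq, theta, sigma=4.0, size=21):
        """Simple even-symmetric Gabor filter (stand-in for a V1 simple-cell receptive field)."""
        half = size // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        xr = x * np.cos(theta) + y * np.sin(theta)
        yr = -x * np.sin(theta) + y * np.cos(theta)
        envelope = np.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2))
        return envelope * np.cos(2 * np.pi * freq * xr)

    def response_sparseness(image, freqs=(0.1, 0.2), n_orient=4):
        """Treves-Rolls-style sparseness of rectified filter responses (0..1, higher = sparser)."""
        responses = []
        for f in freqs:
            for k in range(n_orient):
                r = fftconvolve(image, gabor_kernel(f, np.pi * k / n_orient), mode="valid")
                responses.append(np.abs(r).ravel())
        a = np.concatenate(responses)
        return 1.0 - (a.mean() ** 2) / (a ** 2).mean()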
Vector quantization for efficient coding of upper subbands
NASA Technical Reports Server (NTRS)
Zeng, W. J.; Huang, Y. F.
1994-01-01
This paper examines the application of vector quantization (VQ) to exploit both intra-band and inter-band redundancy in subband coding. The focus here is on the exploitation of inter-band dependency. It is shown that VQ is particularly suitable and effective for coding the upper subbands. Three subband decomposition-based VQ coding schemes are proposed here to exploit the inter-band dependency by making full use of the extra flexibility of VQ approach over scalar quantization. A quadtree-based variable rate VQ (VRVQ) scheme which takes full advantage of the intra-band and inter-band redundancy is first proposed. Then, a more easily implementable alternative based on an efficient block-based edge estimation technique is employed to overcome the implementational barriers of the first scheme. Finally, a predictive VQ scheme formulated in the context of finite state VQ is proposed to further exploit the dependency among different subbands. A VRVQ scheme proposed elsewhere is extended to provide an efficient bit allocation procedure. Simulation results show that these three hybrid techniques have advantages, in terms of peak signal-to-noise ratio (PSNR) and complexity, over other existing subband-VQ approaches.
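The core mechanism, training a codebook and transmitting codeword indices instead of raw vectors, can be sketched with a plain k-means (LBG-style) quantizer. The codebook size, iteration count, and training details below are illustrative assumptions rather than the specific VRVQ, edge-estimation, or finite-state schemes proposed in the paper.

    import numpy as np

    def train_vq_codebook(vectors, codebook_size=16, iters=20, seed=0):
        """Plain k-means (LBG-style) codebook training; vectors is an (N, d) array
        of, e.g., flattened subband blocks."""
        rng = np.random.default_rng(seed)
        codebook = vectors[rng.choice(len(vectors), codebook_size, replace=False)].astype(float)
        for _ in range(iters):
            d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
            labels = d.argmin(1)
            for k in range(codebook_size):
                members = vectors[labels == k]
                if len(members):
                    codebook[k] = members.mean(0)   # move codeword to centroid of its cell
        return codebook

    def quantize(vectors, codebook):
        d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        return d.argmin(1)   # transmit these indices instead of the raw vectors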
Background-Modeling-Based Adaptive Prediction for Surveillance Video Coding.
Zhang, Xianguo; Huang, Tiejun; Tian, Yonghong; Gao, Wen
2014-02-01
The exponential growth of surveillance videos presents an unprecedented challenge for high-efficiency surveillance video coding technology. Compared with the existing coding standards that were basically developed for generic videos, surveillance video coding should be designed to make the best use of the special characteristics of surveillance videos (e.g., a relatively static background). To do so, this paper first conducts two analyses on how to improve the background and foreground prediction efficiencies in surveillance video coding. Following the analysis results, we propose a background-modeling-based adaptive prediction (BMAP) method. In this method, all blocks to be encoded are first classified into three categories. Then, according to the category of each block, two novel inter predictions are selectively utilized, namely, the background reference prediction (BRP) that uses the background modeled from the original input frames as the long-term reference and the background difference prediction (BDP) that predicts the current data in the background difference domain. For background blocks, the BRP can effectively improve the prediction efficiency using the higher quality background as the reference; whereas for foreground-background-hybrid blocks, the BDP can provide a better reference after subtracting its background pixels. Experimental results show that the BMAP can achieve at least twice the compression ratio on surveillance videos as AVC (MPEG-4 Advanced Video Coding) high profile, yet with only slightly higher encoding complexity. Moreover, for the foreground coding performance, which is crucial to the subjective quality of moving objects in surveillance videos, BMAP also obtains remarkable gains over several state-of-the-art methods.
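A toy, codec-free sketch of these ideas, a running-average background model, a three-way block classification, and a choice of prediction reference, is given below. The thresholds, the learning rate, and the simplified residual computation are placeholder assumptions of ours; the real BRP/BDP operate inside the motion-compensation loop of the encoder.

    import numpy as np

    def update_background(bg, frame, alpha=0.05):
        # Running-average background model (simple stand-in for the modeling step).
        return (1 - alpha) * bg + alpha * frame.astype(float)

    def classify_block(block, bg_block, t_low=2.0, t_high=12.0):
        # Rough three-way classification by mean absolute difference to the background.
        mad = np.mean(np.abs(block.astype(float) - bg_block))
        if mad < t_low:
            return "background"
        return "foreground" if mad > t_high else "hybrid"

    def residual_to_code(block, bg_block, prev_block, kind):
        # Pick the reference a BMAP-like scheme would favor and return the residual to code.
        # (Real BRP/BDP also involve motion search, which is omitted here.)
        block = block.astype(float)
        if kind == "background":
            return block - bg_block                                          # BRP-like: background as long-term reference
        if kind == "hybrid":
            return (block - bg_block) - (prev_block.astype(float) - bg_block)  # BDP-like: background-difference domain
        return block - prev_block.astype(float)                              # foreground: ordinary inter prediction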
Learning-Based Just-Noticeable-Quantization- Distortion Modeling for Perceptual Video Coding.
Ki, Sehwan; Bae, Sung-Ho; Kim, Munchurl; Ko, Hyunsuk
2018-07-01
Conventional predictive video coding-based approaches are reaching the limit of their potential coding efficiency improvements, because of severely increasing computation complexity. As an alternative approach, perceptual video coding (PVC) has attempted to achieve high coding efficiency by eliminating perceptual redundancy, using just-noticeable-distortion (JND) directed PVC. The previous JNDs were modeled by adding white Gaussian noise or specific signal patterns into the original images, which is not appropriate for finding JND thresholds for distortions involving energy reduction. In this paper, we present a novel discrete cosine transform-based energy-reduced JND model, called ERJND, that is more suitable for JND-based PVC schemes. Then, the proposed ERJND model is extended to two learning-based just-noticeable-quantization-distortion (JNQD) models as preprocessing that can be applied for perceptual video coding. The two JNQD models can automatically adjust JND levels based on given quantization step sizes. One of the two JNQD models, called LR-JNQD, is based on linear regression and determines the model parameter for JNQD based on extracted handcrafted features. The other JNQD model is based on a convolutional neural network (CNN) and is called CNN-JNQD. To the best of our knowledge, our paper is the first approach to automatically adjust JND levels according to quantization step sizes for preprocessing the input to video encoders. In experiments, both the LR-JNQD and CNN-JNQD models were applied to high efficiency video coding (HEVC) and yielded maximum (average) bitrate reductions of 38.51% (10.38%) and 67.88% (24.91%), respectively, with little subjective video quality degradation, compared with the input without preprocessing applied.
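To make the preprocessing idea concrete, here is a generic, heavily simplified sketch that zeroes sub-threshold DCT coefficients of an 8x8 block before encoding, so the encoder does not spend bits on imperceptible detail. The linear threshold-versus-quantization-step rule and the constants are placeholders of our own, not the ERJND or learned JNQD models.

    import numpy as np
    from scipy.fftpack import dct, idct

    def dct2(block):
        return dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

    def idct2(block):
        return idct(idct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

    def jnd_preprocess(block, q_step, base_jnd=4.0):
        """Suppress coefficients assumed to be below a perceptual threshold (8x8 block, 0..255)."""
        coeffs = dct2(block.astype(float))
        threshold = base_jnd + 0.5 * q_step      # crude stand-in for a JND-vs-QP relationship
        dc = coeffs[0, 0]
        coeffs[np.abs(coeffs) < threshold] = 0.0 # drop sub-threshold (assumed imperceptible) detail
        coeffs[0, 0] = dc                        # never touch the DC term
        return np.clip(idct2(coeffs), 0, 255)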
Microbial metatranscriptomics in a permanent marine oxygen minimum zone.
Stewart, Frank J; Ulloa, Osvaldo; DeLong, Edward F
2012-01-01
Simultaneous characterization of taxonomic composition, metabolic gene content and gene expression in marine oxygen minimum zones (OMZs) has potential to broaden perspectives on the microbial and biogeochemical dynamics in these environments. Here, we present a metatranscriptomic survey of microbial community metabolism in the Eastern Tropical South Pacific OMZ off northern Chile. Community RNA was sampled in late austral autumn from four depths (50, 85, 110, 200 m) extending across the oxycline and into the upper OMZ. Shotgun pyrosequencing of cDNA yielded 180,000 to 550,000 transcript sequences per depth. Based on functional gene representation, transcriptome samples clustered apart from corresponding metagenome samples from the same depth, highlighting the discrepancies between metabolic potential and actual transcription. BLAST-based characterizations of non-ribosomal RNA sequences revealed a dominance of genes involved with both oxidative (nitrification) and reductive (anammox, denitrification) components of the marine nitrogen cycle. Using annotations of protein-coding genes as proxies for taxonomic affiliation, we observed depth-specific changes in gene expression by key functional taxonomic groups. Notably, transcripts most closely matching the genome of the ammonia-oxidizing archaeon Nitrosopumilus maritimus dominated the transcriptome in the upper three depths, representing one in five protein-coding transcripts at 85 m. In contrast, transcripts matching the anammox bacterium Kuenenia stuttgartiensis dominated at the core of the OMZ (200 m; 1 in 12 protein-coding transcripts). The distribution of N. maritimus-like transcripts paralleled that of transcripts matching ammonia monooxygenase genes, which, despite being represented by both bacterial and archaeal sequences in the community DNA, were dominated (> 99%) by archaeal sequences in the RNA, suggesting a substantial role for archaeal nitrification in the upper OMZ. These data, as well as those describing other key OMZ metabolic processes (e.g. sulfur oxidation), highlight gene-specific expression patterns in the context of the entire community transcriptome, as well as identify key functional groups for taxon-specific genomic profiling. © 2011 Society for Applied Microbiology and Blackwell Publishing Ltd.
Disaster nephrology: medical perspective.
Atef-Zafarmand, Alireza; Fadem, Steve
2003-04-01
Disaster medicine is an extension of emergency medicine involving mass casualties and use of the best available techniques in search and rescue. To achieve the best results extensive predisaster preparedness is mandatory. Earthquakes have caused the loss of more than 1 million lives in the 20th century. Evidence-based medicine confirms that these deaths were mostly preventable based on experience in developed countries. The key to success is implementing building codes and structural reinforcement. In earthquakes as well as in collapse of buildings in bomb blasts, loss of life is either because of the direct effect of trauma or to the metabolic consequences of rhabdomyolysis and complications of its management. Hyperkalemia and infection are the commonest causes of death in victims who survive the direct effect of trauma. Acute renal failure, a grave complication of rhabdomyolysis, is mostly preventable by timely rehydration and bicarbonate therapy. Mannitol therapy can be very efficient in reducing the severity of muscle damage and its sequelae. Fasciotomy can be limb saving if it is done in the early hours, although a firm guideline is still lacking. Although each country is responsible for improving the structure of buildings and organizing an efficient disaster response, national and international organizations in developed countries should give high priority to communicating with developing countries to encourage their preparedness. Copyright 2003 by the National Kidney Foundation, Inc.
Linking metatranscriptomic to bioremediation processes of oil contaminated marine sediments
NASA Astrophysics Data System (ADS)
Cuny, P.; Atkinson, A.; Léa, S.; Guasco, S.; Jezequel, R.; Armougom, F.; Michotey, V.; Bonin, P.; Militon, C.
2016-02-01
Oil-derived hydrocarbons are one major source of pollution of marine ecosystems. In coastal marine areas they tend to accumulate in the sediment, where they can impact the benthic communities. Biodegradation of oil hydrocarbons by microorganisms is known to be one of the prevalent processes removing these contaminants from sediments. The redox oscillation regimes generated by bioturbation, and the efficiency of metabolic coupling between the functional groups associated with these redox regimes, are probably determinant factors controlling hydrocarbon biodegradation. Metatranscriptomic analysis appears to be a promising approach to shed new light on the metabolic processes involved in the response of microbial communities to oil contamination in such oxic/anoxic oscillating environments. In the framework of the DECAPAGE project (ANR CESA-2011-006 01), funded by the French National Agency for Research, the metatranscriptomes (RNA-seq) of mudflat sediments, contaminated or not with oil (Ural blend crude oil, 5,000 ppm) and bioturbated or not (addition of the common burrowing organism Hediste diversicolor, 1,000 ind/m2), incubated in microcosms for 4 months at 19±1°C, were compared. The analysis of active microbial communities by SSU rRNA barcoding shows that the main observable changes are due to the presence of H. diversicolor. By contrast, oil addition is the main factor explaining the observed changes in gene expression patterns, with 1,949 genes specifically up- or down-regulated (versus only 245 genes when only H. diversicolor worms are added). In particular, the oil contamination leads to a marked overexpression (i) of benzylsuccinate and alkylsuccinate synthase genes (bss and ass), which are involved in the anaerobic metabolism of aromatics (toluene) and alkanes, respectively, and (ii) of genes coding for nucleotide excision repair exonucleases, indicating that DNA repair processes are also activated.
Enhanced isoprenoid production from xylose by engineered Saccharomyces cerevisiae.
Kwak, Suryang; Kim, Soo Rin; Xu, Haiqing; Zhang, Guo-Chang; Lane, Stephan; Kim, Heejin; Jin, Yong-Su
2017-11-01
Saccharomyces cerevisiae has limited capabilities for producing fuels and chemicals derived from acetyl-CoA, such as isoprenoids, due to a rigid flux partition toward ethanol during glucose metabolism. Despite numerous efforts, xylose fermentation by engineered yeast harboring heterologous xylose metabolic pathways was not as efficient as glucose fermentation for producing ethanol. Therefore, we hypothesized that xylose metabolism by engineered yeast might be a better fit for producing non-ethanol metabolites. We indeed found that engineered S. cerevisiae on xylose showed higher expression levels of the enzymes involved in ethanol assimilation and cytosolic acetyl-CoA synthesis than on glucose. When genetic perturbations necessary for overproducing squalene and amorphadiene were introduced into engineered S. cerevisiae capable of fermenting xylose, we observed higher titers and yields of isoprenoids under xylose than glucose conditions. Specifically, co-overexpression of a truncated HMG1 (tHMG1) and ERG10 led to substantially higher squalene accumulation under xylose than glucose conditions. In contrast to glucose utilization, which produced massive amounts of ethanol regardless of aeration, xylose utilization allowed much lower ethanol accumulation, indicating ethanol is simultaneously re-assimilated with xylose consumption and utilized for the biosynthesis of cytosolic acetyl-CoA. In addition, xylose utilization by engineered yeast with overexpression of tHMG1, ERG10, and ADS coding for amorphadiene synthase, and the down-regulation of ERG9, resulted in enhanced amorphadiene production as compared to glucose utilization. These results suggest that the problem of the rigid flux partition toward ethanol production in yeast during the production of isoprenoids and other acetyl-CoA derived chemicals can be bypassed by using xylose instead of glucose as a carbon source. Biotechnol. Bioeng. 2017;114: 2581-2591. © 2017 Wiley Periodicals, Inc.
Agave: a biofuel feedstock for arid and semi-arid environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gross, Stephen; Martin, Jeffrey; Simpson, June
2011-05-31
Efficient production of plant-based, lignocellulosic biofuels relies upon continued improvement of existing biofuel feedstock species, as well as the introduction of new feedstocks capable of growing on marginal lands to avoid conflicts with existing food production and minimize use of water and nitrogen resources. To this end, species within the plant genus Agave have recently been proposed as new biofuel feedstocks. Many Agave species are adapted to hot and arid environments generally unsuitable for food production, yet have biomass productivity rates comparable to other second-generation biofuel feedstocks such as switchgrass and Miscanthus. Agaves achieve remarkable heat tolerance and water use efficiency in part through a Crassulacean Acid Metabolism (CAM) mode of photosynthesis, but the genes and regulatory pathways enabling CAM and thermotolerance in agaves remain poorly understood. We seek to accelerate the development of agave as a new biofuel feedstock through genomic approaches using massively-parallel sequencing technologies. First, we plan to sequence the transcriptome of A. tequilana to provide a database of protein-coding genes to the agave research community. Second, we will compare transcriptome-wide gene expression of agaves under different environmental conditions in order to understand genetic pathways controlling CAM, water use efficiency, and thermotolerance. Finally, we aim to compare the transcriptome of A. tequilana with that of other Agave species to gain further insight into molecular mechanisms underlying traits desirable for biofuel feedstocks. These genomic approaches will provide sequence and gene expression information critical to the breeding and domestication of Agave species suitable for biofuel production.
Robledo, Marta; Peregrina, Alexandra; Millán, Vicenta; García-Tomsig, Natalia I; Torres-Quesada, Omar; Mateos, Pedro F; Becker, Anke; Jiménez-Zurdo, José I
2017-07-01
Small non-coding RNAs (sRNAs) are expected to have pivotal roles in the adaptive responses underlying symbiosis of nitrogen-fixing rhizobia with legumes. Here, we provide primary insights into the function and activity mechanism of the Sinorhizobium meliloti trans-sRNA NfeR1 (Nodule Formation Efficiency RNA). Northern blot probing and transcription tracking with fluorescent promoter-reporter fusions unveiled high nfeR1 expression in response to salt stress and throughout the symbiotic interaction. The strength and differential regulation of nfeR1 transcription are conferred by a motif, which is conserved in nfeR1 promoter regions in α-proteobacteria. NfeR1 loss-of-function compromised osmoadaptation of free-living bacteria, whilst causing misregulation of salt-responsive genes related to stress adaptation, osmolytes catabolism and membrane trafficking. Nodulation tests revealed that lack of NfeR1 affected competitiveness, infectivity, nodule development and symbiotic efficiency of S. meliloti on alfalfa roots. Comparative computer predictions and a genetic reporter assay evidenced a redundant role of three identical NfeR1 unpaired anti Shine-Dalgarno motifs for targeting and downregulation of translation of multiple mRNAs from transporter genes. Our data provide genetic evidence of the hyperosmotic conditions of the endosymbiotic compartments. NfeR1-mediated gene regulation in response to this cue could contribute to coordinate nutrient uptake with the metabolic reprogramming concomitant to symbiotic transitions. © 2017 Society for Applied Microbiology and John Wiley & Sons Ltd.
3D video coding: an overview of present and upcoming standards
NASA Astrophysics Data System (ADS)
Merkle, Philipp; Müller, Karsten; Wiegand, Thomas
2010-07-01
An overview of existing and upcoming 3D video coding standards is given. Various different 3D video formats are available, each with individual pros and cons. The 3D video formats can be separated into two classes: video-only formats (such as stereo and multiview video) and depth-enhanced formats (such as video plus depth and multiview video plus depth). Since all these formats exist of at least two video sequences and possibly additional depth data, efficient compression is essential for the success of 3D video applications and technologies. For the video-only formats the H.264 family of coding standards already provides efficient and widely established compression algorithms: H.264/AVC simulcast, H.264/AVC stereo SEI message, and H.264/MVC. For the depth-enhanced formats standardized coding algorithms are currently being developed. New and specially adapted coding approaches are necessary, as the depth or disparity information included in these formats has significantly different characteristics than video and is not displayed directly, but used for rendering. Motivated by evolving market needs, MPEG has started an activity to develop a generic 3D video standard within the 3DVC ad-hoc group. Key features of the standard are efficient and flexible compression of depth-enhanced 3D video representations and decoupling of content creation and display requirements.
Garst, Andrew D; Bassalo, Marcelo C; Pines, Gur; Lynch, Sean A; Halweg-Edwards, Andrea L; Liu, Rongming; Liang, Liya; Wang, Zhiwen; Zeitoun, Ramsey; Alexander, William G; Gill, Ryan T
2017-01-01
Improvements in DNA synthesis and sequencing have underpinned comprehensive assessment of gene function in bacteria and eukaryotes. Genome-wide analyses require high-throughput methods to generate mutations and analyze their phenotypes, but approaches to date have been unable to efficiently link the effects of mutations in coding regions or promoter elements in a highly parallel fashion. We report that CRISPR-Cas9 gene editing in combination with massively parallel oligomer synthesis can enable trackable editing on a genome-wide scale. Our method, CRISPR-enabled trackable genome engineering (CREATE), links each guide RNA to homologous repair cassettes that both edit loci and function as barcodes to track genotype-phenotype relationships. We apply CREATE to site saturation mutagenesis for protein engineering, reconstruction of adaptive laboratory evolution experiments, and identification of stress tolerance and antibiotic resistance genes in bacteria. We provide preliminary evidence that CREATE will work in yeast. We also provide a webtool to design multiplex CREATE libraries.
Construction of a lactose-assimilating strain of baker's yeast.
Adam, A C; Prieto, J A; Rubio-Texeira, M; Polaina, J
1999-09-30
A recombinant strain of baker's yeast has been constructed which can assimilate lactose efficiently. This strain has been designed to allow its propagation in whey, the byproduct resulting from cheese-making. The ability to metabolize lactose is conferred by the functional expression of two genes from Kluyveromyces lactis, LAC12 and LAC4, which encode a lactose permease and a beta-galactosidase, respectively. To make the recombinant strain more acceptable for its use in bread-making, the genetic transformation of the host baker's yeast was carried out with linear fragments of DNA of defined sequence, carrying as the only heterologous material the coding regions of the two K. lactis genes. Growth of the new strain on cheese whey affected neither the quality of bread nor the yeast gassing power. The significance of the newly developed strain is two-fold: it affords a cheap alternative to the procedure generally used for the propagation of baker's yeast, and it offers a profitable use for cheese whey. Copyright 1999 John Wiley & Sons, Ltd.
Comparative genomics of biotechnologically important yeasts
Riley, Robert; Haridas, Sajeet; Wolfe, Kenneth H.; Lopes, Mariana R.; Hittinger, Chris Todd; Göker, Markus; Salamov, Asaf A.; Wisecaver, Jennifer H.; Long, Tanya M.; Aerts, Andrea L.; Barry, Kerrie W.; Choi, Cindy; Clum, Alicia; Coughlan, Aisling Y.; Deshpande, Shweta; Douglass, Alexander P.; Hanson, Sara J.; Klenk, Hans-Peter; LaButti, Kurt M.; Lapidus, Alla; Lindquist, Erika A.; Lipzen, Anna M.; Meier-Kolthoff, Jan P.; Ohm, Robin A.; Otillar, Robert P.; Pangilinan, Jasmyn L.; Peng, Yi; Rosa, Carlos A.; Scheuner, Carmen; Sibirny, Andriy A.; Slot, Jason C.; Stielow, J. Benjamin; Sun, Hui; Kurtzman, Cletus P.; Blackwell, Meredith; Grigoriev, Igor V.
2016-01-01
Ascomycete yeasts are metabolically diverse, with great potential for biotechnology. Here, we report the comparative genome analysis of 29 taxonomically and biotechnologically important yeasts, including 16 newly sequenced. We identify a genetic code change, CUG-Ala, in Pachysolen tannophilus in the clade sister to the known CUG-Ser clade. Our well-resolved yeast phylogeny shows that some traits, such as methylotrophy, are restricted to single clades, whereas others, such as l-rhamnose utilization, have patchy phylogenetic distributions. Gene clusters, with variable organization and distribution, encode many pathways of interest. Genomics can predict some biochemical traits precisely, but the genomic basis of others, such as xylose utilization, remains unresolved. Our data also provide insight into early evolution of ascomycetes. We document the loss of H3K9me2/3 heterochromatin, the origin of ascomycete mating-type switching, and panascomycete synteny at the MAT locus. These data and analyses will facilitate the engineering of efficient biosynthetic and degradative pathways and gateways for genomic manipulation. PMID:27535936
Nakajima, Kazuki; Ito, Emi; Ohtsubo, Kazuaki; Shirato, Ken; Takamiya, Rina; Kitazume, Shinobu; Angata, Takashi; Taniguchi, Naoyuki
2013-01-01
Nucleotide sugars are the donor substrates of various glycosyltransferases, and an important building block in N- and O-glycan biosynthesis. Their intracellular concentrations are regulated by cellular metabolic states including diseases such as cancer and diabetes. To investigate the fate of UDP-GlcNAc, we developed a tracing method for UDP-GlcNAc synthesis and use, and GlcNAc utilization using 13C6-glucose and 13C2-glucosamine, respectively, followed by the analysis of mass isotopomers using LC-MS. Metabolic labeling of cultured cells with 13C6-glucose and the analysis of isotopomers of UDP-HexNAc (UDP-GlcNAc plus UDP-GalNAc) and CMP-NeuAc revealed the relative contributions of metabolic pathways leading to UDP-GlcNAc synthesis and use. In pancreatic insulinoma cells, the labeling efficiency of a 13C6-glucose motif in CMP-NeuAc was lower compared with that in hepatoma cells. Using 13C2-glucosamine, the diversity of the labeling efficiency was observed in each sugar residue of N- and O-glycans on the basis of isotopomer analysis. In the insulinoma cells, the low labeling efficiencies were found for sialic acids as well as tri- and tetra-sialo N-glycans, whereas asialo N-glycans were found to be abundant. Essentially no significant difference in secreted hyaluronic acids was found among hepatoma and insulinoma cell lines. This indicates that metabolic flows are responsible for the low sialylation in the insulinoma cells. Our strategy should be useful for systematically tracing each stage of cellular GlcNAc metabolism. PMID:23720760
Solute Model or Cellular Energy Model: Practical and Theoretical Aspects of Thirst During Exercise
1989-02-16
…are two weaker inhibitors of Na-K ATPase. D2O had the same inhibitory effects when used as the solvent for hypertonic saline in goats… of 1.4 osmoles of metabolic end-products (mostly urea and surplus electrolytes) per liter of urine on a mixed European-style diet. Thus, the…
Trellises and Trellis-Based Decoding Algorithms for Linear Block Codes. Part 3
NASA Technical Reports Server (NTRS)
Lin, Shu
1998-01-01
Decoding algorithms based on the trellis representation of a code (block or convolutional) drastically reduce decoding complexity. The best known and most commonly used trellis-based decoding algorithm is the Viterbi algorithm. It is a maximum likelihood decoding algorithm. Convolutional codes with the Viterbi decoding have been widely used for error control in digital communications over the last two decades. This chapter is concerned with the application of the Viterbi decoding algorithm to linear block codes. First, the Viterbi algorithm is presented. Then, optimum sectionalization of a trellis to minimize the computational complexity of a Viterbi decoder is discussed and an algorithm is presented. Some design issues for IC (integrated circuit) implementation of a Viterbi decoder are considered and discussed. Finally, a new decoding algorithm based on the principle of compare-select-add is presented. This new algorithm can be applied to both block and convolutional codes and is more efficient than the conventional Viterbi algorithm based on the add-compare-select principle. This algorithm is particularly efficient for rate 1/n antipodal convolutional codes and their high-rate punctured codes. It reduces computational complexity by one-third compared with the Viterbi algorithm.
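To make the trellis recursion concrete, the following is a small, self-contained hard-decision Viterbi decoder for a rate-1/2, constraint-length-3 convolutional code (generators 7 and 5 octal), chosen purely as an illustration. It follows the conventional add-compare-select formulation rather than the compare-select-add variant introduced in the chapter, and the survivor-path bookkeeping is deliberately naive.

    # Rate-1/2, constraint-length-3 convolutional code (generators 7 and 5 octal),
    # used only to illustrate the add-compare-select recursion on a 4-state trellis.
    G = (0b111, 0b101)
    N_STATES = 4  # 2**(K-1) with K = 3

    def encode(bits):
        state, out = 0, []
        for b in bits:
            reg = (b << 2) | state                    # [current bit, previous two bits]
            out.extend([bin(reg & g).count("1") & 1 for g in G])
            state = reg >> 1
        return out

    def viterbi_decode(received):
        INF = float("inf")
        metrics = [0] + [INF] * (N_STATES - 1)        # accumulated Hamming distance per state
        paths = [[] for _ in range(N_STATES)]
        for i in range(0, len(received), 2):
            r = received[i:i + 2]
            new_metrics = [INF] * N_STATES
            new_paths = [[] for _ in range(N_STATES)]
            for state in range(N_STATES):
                if metrics[state] == INF:
                    continue
                for b in (0, 1):
                    reg = (b << 2) | state
                    expected = [bin(reg & g).count("1") & 1 for g in G]
                    branch = sum(x != y for x, y in zip(expected, r))
                    cand = metrics[state] + branch    # add ...
                    nxt = reg >> 1
                    if cand < new_metrics[nxt]:       # ... compare, select
                        new_metrics[nxt] = cand
                        new_paths[nxt] = paths[state] + [b]
            metrics, paths = new_metrics, new_paths
        return paths[metrics.index(min(metrics))]

    # With an error-free channel, viterbi_decode(encode(bits)) returns the original bits.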
Nuclear shell model code CRUNCHER
DOE Office of Scientific and Technical Information (OSTI.GOV)
Resler, D.A.; Grimes, S.M.
1988-05-01
A new nuclear shell model code CRUNCHER, patterned after the code VLADIMIR, has been developed. While CRUNCHER and VLADIMIR employ the techniques of an uncoupled basis and the Lanczos process, improvements in the new code allow it to handle much larger problems than the previous code and to perform them more efficiently. Tests involving a moderately sized calculation indicate that CRUNCHER running on a SUN 3/260 workstation requires approximately one-half the central processing unit (CPU) time required by VLADIMIR running on a CRAY-1 supercomputer.
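As background on the Lanczos process mentioned above, the sketch below estimates the lowest eigenvalue of a large symmetric matrix given only a matrix-vector product, which is the setting a shell-model code works in. It omits reorthogonalization and all nuclear-structure bookkeeping, so it is only a schematic of the numerical idea, with parameter choices of our own.

    import numpy as np

    def lanczos_ground_state(apply_h, dim, n_iter=60, seed=0):
        """Plain Lanczos iteration: build a small tridiagonal matrix whose lowest
        eigenvalue approximates that of the full operator apply_h (no reorthogonalization)."""
        rng = np.random.default_rng(seed)
        v = rng.standard_normal(dim)
        v /= np.linalg.norm(v)
        v_prev = np.zeros(dim)
        alphas, betas = [], []
        beta = 0.0
        for _ in range(n_iter):
            w = apply_h(v) - beta * v_prev
            alpha = np.dot(w, v)
            w -= alpha * v
            beta = np.linalg.norm(w)
            alphas.append(alpha)
            betas.append(beta)
            if beta < 1e-12:          # invariant subspace found; stop early
                break
            v_prev, v = v, w / beta
        T = np.diag(alphas) + np.diag(betas[:-1], 1) + np.diag(betas[:-1], -1)
        return np.linalg.eigvalsh(T)[0]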
Selective reduction of N-oxides to amines: application to drug metabolism.
Kulanthaivel, Palaniappan; Barbuch, Robert J; Davidson, Rita S; Yi, Ping; Rener, Gregory A; Mattiuz, Edward L; Hadden, Chad E; Goodwin, Lawrence A; Ehlhardt, William J
2004-09-01
Phase I oxidative metabolism of nitrogen-containing drug molecules to their corresponding N-oxides is a common occurrence. There are instances where liquid chromatography/tandem mass spectrometry techniques are inadequate to distinguish this pathway from other oxidation processes, including C-hydroxylations and other heteroatom oxidations, such as sulfur to sulfoxide. Therefore, the purpose of the present study was to develop and optimize an efficient and practical chemical method to selectively convert N-oxides to their corresponding amines suitable for drug metabolism applications. Our results indicated that efficient conversion of N-oxides to amines could be achieved with TiCl3 and poly(methylhydrosiloxane). Among them, we found TiCl3 to be a facile and easy-to-use reagent, specifically applicable to drug metabolism. There are a few reports describing the use of TiCl3 to reduce N-O bonds in drug metabolism studies, but this methodology has not been widely used. Our results indicated that TiCl3 is nearly as efficient when the reductions were carried out in the presence of biological matrices, including plasma and urine. Finally, we have shown a number of examples where TiCl3 can be successfully used to selectively reduce N-oxides in the presence of sulfoxides and other labile groups.
Lu, Yi; Chen, Bin; Feng, Kuishuang; Hubacek, Klaus
2015-06-16
Energy production and industrial processes are crucial economic sectors accounting for about 62% of greenhouse gas (GHG) emissions globally in 2012. Eco-industrial parks are practical attempts to mitigate GHG emissions through cooperation among businesses and the local community in order to reduce waste and pollution, efficiently share resources, and help with the pursuit of sustainable development. This work developed a framework based on ecological network analysis to trace carbon metabolic processes in eco-industrial parks and applied it to a typical eco-industrial park in Beijing. Our findings show that the entire metabolic system is dominated by supply of primary goods from the external environment and final demand. The more carbon flows through a sector, the more influence it would exert upon the whole system. External environment and energy providers are the most active and dominating part of the carbon metabolic system, which should be the first target to mitigate emissions by increasing efficiencies. The carbon metabolism of the eco-industrial park can be seen as an evolutionary system with high levels of efficiency, but this may come at the expense of larger levels of resilience. This work may provide a useful modeling framework for low-carbon design and management of industrial parks.
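For readers unfamiliar with the network-analysis bookkeeping, the toy example below traces flows through a three-sector system and computes throughflows plus an integral (direct and indirect) flow matrix. The sector count and numbers are invented for illustration and are not the Beijing park's data; only the generic accounting identity is shown.

    import numpy as np

    # F[i, j] = carbon flow from sector j to sector i; z = input received directly
    # from the external environment (illustrative values only).
    F = np.array([[0.0, 0.0, 0.0],
                  [8.0, 0.0, 1.0],
                  [0.0, 5.0, 0.0]])
    z = np.array([10.0, 0.0, 0.0])

    T = F.sum(axis=1) + z              # throughflow of each sector
    G = F / T                          # G[i, j]: flow from j to i per unit of j's throughflow
    N = np.linalg.inv(np.eye(3) - G)   # integral (direct + indirect) flow matrix

    print(T)       # sectors with large throughflow dominate the metabolic system
    print(N @ z)   # reproduces T, confirming the direct/indirect decomposition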
Optimal Near-Hitless Network Failure Recovery Using Diversity Coding
ERIC Educational Resources Information Center
Avci, Serhat Nazim
2013-01-01
Link failures in wide area networks are common and cause significant data losses. Mesh-based protection schemes offer high capacity efficiency, but they are slow, require complex signaling, and are unstable. Diversity coding is a proactive, coding-based recovery technique that offers near-hitless (sub-ms) restoration with a competitive spare capacity…
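The essence of diversity coding can be shown with a one-for-N XOR protection scheme: a spare link carries the bitwise XOR of the working links, so any single failed link is restored from the survivors without retransmission or complex signaling. The equal-length blocks and framing below are simplifying assumptions of ours.

    from functools import reduce

    def xor_blocks(blocks):
        # Bitwise XOR of equal-length byte blocks.
        return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

    def protection_block(data_blocks):
        return xor_blocks(data_blocks)            # sent on the spare (protection) link

    def recover_lost(data_blocks, parity, lost_index):
        survivors = [b for i, b in enumerate(data_blocks) if i != lost_index]
        return xor_blocks(survivors + [parity])   # XOR of survivors and parity restores the lost block

    # e.g. recover_lost([b"abcd", b"wxyz", b"1234"],
    #                   protection_block([b"abcd", b"wxyz", b"1234"]), 1) == b"wxyz"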
Visual Search Asymmetries within Color-Coded and Intensity-Coded Displays
ERIC Educational Resources Information Center
Yamani, Yusuke; McCarley, Jason S.
2010-01-01
Color and intensity coding provide perceptual cues to segregate categories of objects within a visual display, allowing operators to search more efficiently for needed information. Even within a perceptually distinct subset of display elements, however, it may often be useful to prioritize items representing urgent or task-critical information.…
Computer Code Aids Design Of Wings
NASA Technical Reports Server (NTRS)
Carlson, Harry W.; Darden, Christine M.
1993-01-01
AERO2S computer code developed to aid design engineers in selection and evaluation of aerodynamically efficient wing/canard and wing/horizontal-tail configurations that includes simple hinged-flap systems. Code rapidly estimates longitudinal aerodynamic characteristics of conceptual airplane lifting-surface arrangements. Developed in FORTRAN V on CDC 6000 computer system, and ported to MS-DOS environment.
Error Control Coding Techniques for Space and Satellite Communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.; Takeshita, Oscar Y.; Cabral, Hermano A.; He, Jiali; White, Gregory S.
1997-01-01
Turbo coding using iterative SOVA decoding and M-ary differentially coherent or non-coherent modulation can provide an effective coding modulation solution: (1) Energy efficient with relatively simple SOVA decoding and small packet lengths, depending on BEP required; (2) Low number of decoding iterations required; and (3) Robustness in fading with channel interleaving.
LSENS, The NASA Lewis Kinetics and Sensitivity Analysis Code
NASA Technical Reports Server (NTRS)
Radhakrishnan, K.
2000-01-01
A general chemical kinetics and sensitivity analysis code for complex, homogeneous, gas-phase reactions is described. The main features of the code, LSENS (the NASA Lewis kinetics and sensitivity analysis code), are its flexibility, efficiency and convenience in treating many different chemical reaction models. The models include: static system; steady, one-dimensional, inviscid flow; incident-shock initiated reaction in a shock tube; and a perfectly stirred reactor. In addition, equilibrium computations can be performed for several assigned states. An implicit numerical integration method (LSODE, the Livermore Solver for Ordinary Differential Equations), which works efficiently for the extremes of very fast and very slow reactions, is used to solve the "stiff" ordinary differential equation systems that arise in chemical kinetics. For static reactions, the code uses the decoupled direct method to calculate sensitivity coefficients of the dependent variables and their temporal derivatives with respect to the initial values of dependent variables and/or the rate coefficient parameters. Solution methods for the equilibrium and post-shock conditions and for perfectly stirred reactor problems are either adapted from or based on the procedures built into the NASA code CEA (Chemical Equilibrium and Applications).
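As a small illustration of the stiff-kinetics integration problem that LSENS handles with LSODE, the sketch below integrates a toy two-reaction system whose rate constants differ by six orders of magnitude, using SciPy's implicit BDF solver in place of LSODE. The species, rate constants, and tolerances are invented for the example and have no connection to the code's reaction models.

    import numpy as np
    from scipy.integrate import solve_ivp

    def rhs(t, y, k1=1.0e4, k2=1.0e-2):
        a, b, c = y
        r1 = k1 * a          # fast reaction: A -> B
        r2 = k2 * b          # slow reaction: B -> C
        return [-r1, r1 - r2, r2]

    sol = solve_ivp(rhs, (0.0, 1000.0), [1.0, 0.0, 0.0],
                    method="BDF", rtol=1e-8, atol=1e-12)
    print(sol.y[:, -1])      # nearly all mass ends up in C, despite rate constants 10^6 apart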
Medical Ultrasound Video Coding with H.265/HEVC Based on ROI Extraction
Wu, Yueying; Liu, Pengyu; Gao, Yuan; Jia, Kebin
2016-01-01
High-efficiency video compression technology is of primary importance to the storage and transmission of digital medical video in modern medical communication systems. To further improve the compression performance of medical ultrasound video, two innovative technologies based on diagnostic region-of-interest (ROI) extraction using the high efficiency video coding (H.265/HEVC) standard are presented in this paper. First, an effective ROI extraction algorithm based on image textural features is proposed to strengthen the applicability of ROI detection results in the H.265/HEVC quad-tree coding structure. Second, a hierarchical coding method based on transform coefficient adjustment and a quantization parameter (QP) selection process is designed to implement the otherness encoding for ROIs and non-ROIs. Experimental results demonstrate that the proposed optimization strategy significantly improves the coding performance by achieving a BD-BR reduction of 13.52% and a BD-PSNR gain of 1.16 dB on average compared to H.265/HEVC (HM15.0). The proposed medical video coding algorithm is expected to satisfy low bit-rate compression requirements for modern medical communication systems. PMID:27814367
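A deliberately simplified sketch of the hierarchical idea, spending bits on the diagnostic ROI and saving them on the background by shifting the quantization parameter per coding block, is given below. The texture test and QP offsets are placeholder assumptions of ours, not the paper's ROI extraction or transform-coefficient adjustment method.

    import numpy as np

    def is_roi_block(block, texture_threshold=20.0):
        # Crude texture test standing in for ROI extraction: diagnostically relevant
        # ultrasound regions tend to be more textured than the dark background.
        return np.std(block.astype(float)) > texture_threshold

    def qp_map(blocks, base_qp=32, roi_offset=-6, bg_offset=+4, qp_min=0, qp_max=51):
        # One QP per block: finer quantization inside the ROI, coarser outside.
        return [int(np.clip(base_qp + (roi_offset if is_roi_block(b) else bg_offset),
                            qp_min, qp_max))
                for b in blocks]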
Classification Techniques for Digital Map Compression
1989-03-01
…classification improved the performance of the K-means classification algorithm, resulting in a compression of 8.06:1 with Lempel-Ziv coding. Run-length coding… compression performance are run-length coding [2], [8] and Lempel-Ziv coding [10], [11]. These techniques are chosen because they are most efficient when… investigated. After the classification, some standard file compression methods, such as Lempel-Ziv and run-length encoding, were applied to the…
A long-term, integrated impact assessment of alternative building energy code scenarios in China
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Sha; Eom, Jiyong; Evans, Meredydd
2014-04-01
China is the second largest building energy user in the world, ranking first and third in residential and commercial energy consumption. Beginning in the early 1980s, the Chinese government has developed a variety of building energy codes to improve building energy efficiency and reduce total energy demand. This paper studies the impact of building energy codes on energy use and CO2 emissions by using a detailed building energy model that represents four distinct climate zones each with three building types, nested in a long-term integrated assessment framework GCAM. An advanced building stock module, coupled with the building energy model, is developed to reflect the characteristics of future building stock and its interaction with the development of building energy codes in China. This paper also evaluates the impacts of building codes on building energy demand in the presence of economy-wide carbon policy. We find that building energy codes would reduce Chinese building energy use by 13%-22% depending on building code scenarios, with a similar effect preserved even under the carbon policy. The impact of building energy codes shows regional and sectoral variation due to regionally differentiated responses of heating and cooling services to shell efficiency improvement.
Welter, David E.; Doherty, John E.; Hunt, Randall J.; Muffels, Christopher T.; Tonkin, Matthew J.; Schreuder, Willem A.
2012-01-01
An object-oriented parameter estimation code was developed to incorporate benefits of object-oriented programming techniques for solving large parameter estimation modeling problems. The code is written in C++ and is a formulation and expansion of the algorithms included in PEST, a widely used parameter estimation code written in Fortran. The new code is called PEST++ and is designed to lower the barriers of entry for users and developers while providing efficient algorithms that can accommodate large, highly parameterized problems. This effort has focused on (1) implementing the most popular features of PEST in a fashion that is easy for novice or experienced modelers to use and (2) creating a software design that is easy to extend; that is, this effort provides a documented object-oriented framework designed from the ground up to be modular and extensible. In addition, all PEST++ source code and its associated libraries, as well as the general run manager source code, have been integrated in the Microsoft Visual Studio® 2010 integrated development environment. The PEST++ code is designed to provide a foundation for an open-source development environment capable of producing robust and efficient parameter estimation tools for the environmental modeling community into the future.
Russ, Daniel E.; Ho, Kwan-Yuet; Colt, Joanne S.; Armenti, Karla R.; Baris, Dalsu; Chow, Wong-Ho; Davis, Faith; Johnson, Alison; Purdue, Mark P.; Karagas, Margaret R.; Schwartz, Kendra; Schwenn, Molly; Silverman, Debra T.; Johnson, Calvin A.; Friesen, Melissa C.
2016-01-01
Background: Mapping job titles to standardized occupation classification (SOC) codes is an important step in identifying occupational risk factors in epidemiologic studies. Because manual coding is time-consuming and has moderate reliability, we developed an algorithm called SOCcer (Standardized Occupation Coding for Computer-assisted Epidemiologic Research) to assign SOC-2010 codes based on free-text job description components. Methods: Job title and task-based classifiers were developed by comparing job descriptions to multiple sources linking job and task descriptions to SOC codes. An industry-based classifier was developed based on the SOC prevalence within an industry. These classifiers were used in a logistic model trained using 14,983 jobs with expert-assigned SOC codes to obtain empirical weights for an algorithm that scored each SOC/job description. We assigned the highest scoring SOC code to each job. SOCcer was validated in two occupational data sources by comparing SOC codes obtained from SOCcer to expert assigned SOC codes and lead exposure estimates obtained by linking SOC codes to a job-exposure matrix. Results: For 11,991 case-control study jobs, SOCcer-assigned codes agreed with 44.5% and 76.3% of manually assigned codes at the 6- and 2-digit level, respectively. Agreement increased with the score, providing a mechanism to identify assignments needing review. Good agreement was observed between lead estimates based on SOCcer and manual SOC assignments (kappa: 0.6–0.8). Poorer performance was observed for inspection job descriptions, which included abbreviations and worksite-specific terminology. Conclusions: Although some manual coding will remain necessary, using SOCcer may improve the efficiency of incorporating occupation into large-scale epidemiologic studies. PMID:27102331
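A hedged sketch of the scoring step, combining component classifier scores with logistic weights and keeping the winning score so low-confidence assignments can be flagged for review, is shown below. The weights and feature values are placeholders of our own; the real model's weights were fit to the expert-coded training jobs.

    import math

    def soc_score(title_score, task_score, industry_score, weights=(-2.0, 3.1, 1.4, 0.8)):
        # Logistic combination of the three component classifier scores (weights are placeholders).
        b0, b1, b2, b3 = weights
        z = b0 + b1 * title_score + b2 * task_score + b3 * industry_score
        return 1.0 / (1.0 + math.exp(-z))

    def assign_soc(candidates):
        # candidates: dict mapping SOC code -> (title, task, industry) component scores.
        scored = {soc: soc_score(*features) for soc, features in candidates.items()}
        best = max(scored, key=scored.get)
        return best, scored[best]   # keep the score so weak assignments can be reviewed manually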
The components of crop productivity: measuring and modeling plant metabolism
NASA Technical Reports Server (NTRS)
Bugbee, B.
1995-01-01
Several investigators in the CELSS program have demonstrated that crop plants can be remarkably productive in optimal environments where plants are limited only by incident radiation. Radiation use efficiencies of 0.4 to 0.7 g biomass per mol of incident photons have been measured for crops in several laboratories. Some early published values for radiation use efficiency (1 g mol-1) were inflated due to the effect of side lighting. Sealed chambers are the basic research module for crop studies for space. Such chambers allow the measurement of radiation and CO2 fluxes, thus providing values for three determinants of plant growth: radiation absorption, photosynthetic efficiency (quantum yield), and respiration efficiency (carbon use efficiency). Continuous measurement of each of these parameters over the plant life cycle has provided a blueprint for daily growth rates, and is the basis for modeling crop productivity based on component metabolic processes. Much of what has been interpreted as low photosynthetic efficiency is really the result of reduced leaf expansion and poor radiation absorption. Measurements and models of short-term (minutes to hours) and long-term (days to weeks) plant metabolic rates have enormously improved our understanding of plant environment interactions in ground-based growth chambers and are critical to understanding plant responses to the space environment.
Catabolic efficiency of aerobic glycolysis: the Warburg effect revisited.
Vazquez, Alexei; Liu, Jiangxia; Zhou, Yi; Oltvai, Zoltán N
2010-05-06
Cancer cells simultaneously exhibit glycolysis with lactate secretion and mitochondrial respiration even in the presence of oxygen, a phenomenon known as the Warburg effect. The maintenance of this mixed metabolic phenotype is seemingly counterintuitive given that aerobic glycolysis is far less efficient in terms of ATP yield per moles of glucose than mitochondrial respiration. Here, we resolve this apparent contradiction by expanding the notion of metabolic efficiency. We study a reduced flux balance model of ATP production that is constrained by the glucose uptake capacity and by the solvent capacity of the cell's cytoplasm, the latter quantifying the maximum amount of macromolecules that can occupy the intracellular space. At low glucose uptake rates we find that mitochondrial respiration is indeed the most efficient pathway for ATP generation. Above a threshold glucose uptake rate, however, a gradual activation of aerobic glycolysis and slight decrease of mitochondrial respiration results in the highest rate of ATP production. Our analyses indicate that the Warburg effect is a favorable catabolic state for all rapidly proliferating mammalian cells with high glucose uptake capacity. It arises because while aerobic glycolysis is less efficient than mitochondrial respiration in terms of ATP yield per glucose uptake, it is more efficient in terms of the required solvent capacity. These results may have direct relevance to chemotherapeutic strategies attempting to target cancer metabolism.
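To make the trade-off concrete, the following is a minimal flux-balance sketch, not the published model, in which ATP production is maximized over oxidative and glycolytic glucose fluxes subject to a glucose uptake cap and a solvent-capacity cap; all parameter values are hypothetical.

```python
# Minimal sketch, not the published model: a toy linear program in which a
# solvent-capacity constraint makes aerobic glycolysis favorable at high
# glucose uptake.  All parameter values are hypothetical.
from scipy.optimize import linprog

ATP_OX, ATP_GLY = 30.0, 2.0    # assumed ATP yield per unit glucose flux
VOL_OX, VOL_GLY = 1.0, 0.05    # assumed solvent (enzyme volume) cost per unit flux
SOLVENT_CAP = 5.0              # assumed total solvent capacity of the cytoplasm

def optimal_fluxes(glucose_uptake_cap):
    """Maximize ATP rate over (oxidative, glycolytic) glucose fluxes."""
    c = [-ATP_OX, -ATP_GLY]                      # linprog minimizes, so negate
    A_ub = [[1.0, 1.0],                          # total glucose uptake limit
            [VOL_OX, VOL_GLY]]                   # solvent-capacity limit
    b_ub = [glucose_uptake_cap, SOLVENT_CAP]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    return res.x

for g in [1, 5, 10, 20, 50]:
    ox, gly = optimal_fluxes(g)
    print(f"uptake cap {g:>2}: oxidative={ox:5.2f}  glycolytic={gly:5.2f}  "
          f"ATP rate={ATP_OX * ox + ATP_GLY * gly:6.1f}")
```

With these made-up numbers the optimum is purely oxidative at low uptake and shifts toward a glycolytic mixture once the solvent constraint binds, mirroring the qualitative behavior described above.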
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Sha; Tan, Qing; Evans, Meredydd
India is expected to add 40 billion m2 of new buildings by 2050. Buildings are responsible for one third of India's total energy consumption today, and building energy use is expected to continue growing, driven by rapid income and population growth. The implementation of the Energy Conservation Building Code (ECBC) is one of the measures to improve building energy efficiency. Using the Global Change Assessment Model, this study assesses growth in the buildings sector and impacts of building energy policies in Gujarat, which would help the state adopt ECBC and expand building energy efficiency programs. Without building energy policies, building energy use in Gujarat would grow by 15 times in commercial buildings and 4 times in urban residential buildings between 2010 and 2050. ECBC improves energy efficiency in commercial buildings and could reduce building electricity use in Gujarat by 20% in 2050, compared to the no-policy scenario. Having energy codes for both commercial and residential buildings could result in an additional 10% savings in electricity use. To achieve these intended savings, it is critical to build capacity and institutions for robust code implementation.
McNamara, J P
2015-12-01
A major role of the dairy cow is to convert low-quality plant materials into high-quality protein and other nutrients for humans. We must select and manage cows with the goal of having animals of the greatest efficiency matched to their environment. We have increased efficiency tremendously over the years, yet the variation in productive and reproductive efficiency among animals is still large. In part, this is because of a lack of full integration of genetic, nutritional, and reproductive biology into management decisions. However, integration across these disciplines is increasing as the biological research findings show specific control points at which genetics, nutrition, and reproduction interact. An ordered systems biology approach that focuses on why and how cells regulate energy and N use and on how and why organs interact through endocrine and neurocrine mechanisms will speed improvements in efficiency. More sophisticated dairy managers will demand better information to improve the efficiency of their animals. Using genetic improvement and animal management to improve milk productive and reproductive efficiency requires a deeper understanding of metabolic processes throughout the life cycle. Using existing metabolic models, we can design experiments specifically to integrate data from global transcriptional profiling into models that describe nutrient use in farm animals. A systems modeling approach can help focus our research to make faster and larger advances in efficiency and determine how this knowledge can be applied on the farms.
NASA Astrophysics Data System (ADS)
Kajimoto, Tsuyoshi; Shigyo, Nobuhiro; Sanami, Toshiya; Ishibashi, Kenji; Haight, Robert C.; Fotiades, Nikolaos
2011-02-01
Absolute neutron response functions and detection efficiencies of an NE213 liquid scintillator that was 12.7 cm in diameter and 12.7 cm in thickness were measured for neutron energies between 15 and 600 MeV at the Weapons Neutron Research facility of the Los Alamos Neutron Science Center. The experiment was performed with continuous-energy neutrons on a spallation neutron source by 800-MeV proton incidence. The incident neutron flux was measured using a 238U fission ionization chamber. Measured response functions and detection efficiencies were compared with corresponding calculations using the SCINFUL-QMD code. The calculated and experimental values were in good agreement for data below 70 MeV. However, there were discrepancies in the energy region between 70 and 150 MeV. Thus, the code was partly modified and the revised code provided better agreement with the experimental data.
Some practical universal noiseless coding techniques, part 2
NASA Technical Reports Server (NTRS)
Rice, R. F.; Lee, J. J.
1983-01-01
This report is an extension of earlier work (Part 1) which provided practical adaptive techniques for the efficient noiseless coding of a broad class of data sources characterized by only partially known and varying statistics (JPL Publication 79-22). The results here, while still claiming such general applicability, focus primarily on the noiseless coding of image data. A fairly complete and self-contained treatment is provided. Particular emphasis is given to the requirements of the forthcoming Voyager II encounters of Uranus and Neptune. Performance evaluations are supported both graphically and pictorially. Expanded definitions of the algorithms in Part 1 yield a computationally improved set of options for applications requiring efficient performance at entropies above 4 bits/sample. These expanded definitions include, as an important subset, a somewhat less efficient but extremely simple "FAST" compressor which will be used at the Voyager Uranus encounter. Additionally, options are provided which enhance performance when atypical data spikes may be present.
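The split-sample options in this family of compressors are built around Rice (Golomb power-of-two) coding, in which each sample is encoded as a unary quotient plus a k-bit remainder. The sketch below is a generic textbook version for illustration only; the block-adaptive selection of k used in the actual options is omitted.

```python
# Generic Rice (Golomb power-of-two) coding sketch, not the flight software.
def rice_encode(values, k):
    """Encode a list of non-negative integers as a bit string."""
    bits = []
    for n in values:
        q, r = n >> k, n & ((1 << k) - 1)
        bits.append("1" * q + "0")                 # unary quotient, '0' terminator
        bits.append(format(r, f"0{k}b") if k else "")
    return "".join(bits)

def rice_decode(bitstring, k, count):
    """Decode `count` integers from the bit string produced above."""
    out, i = [], 0
    for _ in range(count):
        q = 0
        while bitstring[i] == "1":                 # read unary quotient
            q, i = q + 1, i + 1
        i += 1                                     # skip the '0' terminator
        r = int(bitstring[i:i + k], 2) if k else 0
        i += k
        out.append((q << k) | r)
    return out

data = [0, 3, 5, 12, 2, 7]
code = rice_encode(data, k=2)
assert rice_decode(code, k=2, count=len(data)) == data
```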
A generic efficient adaptive grid scheme for rocket propulsion modeling
NASA Technical Reports Server (NTRS)
Mo, J. D.; Chow, Alan S.
1993-01-01
The objective of this research is to develop an efficient, time-accurate numerical algorithm to discretize the Navier-Stokes equations for the prediction of internal one-dimensional, two-dimensional, and axisymmetric flows. A generic, efficient, elliptic adaptive grid generator is implicitly coupled with the Lower-Upper factorization scheme in the development of the ALUNS computer code. The calculations of one-dimensional shock tube wave propagation and two-dimensional shock wave capture, wave-wave interactions, and shock wave-boundary interactions show that the developed scheme is stable, accurate and extremely robust. The adaptive grid generator produced a very favorable grid network by a grid speed technique. This generic adaptive grid generator is also applied in the PARC and FDNS codes, and the computational results for solid rocket nozzle flowfield and crystal growth modeling by those codes will be presented at the conference as well. This research work is being supported by NASA/MSFC.
Hao, Kun; Jin, Zhigang; Shen, Haifeng; Wang, Ying
2015-05-28
Efficient routing protocols for data packet delivery are crucial to underwater sensor networks (UWSNs). However, communication in UWSNs is a challenging task because of the characteristics of the acoustic channel. Network coding is a promising technique for efficient data packet delivery thanks to the broadcast nature of acoustic channels and the relatively high computation capabilities of the sensor nodes. In this work, we present GPNC, a novel geographic routing protocol for UWSNs that incorporates partial network coding to encode data packets and uses sensor nodes' location information to greedily forward data packets to sink nodes. GPNC can effectively reduce network delays and retransmissions of redundant packets causing additional network energy consumption. Simulation results show that GPNC can significantly improve network throughput and packet delivery ratio, while reducing energy consumption and network latency when compared with other routing protocols.
Performance evaluation of the intra compression in the video coding standards
NASA Astrophysics Data System (ADS)
Abramowski, Andrzej
2015-09-01
The article presents a comparison of the Intra prediction algorithms in the current state-of-the-art video coding standards, including MJPEG 2000, VP8, VP9, H.264/AVC and H.265/HEVC. The effectiveness of techniques employed by each standard is evaluated in terms of compression efficiency and average encoding time. The compression efficiency is measured using BD-PSNR and BD-RATE metrics with H.265/HEVC results as an anchor. Tests are performed on a set of video sequences, composed of sequences gathered by the Joint Collaborative Team on Video Coding during the development of the H.265/HEVC standard and 4K sequences provided by Ultra Video Group. According to the results, H.265/HEVC provides significant bit-rate savings at the expense of computational complexity, while VP9 may be regarded as a compromise between efficiency and required encoding time.
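BD-PSNR and BD-RATE summarize the average gap between two rate-distortion curves. A minimal sketch of the standard Bjontegaard-delta rate calculation follows; the rate/PSNR points are illustrative, not the paper's measurements.

```python
# Hedged sketch of the standard BD-RATE computation used to compare codecs.
import numpy as np

def bd_rate(rate_anchor, psnr_anchor, rate_test, psnr_test):
    """Average bit-rate difference (%) of the test codec vs. the anchor."""
    la, lt = np.log(rate_anchor), np.log(rate_test)
    # Fit log-rate as a cubic polynomial of PSNR for each codec.
    pa = np.polyfit(psnr_anchor, la, 3)
    pt = np.polyfit(psnr_test, lt, 3)
    lo = max(min(psnr_anchor), min(psnr_test))
    hi = min(max(psnr_anchor), max(psnr_test))
    # Integrate both fits over the common PSNR interval.
    ia = np.polyval(np.polyint(pa), hi) - np.polyval(np.polyint(pa), lo)
    it = np.polyval(np.polyint(pt), hi) - np.polyval(np.polyint(pt), lo)
    avg_diff = (it - ia) / (hi - lo)
    return (np.exp(avg_diff) - 1.0) * 100.0        # percent rate change

# Example rate-distortion points (kbps, dB); a negative result means bit savings.
anchor = ([1000, 2000, 4000, 8000], [34.0, 36.5, 39.0, 41.5])
test   = ([ 900, 1800, 3600, 7200], [34.2, 36.8, 39.3, 41.8])
print(f"BD-RATE: {bd_rate(anchor[0], anchor[1], test[0], test[1]):.1f} %")
```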
A European mobile satellite system concept exploiting CDMA and OBP
NASA Technical Reports Server (NTRS)
Vernucci, A.; Craig, A. D.
1993-01-01
This paper describes a novel Land Mobile Satellite System (LMSS) concept applicable to networks allowing access to a large number of gateway stations ('Hubs'), utilizing low-cost Very Small Aperture Terminals (VSAT's). Efficient operation of the Forward-Link (FL) repeater can be achieved by adopting a synchronous Code Division Multiple Access (CDMA) technique, whereby inter-code interference (self-noise) is virtually eliminated by synchronizing orthogonal codes. However, with a transparent FL repeater, the requirements imposed by the highly decentralized ground segment can lead to significant efficiency losses. The adoption of a FL On-Board Processing (OBP) repeater is proposed as a means of largely recovering this efficiency impairment. The paper describes the network architecture, the system design and performance, the OBP functions and impact on implementation. The proposed concept, applicable to a future generation of the European LMSS, was developed in the context of a European Space Agency (ESA) study contract.
Prigent, Sylvain; Nielsen, Jens Christian; Frisvad, Jens Christian; Nielsen, Jens
2018-06-05
Modelling of metabolism at the genome-scale have proved to be an efficient method for explaining observed phenotypic traits in living organisms. Further, it can be used as a means of predicting the effect of genetic modifications e.g. for development of microbial cell factories. With the increasing amount of genome sequencing data available, a need exists to accurately and efficiently generate such genome-scale metabolic models (GEMs) of non-model organisms, for which data is sparse. In this study, we present an automatic reconstruction approach applied to 24 Penicillium species, which have potential for production of pharmaceutical secondary metabolites or used in the manufacturing of food products such as cheeses. The models were based on the MetaCyc database and a previously published Penicillium GEM, and gave rise to comprehensive genome-scale metabolic descriptions. The models proved that while central carbon metabolism is highly conserved, secondary metabolic pathways represent the main diversity among the species. The automatic reconstruction approach presented in this study can be applied to generate GEMs of other understudied organisms, and the developed GEMs are a useful resource for the study of Penicillium metabolism, for example with the scope of developing novel cell factories. This article is protected by copyright. All rights reserved. This article is protected by copyright. All rights reserved.
Wang, Mo; Ling, Jie; Chen, Ying; Song, Jie; Sun, E; Shi, Zi-Qi; Feng, Liang; Jia, Xiao-Bin; Wei, Ying-Jie
2017-11-01
The increasingly apparent liver injury problems of bone-strengthening Chinese medicines have brought challenges for clinical application, and it is necessary to consider both effectiveness and safety in screening anti-osteoporosis Chinese medicines. Metabolic transformation is closely related to drug efficacy and toxicity, so it is important to comprehensively consider metabolism-action/toxicity (M-Act/Tox) when screening anti-osteoporosis Chinese medicines. The current evaluation models and the number of compounds (including metabolites) severely restrict efficient screening in vivo. By referring to previous relevant research and domestic and international literature, a zebrafish M-Act/Tox integrative method is put forward for efficiently screening anti-osteoporosis herbal medicines, which organically integrates a zebrafish metabolism model, an osteoporosis model and a toxicity evaluation method. This method can break through the bottleneck that trace constituents cannot be evaluated efficiently and comprehensively in vivo, and realize efficient and comprehensive screening of anti-osteoporosis traditional medicines based on in vivo processes, taking both safety and effectiveness into account, which is significant for accelerating the discovery of effective and safe innovative traditional Chinese medicines for osteoporosis. Copyright© by the Chinese Pharmaceutical Association.
Reliable quantum communication over a quantum relay channel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gyongyosi, Laszlo, E-mail: gyongyosi@hit.bme.hu; Imre, Sandor
2014-12-04
We show that reliable quantum communication over an unreliable quantum relay channel is possible. The coding scheme combines results on the superadditivity of quantum channels and efficient quantum coding approaches.
Energy efficient rateless codes for high speed data transfer over free space optical channels
NASA Astrophysics Data System (ADS)
Prakash, Geetha; Kulkarni, Muralidhar; Acharya, U. S.
2015-03-01
Terrestrial Free Space Optical (FSO) links transmit information by using the atmosphere (free space) as a medium. In this paper, we have investigated the use of Luby Transform (LT) codes as a means to mitigate the effects of data corruption induced by an imperfect channel, which usually takes the form of lost or corrupted packets. LT codes, which are a class of Fountain codes, can be used independently of the channel rate, and as many code words as required can be generated to recover all the message bits irrespective of the channel performance. Achieving error-free high data rates with limited energy resources is possible with FSO systems if error correction codes with minimal power overhead can be used. We also employ a combination of Binary Phase Shift Keying (BPSK), with provision for threshold modification, and optimized LT codes with belief propagation decoding. These techniques provide additional protection even under strong turbulence regimes. Automatic Repeat Request (ARQ) is another method of improving link reliability. Performance of ARQ is limited by the number of retransmissions and the corresponding time delay. We show through theoretical computations and simulations that LT codes consume less energy per bit. We validate the feasibility of using energy-efficient LT codes over ARQ for FSO links to be used in optical wireless sensor networks within eye safety limits.
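As a rough illustration of the fountain-coding principle, not the optimized degree distributions of the paper, the sketch below encodes data blocks as XORs of random subsets and recovers them with a peeling decoder; the degree weights and block count are arbitrary.

```python
# Toy Luby Transform (fountain) code over an erasure channel, for illustration only.
import random
from functools import reduce

def lt_encode(blocks, n_coded, rng, degree_weights=(0.2, 0.4, 0.2, 0.2)):
    """Each coded symbol is the XOR of a random subset of source blocks."""
    k = len(blocks)
    coded = []
    for _ in range(n_coded):
        d = rng.choices(range(1, len(degree_weights) + 1), weights=degree_weights)[0]
        idx = rng.sample(range(k), min(d, k))
        val = reduce(lambda a, i: a ^ blocks[i], idx, 0)
        coded.append([set(idx), val])
    return coded

def lt_decode(coded, k):
    """Peeling decoder: repeatedly release degree-1 coded symbols."""
    decoded = [None] * k
    progress = True
    while progress and any(v is None for v in decoded):
        progress = False
        for idx, val in coded:
            if len(idx) == 1:
                i = idx.pop()
                if decoded[i] is None:
                    decoded[i] = val
                    progress = True
        for sym in coded:                          # substitute recovered blocks
            for i in list(sym[0]):
                if decoded[i] is not None:
                    sym[0].discard(i)
                    sym[1] ^= decoded[i]
    return decoded

rng = random.Random(1)
blocks = [rng.randrange(256) for _ in range(8)]
coded = lt_encode(blocks, n_coded=30, rng=rng)
rng.shuffle(coded)                                 # arrival order (and loss) does not matter
# Usually recovers all blocks; peeling can fail if too few symbols are received.
print(lt_decode(coded, k=8) == blocks)
```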
Efficient Modeling of Laser-Plasma Accelerators with INF&RNO
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benedetti, C.; Schroeder, C. B.; Esarey, E.
2010-06-01
The numerical modeling code INF&RNO (INtegrated Fluid & paRticle simulatioN cOde, pronounced "inferno") is presented. INF&RNO is an efficient 2D cylindrical code to model the interaction of a short laser pulse with an underdense plasma. The code is based on an envelope model for the laser, while either a PIC or a fluid description can be used for the plasma. The effect of the laser pulse on the plasma is modeled with the time-averaged ponderomotive force. These and other features allow for a speedup of 2-4 orders of magnitude compared to standard full PIC simulations while still retaining physical fidelity. The code has been benchmarked against analytical solutions and 3D PIC simulations, and here a set of validation tests together with a discussion of the performance are presented.
Viterbi decoding for satellite and space communication.
NASA Technical Reports Server (NTRS)
Heller, J. A.; Jacobs, I. M.
1971-01-01
Convolutional coding and Viterbi decoding, along with binary phase-shift keyed modulation, are presented as an efficient system for reliable communication on power-limited satellite and space channels. Performance results, obtained theoretically and through computer simulation, are given for optimum short constraint length codes for a range of code constraint lengths and code rates. System efficiency is compared for hard receiver quantization and 4- and 8-level soft quantization. The effects on performance of varying certain parameters relevant to decoder complexity and cost are examined. Quantitative performance degradation due to imperfect carrier phase coherence is evaluated and compared to that of an uncoded system. As an example of decoder performance versus complexity, a recently implemented 2-Mbit/sec constraint length 7 Viterbi decoder is discussed. Finally, a comparison is made between Viterbi and sequential decoding in terms of suitability to various system requirements.
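For readers unfamiliar with the algorithm, the following is a toy hard-decision Viterbi decoder for a rate-1/2, constraint-length-3 convolutional code (generators 7, 5 octal); the flight decoders discussed above use longer constraint lengths and soft quantization.

```python
# Toy rate-1/2, K=3 convolutional encoder and hard-decision Viterbi decoder.
G = [0b111, 0b101]                     # generator polynomials (7, 5 octal)
K = 3                                  # constraint length
N_STATES = 1 << (K - 1)

def parity(x):
    return bin(x).count("1") & 1

def encode(bits):
    state, out = 0, []
    for b in bits + [0] * (K - 1):     # flush with zeros to return to state 0
        reg = (b << (K - 1)) | state
        out += [parity(reg & g) for g in G]
        state = reg >> 1
    return out

def viterbi_decode(received, n_info):
    INF = 10 ** 9
    metric = [0] + [INF] * (N_STATES - 1)      # encoder starts in state 0
    paths = [[] for _ in range(N_STATES)]
    for t in range(0, len(received), 2):
        r = received[t:t + 2]
        new_metric = [INF] * N_STATES
        new_paths = [None] * N_STATES
        for state in range(N_STATES):
            if metric[state] == INF:
                continue
            for b in (0, 1):
                reg = (b << (K - 1)) | state
                expect = [parity(reg & g) for g in G]
                nxt = reg >> 1
                m = metric[state] + sum(x != y for x, y in zip(r, expect))
                if m < new_metric[nxt]:          # keep the survivor path
                    new_metric[nxt] = m
                    new_paths[nxt] = paths[state] + [b]
        metric, paths = new_metric, new_paths
    best = min(range(N_STATES), key=lambda s: metric[s])
    return paths[best][:n_info]

info = [1, 0, 1, 1, 0, 0, 1]
coded = encode(info)
coded[3] ^= 1                                    # introduce one channel error
print(viterbi_decode(coded, len(info)) == info)  # True: the error is corrected
```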
Interactive Exploration for Continuously Expanding Neuron Databases.
Li, Zhongyu; Metaxas, Dimitris N; Lu, Aidong; Zhang, Shaoting
2017-02-15
This paper proposes a novel framework to help biologists explore and analyze neurons based on retrieval of data from neuron morphological databases. In recent years, the continuously expanding neuron databases provide a rich source of information to associate neuronal morphologies with their functional properties. We design a coarse-to-fine framework for efficient and effective data retrieval from large-scale neuron databases. In the coarse-level, for efficiency in large-scale, we employ a binary coding method to compress morphological features into binary codes of tens of bits. Short binary codes allow for real-time similarity searching in Hamming space. Because the neuron databases are continuously expanding, it is inefficient to re-train the binary coding model from scratch when adding new neurons. To solve this problem, we extend binary coding with online updating schemes, which only considers the newly added neurons and update the model on-the-fly, without accessing the whole neuron databases. In the fine-grained level, we introduce domain experts/users in the framework, which can give relevance feedback for the binary coding based retrieval results. This interactive strategy can improve the retrieval performance through re-ranking the above coarse results, where we design a new similarity measure and take the feedback into account. Our framework is validated on more than 17,000 neuron cells, showing promising retrieval accuracy and efficiency. Moreover, we demonstrate its use case in assisting biologists to identify and explore unknown neurons. Copyright © 2017 Elsevier Inc. All rights reserved.
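The retrieval mechanics can be sketched with generic random-hyperplane binary codes and Hamming-distance ranking; the paper's learned, online-updatable coding model is replaced here by simple locality-sensitive hashing, and all sizes are illustrative.

```python
# Hedged sketch: random-hyperplane binary codes and Hamming-distance retrieval.
import numpy as np

rng = np.random.default_rng(0)
n_items, n_features, n_bits = 1000, 64, 32

features = rng.normal(size=(n_items, n_features))   # stand-in morphological features
hyperplanes = rng.normal(size=(n_features, n_bits))

def binarize(x):
    """One boolean code of n_bits per row (sign of random projections)."""
    return x @ hyperplanes > 0

codes = binarize(features)

def query(q, top_k=5):
    qc = binarize(q[None, :])[0]
    hamming = (codes != qc).sum(axis=1)              # XOR popcount in Hamming space
    return np.argsort(hamming)[:top_k]

print(query(features[42]))                           # item 42 should rank first (distance 0)
```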
Weighted bi-prediction for light field image coding
NASA Astrophysics Data System (ADS)
Conti, Caroline; Nunes, Paulo; Ducla Soares, Luís.
2017-09-01
Light field imaging based on a single-tier camera equipped with a microlens array - also known as integral, holoscopic, and plenoptic imaging - has recently emerged as a practical and promising approach for future visual applications and services. However, successfully deploying actual light field imaging applications and services will require developing adequate coding solutions to efficiently handle the massive amount of data involved in these systems. In this context, self-similarity compensated prediction is a non-local spatial prediction scheme based on block matching that has been shown to achieve high efficiency for light field image coding based on the High Efficiency Video Coding (HEVC) standard. As previously shown by the authors, this is possible by simply averaging two predictor blocks that are jointly estimated from a causal search window in the current frame itself, referred to as self-similarity bi-prediction. However, theoretical analyses of motion compensated bi-prediction have suggested that it is still possible to achieve further rate-distortion performance improvements by adaptively estimating the weighting coefficients of the two predictor blocks. Therefore, this paper presents a comprehensive study of the rate-distortion performance of HEVC-based light field image coding when using different sets of weighting coefficients for self-similarity bi-prediction. Experimental results demonstrate that it is possible to extend the previous theoretical conclusions to light field image coding and show that the proposed adaptive weighting coefficient selection leads to up to 5% bit savings compared to the previous self-similarity bi-prediction scheme.
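The core idea of adaptive weighting can be sketched as a search over a small candidate weight set that minimizes the prediction error between the weighted combination of two predictor blocks and the original block; the candidate weights below are illustrative, not the paper's.

```python
# Hedged sketch of adaptive weighted bi-prediction over a small candidate weight set.
import numpy as np

def best_weighted_prediction(original, pred0, pred1,
                             candidate_weights=(0.25, 0.375, 0.5, 0.625, 0.75)):
    """Return the weight w minimizing the SSD of w*pred0 + (1-w)*pred1."""
    best_w, best_ssd, best_pred = None, np.inf, None
    for w in candidate_weights:
        pred = w * pred0 + (1.0 - w) * pred1
        ssd = np.sum((original - pred) ** 2)
        if ssd < best_ssd:
            best_w, best_ssd, best_pred = w, ssd, pred
    return best_w, best_pred

rng = np.random.default_rng(3)
block = rng.random((8, 8))
p0 = block + rng.normal(scale=0.05, size=block.shape)   # two noisy predictor blocks
p1 = block + rng.normal(scale=0.15, size=block.shape)
w, _ = best_weighted_prediction(block, p0, p1)
print("selected weight for the better predictor:", w)
```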
Correlated activity supports efficient cortical processing
Hung, Chou P.; Cui, Ding; Chen, Yueh-peng; Lin, Chia-pei; Levine, Matthew R.
2015-01-01
Visual recognition is a computational challenge that is thought to occur via efficient coding. An important concept is sparseness, a measure of coding efficiency. The prevailing view is that sparseness supports efficiency by minimizing redundancy and correlations in spiking populations. Yet, we recently reported that “choristers”, neurons that behave more similarly (have correlated stimulus preferences and spontaneous coincident spiking), carry more generalizable object information than uncorrelated neurons (“soloists”) in macaque inferior temporal (IT) cortex. The rarity of choristers (as low as 6% of IT neurons) indicates that they were likely missed in previous studies. Here, we report that correlation strength is distinct from sparseness (choristers are not simply broadly tuned neurons), that choristers are located in non-granular output layers, and that correlated activity predicts human visual search efficiency. These counterintuitive results suggest that a redundant correlational structure supports efficient processing and behavior. PMID:25610392
NASA Astrophysics Data System (ADS)
Corona, Enrique; Nutter, Brian; Mitra, Sunanda; Guo, Jiangling; Karp, Tanja
2008-03-01
Efficient retrieval of high quality Regions-Of-Interest (ROI) from high resolution medical images is essential for reliable interpretation and accurate diagnosis. Random access to high quality ROI from codestreams is becoming an essential feature in many still image compression applications, particularly in viewing diseased areas from large medical images. This feature is easier to implement in block based codecs because of the inherent spatial independence of the code blocks. This independence implies that the decoding order of the blocks is unimportant as long as the position of each is properly identified. In contrast, wavelet-tree based codecs naturally use some interdependency that exploits the decaying spectrum model of the wavelet coefficients. Thus one must keep track of the decoding order from level to level with such codecs. We have developed an innovative multi-rate image subband coding scheme using "Backward Coding of Wavelet Trees (BCWT)" which is fast, memory efficient, and resolution scalable. It offers far less complexity than many other existing codecs, including both wavelet-tree and block based algorithms. The ROI feature in BCWT is implemented through a transcoder stage that generates a new BCWT codestream containing only the information associated with the user-defined ROI. This paper presents an efficient technique that locates a particular ROI within the BCWT coded domain and decodes it back to the spatial domain. This technique allows better access and proper identification of pathologies in high resolution images, since only a small fraction of the codestream is required to be transmitted and analyzed.
Fast Sparse Coding for Range Data Denoising with Sparse Ridges Constraint
Lao, Mingjie; Sang, Yongsheng; Wen, Fei; Zhai, Ruifang
2018-01-01
Light detection and ranging (LiDAR) sensors have been widely deployed on intelligent systems such as unmanned ground vehicles (UGVs) and unmanned aerial vehicles (UAVs) to perform localization, obstacle detection, and navigation tasks. Thus, research into range data processing with competitive performance in terms of both accuracy and efficiency has attracted increasing attention. Sparse coding has revolutionized signal processing and led to state-of-the-art performance in a variety of applications. However, dictionary learning, which plays the central role in sparse coding techniques, is computationally demanding, resulting in its limited applicability in real-time systems. In this study, we propose sparse coding algorithms with a fixed pre-learned ridge dictionary to realize range data denoising via leveraging the regularity of laser range measurements in man-made environments. Experiments on both synthesized data and real data demonstrate that our method obtains accuracy comparable to that of sophisticated sparse coding methods, but with much higher computational efficiency. PMID:29734793
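As a rough sketch of sparse coding against a fixed, pre-defined dictionary (here a generic overcomplete DCT dictionary rather than the paper's ridge dictionary), the following denoises a 1-D profile with ISTA; the signal, dictionary size, and regularization weight are illustrative.

```python
# Hedged sketch: sparse coding of a noisy 1-D profile with a fixed dictionary (ISTA).
import numpy as np

def dct_dictionary(n, n_atoms):
    k = np.arange(n)[:, None] * np.arange(n_atoms)[None, :]
    D = np.cos(np.pi * k / n_atoms)
    return D / np.linalg.norm(D, axis=0)             # unit-norm atoms

def ista(D, y, lam=0.1, n_iter=200):
    """Iterative shrinkage-thresholding for min ||y - D x||^2 + lam * ||x||_1."""
    L = np.linalg.norm(D, 2) ** 2                     # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ x - y)
        x = x - grad / L
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)   # soft threshold
    return x

n = 128
t = np.linspace(0, 1, n)
clean = np.piecewise(t, [t < 0.5, t >= 0.5], [lambda s: 2 * s, lambda s: 1.5 - s])
noisy = clean + np.random.default_rng(0).normal(scale=0.05, size=n)
D = dct_dictionary(n, 256)
denoised = D @ ista(D, noisy)
print("residual RMS:", np.sqrt(np.mean((denoised - clean) ** 2)))
```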
NASA Astrophysics Data System (ADS)
Liu, Deyang; An, Ping; Ma, Ran; Yang, Chao; Shen, Liquan; Li, Kai
2016-07-01
Three-dimensional (3-D) holoscopic imaging, also known as integral imaging, light field imaging, or plenoptic imaging, can provide natural and fatigue-free 3-D visualization. However, a large amount of data is required to represent the 3-D holoscopic content. Therefore, efficient coding schemes for this particular type of image are needed. A 3-D holoscopic image coding scheme with kernel-based minimum mean square error (MMSE) estimation is proposed. In the proposed scheme, the coding block is predicted by an MMSE estimator under statistical modeling. In order to obtain the signal's statistical behavior, kernel density estimation (KDE) is utilized to estimate the probability density function of the statistical model. As bandwidth estimation (BE) is a key issue in the KDE problem, we also propose a BE method based on the kernel trick. The experimental results demonstrate that the proposed scheme can achieve better rate-distortion performance and better visual rendering quality.
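Once the joint density is modeled with a kernel density estimate, the MMSE predictor E[x|y] reduces to a Nadaraya-Watson estimator. The sketch below uses a simple rule-of-thumb bandwidth rather than the kernel-trick-based bandwidth estimation proposed in the paper, and the data are synthetic.

```python
# Hedged sketch: kernel (Nadaraya-Watson) estimate of E[x | y], the MMSE predictor
# once the joint density is modeled with a Gaussian KDE.
import numpy as np

def mmse_predict(y_query, y_train, x_train, bandwidth=None):
    if bandwidth is None:
        bandwidth = 1.06 * np.std(y_train) * len(y_train) ** (-1 / 5)   # Silverman rule
    w = np.exp(-0.5 * ((y_query[:, None] - y_train[None, :]) / bandwidth) ** 2)
    return (w @ x_train) / w.sum(axis=1)

rng = np.random.default_rng(1)
y_train = rng.uniform(-3, 3, 500)                       # observed context values
x_train = np.sin(y_train) + rng.normal(0, 0.1, 500)     # values to be predicted
y_query = np.linspace(-3, 3, 7)
print(np.round(mmse_predict(y_query, y_train, x_train), 2))   # tracks sin(y)
```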
ANNarchy: a code generation approach to neural simulations on parallel hardware
Vitay, Julien; Dinkelbach, Helge Ü.; Hamker, Fred H.
2015-01-01
Many modern neural simulators focus on the simulation of networks of spiking neurons on parallel hardware. Another important framework in computational neuroscience, rate-coded neural networks, is mostly difficult or impossible to implement using these simulators. We present here the ANNarchy (Artificial Neural Networks architect) neural simulator, which allows users to easily define and simulate rate-coded and spiking networks, as well as combinations of both. The interface in Python has been designed to be close to the PyNN interface, while the definition of neuron and synapse models can be specified using an equation-oriented mathematical description similar to the Brian neural simulator. This information is used to generate C++ code that will efficiently perform the simulation on the chosen parallel hardware (multi-core system or graphical processing unit). Several numerical methods are available to transform ordinary differential equations into efficient C++ code. We compare the parallel performance of the simulator to existing solutions. PMID:26283957
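A rate-coded network of the kind such simulators generate code for can be sketched in a few lines of plain NumPy (this is not the ANNarchy API): tau dr/dt = -r + f(W r + I), integrated with the Euler method; all sizes and weights below are arbitrary.

```python
# Plain-NumPy sketch of a rate-coded network, not the ANNarchy interface.
import numpy as np

rng = np.random.default_rng(0)
n, tau, dt = 100, 10.0, 0.1
W = rng.normal(0, 0.5 / np.sqrt(n), (n, n))      # weak random recurrent weights
I = rng.uniform(0, 1, n)                         # constant external input
r = np.zeros(n)

def f(x):
    return np.maximum(x, 0.0)                    # rectified transfer function

for _ in range(2000):                            # Euler integration of tau dr/dt = -r + f(W r + I)
    r += dt / tau * (-r + f(W @ r + I))

print("mean firing rate:", r.mean())
```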
Satake, Honoo; Ono, Eiichiro; Murata, Jun
2013-12-04
Plant physiological, epidemiological, and food science studies have shed light on lignans as healthy diets for the reduction of the risk of lifestyle-related noncommunicable diseases and, thus, the demand for lignans has been rapidly increasing. However, the low efficiency and instability of lignan production via extraction from plant resources remain to be resolved, indicating the requirement for the development of new procedures for lignan production. The metabolic engineering of lignan-biosynthesizing plants is expected to be most promising for efficient, sustainable, and stable lignan production. This is supported by the recent verification of biosynthetic pathways of major dietary lignans and the exploration of lignan production via metabolic engineering using transiently gene-transfected or transgenic plants. The aim of this review is to present an overview of the biosynthetic pathways, biological activities, and metabolic engineering of lignans and also perspectives in metabolic engineering-based lignan production using transgenic plants for practical application.
Fundamental differences between optimization code test problems in engineering applications
NASA Technical Reports Server (NTRS)
Eason, E. D.
1984-01-01
The purpose here is to suggest that there is at least one fundamental difference between the problems used for testing optimization codes and the problems that engineers often need to solve; in particular, the level of precision that can be practically achieved in the numerical evaluation of the objective function, derivatives, and constraints. This difference affects the performance of optimization codes, as illustrated by two examples. Two classes of optimization problem were defined. Class One functions and constraints can be evaluated to a high precision that depends primarily on the word length of the computer. Class Two functions and/or constraints can only be evaluated to a moderate or a low level of precision for economic or modeling reasons, regardless of the computer word length. Optimization codes have not been adequately tested on Class Two problems. There are very few Class Two test problems in the literature, while there are literally hundreds of Class One test problems. The relative performance of two codes may be markedly different for Class One and Class Two problems. Less sophisticated direct search type codes may be less likely to be confused or to waste many function evaluations on Class Two problems. The analysis accuracy and minimization performance are related in a complex way that probably varies from code to code. On a problem where the analysis precision was varied over a range, the simple Hooke and Jeeves code was more efficient at low precision while the Powell code was more efficient at high precision.
Tran, Lee; Hanavan, Paul D.; Campbell, Latoya E.; De Filippis, Elena; Lake, Douglas F.; Coletta, Dawn K.; Roust, Lori R.; Mandarino, Lawrence J.; Carroll, Chad C.; Katsanos, Christos S.
2016-01-01
Our previous studies show reduced abundance of the β-subunit of mitochondrial H+-ATP synthase (β-F1-ATPase) in skeletal muscle of obese individuals. The β-F1-ATPase forms the catalytic core of the ATP synthase, and it is critical for ATP production in muscle. The mechanism(s) impairing β-F1-ATPase metabolism in obesity, however, are not completely understood. First, we studied total muscle protein synthesis and the translation efficiency of β-F1-ATPase in obese (BMI, 36±1 kg/m2) and lean (BMI, 22±1 kg/m2) subjects. Both total protein synthesis (0.044±0.006 vs 0.066±0.006%·h-1) and translation efficiency of β-F1-ATPase (0.0031±0.0007 vs 0.0073±0.0004) were lower in muscle from the obese subjects when compared to the lean controls (P<0.05). We then evaluated these same responses in a primary cell culture model, and tested the specific hypothesis that circulating non-esterified fatty acids (NEFA) in obesity play a role in the responses observed in humans. The findings on total protein synthesis and translation efficiency of β-F1-ATPase in primary myotubes cultured from a lean subject, and after exposure to NEFA extracted from serum of an obese subject, were similar to those obtained in humans. Among candidate microRNAs (i.e., non-coding RNAs regulating gene expression), we identified miR-127-5p in preventing the production of β-F1-ATPase. Muscle expression of miR-127-5p negatively correlated with β-F1-ATPase protein translation efficiency in humans (r = – 0.6744; P<0.01), and could be modeled in vitro by prolonged exposure of primary myotubes derived from the lean subject to NEFA extracted from the obese subject. On the other hand, locked nucleic acid inhibitor synthesized to target miR-127-5p significantly increased β-F1-ATPase translation efficiency in myotubes (0.6±0.1 vs 1.3±0.3, in control vs exposure to 50 nM inhibitor; P<0.05). Our experiments implicate circulating NEFA in obesity in suppressing muscle protein metabolism, and establish impaired β-F1-ATPase translation as an important consequence of obesity. PMID:27532680
Multi-stage decoding of multi-level modulation codes
NASA Technical Reports Server (NTRS)
Lin, Shu; Kasami, Tadao; Costello, Daniel J., Jr.
1991-01-01
Various types of multi-stage decoding for multi-level modulation codes are investigated. It is shown that if the component codes of a multi-level modulation code and the types of decoding at various stages are chosen properly, high spectral efficiency and large coding gain can be achieved with reduced decoding complexity. In particular, it is shown that the difference in performance between the suboptimum multi-stage soft-decision maximum likelihood decoding of a modulation code and the single-stage optimum soft-decision decoding of the code is very small, only a fraction of a dB loss in signal-to-noise ratio at a bit error rate (BER) of 10^-6.
Country Report on Building Energy Codes in Canada
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shui, Bin; Evans, Meredydd
2009-04-06
This report is part of a series of reports on building energy efficiency codes in countries associated with the Asia-Pacific Partnership (APP) - Australia, South Korea, Japan, China, India, and the United States of America. This report gives an overview of the development of building energy codes in Canada, including national energy policies related to building energy codes, the history of building energy codes, and recent national projects and activities to promote building energy codes. The report also provides a review of current building energy code provisions (such as building envelope, HVAC, lighting, and water heating) for commercial and residential buildings in Canada.
Separable concatenated codes with iterative map decoding for Rician fading channels
NASA Technical Reports Server (NTRS)
Lodge, J. H.; Young, R. J.
1993-01-01
Very efficient signalling in radio channels requires the design of very powerful codes having special structure suitable for practical decoding schemes. In this paper, powerful codes are obtained by combining comparatively simple convolutional codes to form multi-tiered 'separable' convolutional codes. The decoding of these codes, using separable symbol-by-symbol maximum a posteriori (MAP) 'filters', is described. It is known that this approach yields impressive results in non-fading additive white Gaussian noise channels. Interleaving is an inherent part of the code construction, and consequently, these codes are well suited for fading channel communications. Here, simulation results for communications over Rician fading channels are presented to support this claim.
Preliminary Results from the Application of Automated Adjoint Code Generation to CFL3D
NASA Technical Reports Server (NTRS)
Carle, Alan; Fagan, Mike; Green, Lawrence L.
1998-01-01
This report describes preliminary results obtained using an automated adjoint code generator for Fortran to augment a widely-used computational fluid dynamics flow solver to compute derivatives. These preliminary results with this augmented code suggest that, even in its infancy, the automated adjoint code generator can accurately and efficiently deliver derivatives for use in transonic Euler-based aerodynamic shape optimization problems with hundreds to thousands of independent design variables.
Pastor, D; Amaya, W; García-Olcina, R; Sales, S
2007-07-01
We present a simple theoretical model, and its experimental verification, of the vanishing of the autocorrelation peak due to wavelength detuning in the coding-decoding process of coherent direct sequence optical code multiple access systems based on a superstructured fiber Bragg grating. Moreover, this detuning effect has been exploited to provide an additional degree of multiplexing and/or optical code tuning.
The Plasma Simulation Code: A modern particle-in-cell code with patch-based load-balancing
NASA Astrophysics Data System (ADS)
Germaschewski, Kai; Fox, William; Abbott, Stephen; Ahmadi, Narges; Maynard, Kristofor; Wang, Liang; Ruhl, Hartmut; Bhattacharjee, Amitava
2016-08-01
This work describes the Plasma Simulation Code (PSC), an explicit, electromagnetic particle-in-cell code with support for different order particle shape functions. We review the basic components of the particle-in-cell method as well as the computational architecture of the PSC code that allows support for modular algorithms and data structure in the code. We then describe and analyze in detail a distinguishing feature of PSC: patch-based load balancing using space-filling curves which is shown to lead to major efficiency gains over unbalanced methods and a previously used simpler balancing method.
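The balancing idea can be sketched by ordering patches along a Morton (Z-order) curve and cutting the curve into pieces of roughly equal work; PSC's actual curves and partitioning details differ, and the patch loads below are synthetic.

```python
# Hedged sketch: space-filling-curve (Morton order) load balancing of patches.
import numpy as np

def morton_key(ix, iy, bits=10):
    """Interleave the bits of a 2-D patch index."""
    key = 0
    for b in range(bits):
        key |= ((ix >> b) & 1) << (2 * b) | ((iy >> b) & 1) << (2 * b + 1)
    return key

def balance(patch_coords, patch_loads, n_ranks):
    order = sorted(range(len(patch_coords)),
                   key=lambda i: morton_key(*patch_coords[i]))
    target = sum(patch_loads) / n_ranks
    assignment, rank, acc = {}, 0, 0.0
    for i in order:                               # cut the curve into n_ranks pieces
        if acc >= target and rank < n_ranks - 1:
            rank, acc = rank + 1, 0.0
        assignment[patch_coords[i]] = rank
        acc += patch_loads[i]
    return assignment

rng = np.random.default_rng(0)
coords = [(x, y) for x in range(8) for y in range(8)]
loads = rng.integers(1, 10, len(coords))          # e.g. particles per patch
assign = balance(coords, loads, n_ranks=4)
for r in range(4):
    print(f"rank {r}: load {sum(l for c, l in zip(coords, loads) if assign[c] == r)}")
```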
Low Density Parity Check Codes: Bandwidth Efficient Channel Coding
NASA Technical Reports Server (NTRS)
Fong, Wai; Lin, Shu; Maki, Gary; Yeh, Pen-Shu
2003-01-01
Low Density Parity Check (LDPC) codes provide near-Shannon-capacity performance for NASA missions. These codes have high coding rates, R = 0.82 and 0.875, with moderate code lengths, n = 4096 and 8176. Their decoders have inherently parallel structures, which allows for high-speed implementation. Two codes based on Euclidean Geometry (EG) were selected for flight ASIC implementation. These codes are cyclic and quasi-cyclic in nature and therefore have a simple encoder structure, which yields power and size benefits. These codes also have a large minimum distance, as large as d_min = 65, giving them powerful error-correcting capabilities and very low error floors. This paper presents the development of the LDPC flight encoder and decoder, their applications, and status.
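As a toy illustration of iterative parity-check decoding (not the flight EG-LDPC codes or their soft decoders), the sketch below runs hard-decision bit flipping on a small, made-up parity-check matrix.

```python
# Toy hard-decision bit-flipping decoder for a small parity-check matrix.
import numpy as np

H = np.array([[1, 1, 0, 1, 0, 0],                 # illustrative parity-check matrix
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

def bit_flip_decode(r, max_iter=20):
    r = r.copy()
    for _ in range(max_iter):
        syndrome = H @ r % 2
        if not syndrome.any():
            return r                               # all parity checks satisfied
        votes = H.T @ syndrome                     # unsatisfied checks per bit
        r[np.argmax(votes)] ^= 1                   # flip the most-suspected bit
    return r

codeword = np.zeros(6, dtype=int)                  # the all-zero word is always valid
received = codeword.copy()
received[2] ^= 1                                   # one channel error
print(bit_flip_decode(received))                   # recovers the all-zero codeword
```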
Liu, Yanfeng; Shin, Hyun-dong; Li, Jianghua; Liu, Long
2015-02-01
Metabolic engineering facilitates the rational development of recombinant bacterial strains for metabolite overproduction. Building on enormous advances in system biology and synthetic biology, novel strategies have been established for multivariate optimization of metabolic networks in ensemble, spatial, and dynamic manners such as modular pathway engineering, compartmentalization metabolic engineering, and metabolic engineering guided by genome-scale metabolic models, in vitro reconstitution, and systems and synthetic biology. Herein, we summarize recent advances in novel metabolic engineering strategies. Combined with advancing kinetic models and synthetic biology tools, more efficient new strategies for improving cellular properties can be established and applied for industrially important biochemical production.
Project: semi-autonomous parking for enhanced safety and efficiency.
DOT National Transportation Integrated Search
2016-04-01
Index coding, a coding formulation traditionally analyzed in the theoretical computer science and information theory communities, has received considerable attention in recent years due to its value in wireless communications and networking probl...
Towards systems metabolic engineering of microorganisms for amino acid production.
Park, Jin Hwan; Lee, Sang Yup
2008-10-01
Microorganisms capable of efficient production of amino acids have traditionally been developed by random mutation and selection method, which might cause unwanted physiological changes in cellular metabolism. Rational genome-wide metabolic engineering based on systems and synthetic biology tools, which is termed 'systems metabolic engineering', is rising as an alternative to overcome these problems. Recently, several amino acid producers have been successfully developed by systems metabolic engineering, where the metabolic engineering procedures were performed within a systems biology framework, and entire metabolic networks, including complex regulatory circuits, were engineered in an integrated manner. Here we review the current status of systems metabolic engineering successfully applied for developing amino acid producing strains and discuss future prospects.
Saha, Jayita; Giri, Kalyan
2017-04-20
Compelling evidence supports the involvement of exogenous and endogenous polyamines (PAs) in conferring salt tolerance in plants. Intracellular PA anabolism and catabolism contribute to maintaining endogenous PA homeostasis and to inducing stress signaling networks. In this report, an evolutionary study was conducted to reveal the phylogenetic relationships of the genes encoding enzymes of the PA anabolic and catabolic pathways among five plant lineages, including green algae, moss, lycophyte, dicot and monocot, along with their respective exon-intron structural patterns. Our results indicate that natural selection pressure had a considerable influence on the ancestral PA metabolic pathway genes of land plants. PA metabolic genes have undergone gradual evolution by duplication and diversification, with subsequent structural modification through exon-intron gain and loss events, to acquire specific functions under environmental stress conditions. We shed light on the potential regulation of both pathways by investigating real-time expression of PA metabolic pathway enzyme-coding genes at the transcriptional level in root and shoot tissues of two indica rice varieties, IR 36 (salt-sensitive) and Nonabokra (salt-tolerant), in response to salinity in the presence or absence of exogenous spermidine (Spd) treatment. Additionally, we performed tissue-specific quantification of intracellular PAs and drew probable connections between PA metabolic pathway activation and endogenous PA accumulation. Our results clarify how exogenous Spd, in the presence or absence of salt stress, adjusts the intracellular PA pathways to balance cellular PAs, which may contribute to plant salt tolerance. Copyright © 2017 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Earl, Christopher; Might, Matthew; Bagusetty, Abhishek
2016-01-26
This study presents Nebo, a declarative domain-specific language embedded in C++ for discretizing partial differential equations for transport phenomena on multiple architectures. Application programmers use Nebo to write code that appears sequential but can be run in parallel, without editing the code. Currently Nebo supports single-thread execution, multi-thread execution, and many-core (GPU-based) execution. With single-thread execution, Nebo performs on par with code written by domain experts. With multi-thread execution, Nebo can linearly scale (with roughly 90% efficiency) up to 12 cores, compared to its single-thread execution. Moreover, Nebo’s many-core execution can be over 140x faster than its single-thread execution.
A Comparison of LBG and ADPCM Speech Compression Techniques
NASA Astrophysics Data System (ADS)
Bachu, Rajesh G.; Patel, Jignasa; Barkana, Buket D.
Speech compression is the technology of converting human speech into an efficiently encoded representation that can later be decoded to produce a close approximation of the original signal. In all speech there is a degree of predictability, and speech coding techniques exploit this to reduce bit rates while still maintaining a suitable level of quality. This paper is a study and implementation of the Linde-Buzo-Gray (LBG) and Adaptive Differential Pulse Code Modulation (ADPCM) algorithms to compress speech signals. Here we implemented the methods using MATLAB 7.0. The methods used in this study gave good compression performance, and listening tests showed that efficient, high-quality coding was achieved.
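A minimal sketch of the LBG splitting-plus-Lloyd procedure for training a vector quantization codebook is shown below; the frame length, codebook size, perturbation, and training signal are illustrative stand-ins for real speech data.

```python
# Hedged sketch of LBG codebook training for vector quantization of speech frames.
import numpy as np

def lbg(train_vectors, codebook_size, eps=0.01, n_iter=20):
    codebook = train_vectors.mean(axis=0, keepdims=True)    # start with the global centroid
    while len(codebook) < codebook_size:
        codebook = np.vstack([codebook * (1 + eps), codebook * (1 - eps)])  # split step
        for _ in range(n_iter):                              # Lloyd iterations
            d = ((train_vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
            nearest = d.argmin(axis=1)
            for j in range(len(codebook)):
                members = train_vectors[nearest == j]
                if len(members):
                    codebook[j] = members.mean(axis=0)
    return codebook

rng = np.random.default_rng(0)
speech = rng.normal(size=16000)                              # stand-in for a speech signal
frames = speech[: (len(speech) // 8) * 8].reshape(-1, 8)     # 8-sample vectors
cb = lbg(frames, codebook_size=16)
print("bits per sample:", np.log2(len(cb)) / frames.shape[1])
```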
Approximate maximum likelihood decoding of block codes
NASA Technical Reports Server (NTRS)
Greenberger, H. J.
1979-01-01
Approximate maximum likelihood decoding algorithms, based upon selecting a small set of candidate code words with the aid of the estimated probability of error of each received symbol, can give performance close to optimum with a reasonable amount of computation. By combining the best features of various algorithms and taking care to perform each step as efficiently as possible, a decoding scheme was developed which can decode codes which have better performance than those presently in use and yet not require an unreasonable amount of computation. The discussion of the details and tradeoffs of presently known efficient optimum and near optimum decoding algorithms leads, naturally, to the one which embodies the best features of all of them.
NASA Astrophysics Data System (ADS)
Liu, Mei-Feng; Zhong, Guo-Yun; He, Xiao-Hai; Qing, Lin-Bo
2016-09-01
Currently, most video resources online are encoded in the H.264/AVC format. Smoother video transmission can be obtained if these resources are encoded in the newest international video coding standard: high efficiency video coding (HEVC). In order to improve video transmission and storage online, a transcoding method from H.264/AVC to HEVC is proposed. In this transcoding algorithm, the coding information of intraprediction, interprediction, and motion vectors (MVs) in the H.264/AVC video stream is used to accelerate the coding in HEVC. It is found through experiments that the region of interprediction in HEVC overlaps that in H.264/AVC. Therefore, the intraprediction for the region in HEVC which is interpredicted in H.264/AVC can be skipped to reduce coding complexity. Several macroblocks in H.264/AVC are combined into one PU in HEVC when the MV difference between two of the macroblocks in H.264/AVC is lower than a threshold. This method selects only one coding unit depth and one prediction unit (PU) mode to reduce the coding complexity. An MV interpolation method for the combined PU in HEVC is proposed according to the areas and distances between the center of one macroblock in H.264/AVC and that of the PU in HEVC. The predicted MV accelerates the motion estimation for HEVC coding. The simulation results show that our proposed algorithm achieves significant coding time reduction with little rate-distortion loss, compared to existing transcoding algorithms and normal HEVC coding.
SENR/NRPy+: Numerical relativity in singular curvilinear coordinate systems
NASA Astrophysics Data System (ADS)
Ruchlin, Ian; Etienne, Zachariah B.; Baumgarte, Thomas W.
2018-03-01
We report on a new open-source, user-friendly numerical relativity code package called SENR/NRPy+. Our code extends previous implementations of the BSSN reference-metric formulation to a much broader class of curvilinear coordinate systems, making it ideally suited to modeling physical configurations with approximate or exact symmetries. In the context of modeling black hole dynamics, it is orders of magnitude more efficient than other widely used open-source numerical relativity codes. NRPy+ provides a Python-based interface in which equations are written in natural tensorial form and output at arbitrary finite difference order as highly efficient C code, putting complex tensorial equations at the scientist's fingertips without the need for an expensive software license. SENR provides the algorithmic framework that combines the C codes generated by NRPy+ into a functioning numerical relativity code. We validate against two other established, state-of-the-art codes, and achieve excellent agreement. For the first time, in the context of moving puncture black hole evolutions, we demonstrate nearly exponential convergence of constraint violation and gravitational waveform errors to zero as the order of spatial finite difference derivatives is increased, while fixing the numerical grids at moderate resolution in a singular coordinate system. Such behavior outside the horizons is remarkable, as numerical errors do not converge to zero near punctures, and all points along the polar axis are coordinate singularities. The formulation addresses such coordinate singularities via cell-centered grids and a simple change of basis that analytically regularizes tensor components with respect to the coordinates. Future plans include extending this formulation to allow dynamical coordinate grids and bispherical-like distribution of points to efficiently capture orbiting compact binary dynamics.
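One ingredient that such code generators automate is the derivation of finite-difference stencils at arbitrary order. The generic sketch below (not NRPy+ output) obtains centered coefficients by solving a small Vandermonde system.

```python
# Generic sketch: centered finite-difference coefficients from a Vandermonde solve.
import numpy as np
from math import factorial

def centered_fd_coeffs(derivative_order, accuracy_order):
    half = (derivative_order + accuracy_order - 1) // 2
    offsets = np.arange(-half, half + 1)
    A = np.vander(offsets, increasing=True).T.astype(float)   # A[k, j] = offsets[j]**k
    b = np.zeros(len(offsets))
    b[derivative_order] = factorial(derivative_order)
    return offsets, np.linalg.solve(A, b)

offsets, c = centered_fd_coeffs(derivative_order=1, accuracy_order=4)
# Classic 4th-order first-derivative stencil: [1/12, -2/3, 0, 2/3, -1/12]
print(dict(zip(offsets.tolist(), np.round(c, 4))))
```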
Development and Implementation of CFD-Informed Models for the Advanced Subchannel Code CTF
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blyth, Taylor S.; Avramova, Maria
The research described in this PhD thesis contributes to the development of efficient methods for utilization of high-fidelity models and codes to inform low-fidelity models and codes in the area of nuclear reactor core thermal-hydraulics. The objective is to increase the accuracy of predictions of quantities of interest using high-fidelity CFD models while preserving the efficiency of low-fidelity subchannel core calculations. An original methodology named Physics-based Approach for High-to-Low Model Information has been further developed and tested. The overall physical phenomena and corresponding localized effects, which are introduced by the presence of spacer grids in light water reactor (LWR) cores, are dissected into four basic building processes, and corresponding models are informed using high-fidelity CFD codes. These models are a spacer grid-directed cross-flow model, a grid-enhanced turbulent mixing model, a heat transfer enhancement model, and a spacer grid pressure loss model. The localized CFD models are developed and tested using the CFD code STAR-CCM+, and the corresponding global model development and testing in subchannel formulation is performed in the thermal-hydraulic subchannel code CTF. The improved CTF simulations utilize data files derived from CFD STAR-CCM+ simulation results covering the spacer grid design desired for inclusion in the CTF calculation. The current implementation of these models is examined and possibilities for improvement and further development are suggested. The validation experimental database is extended by including the OECD/NRC PSBT benchmark data. The outcome is an enhanced accuracy of CTF predictions while preserving the computational efficiency of a low-fidelity subchannel code.
An efficient system for reliably transmitting image and video data over low bit rate noisy channels
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.; Huang, Y. F.; Stevenson, Robert L.
1994-01-01
This research project is intended to develop an efficient system for reliably transmitting image and video data over low bit rate noisy channels. The basic ideas behind the proposed approach are the following: employ statistical-based image modeling to facilitate pre- and post-processing and error detection, use spare redundancy that the source compression did not remove to add robustness, and implement coded modulation to improve bandwidth efficiency and noise rejection. Over the last six months, progress has been made on various aspects of the project. Through our studies of the integrated system, a list-based iterative Trellis decoder has been developed. The decoder accepts feedback from a post-processor which can detect channel errors in the reconstructed image. The error detection is based on the Huber Markov random field image model for the compressed image. The compression scheme used here is that of JPEG (Joint Photographic Experts Group). Experiments were performed and the results are quite encouraging. The principal ideas here are extendable to other compression techniques. In addition, research was also performed on unequal error protection channel coding, subband vector quantization as a means of source coding, and post processing for reducing coding artifacts. Our studies on unequal error protection (UEP) coding for image transmission focused on examining the properties of the UEP capabilities of convolutional codes. The investigation of subband vector quantization employed a wavelet transform with special emphasis on exploiting interband redundancy. The outcome of this investigation included the development of three algorithms for subband vector quantization. The reduction of transform coding artifacts was studied with the aid of a non-Gaussian Markov random field model. This results in improved image decompression. These studies are summarized and the technical papers included in the appendices.
Development and Implementation of CFD-Informed Models for the Advanced Subchannel Code CTF
NASA Astrophysics Data System (ADS)
Blyth, Taylor S.
The research described in this PhD thesis contributes to the development of efficient methods for utilization of high-fidelity models and codes to inform low-fidelity models and codes in the area of nuclear reactor core thermal-hydraulics. The objective is to increase the accuracy of predictions of quantities of interest using high-fidelity CFD models while preserving the efficiency of low-fidelity subchannel core calculations. An original methodology named Physics-based Approach for High-to-Low Model Information has been further developed and tested. The overall physical phenomena and corresponding localized effects, which are introduced by the presence of spacer grids in light water reactor (LWR) cores, are dissected into four basic building processes, and corresponding models are informed using high-fidelity CFD codes. These models are a spacer grid-directed cross-flow model, a grid-enhanced turbulent mixing model, a heat transfer enhancement model, and a spacer grid pressure loss model. The localized CFD models are developed and tested using the CFD code STAR-CCM+, and the corresponding global model development and testing in sub-channel formulation is performed in the thermal-hydraulic subchannel code CTF. The improved CTF simulations utilize data-files derived from CFD STAR-CCM+ simulation results covering the spacer grid design desired for inclusion in the CTF calculation. The current implementation of these models is examined and possibilities for improvement and further development are suggested. The validation experimental database is extended by including the OECD/NRC PSBT benchmark data. The outcome is an enhanced accuracy of CTF predictions while preserving the computational efficiency of a low-fidelity subchannel code.
Tkachenko, Anastasiya; Richter, Vladimir
2017-01-01
Genetic modifications of the oncolytic vaccinia virus (VV) improve selective tumor cell infection and death, as well as activation of antitumor immunity. We have engineered a double recombinant VV coding human GM-CSF and the apoptosis-inducing protein apoptin (VV-GMCSF-Apo), to compare it with the earlier constructed double recombinant VV-GMCSF-Lact, which codes another apoptosis-inducing protein, lactaptin, that activates different cell death pathways than apoptin. We showed that both of these recombinant VVs activated a set of critical apoptosis markers in infected cells more strongly than the recombinant VV coding GM-CSF alone (VV-GMCSF-dGF): these were phosphatidylserine externalization, caspase-3 and caspase-7 activation, DNA fragmentation, and upregulation of the proapoptotic protein BAX. However, only VV-GMCSF-Lact efficiently decreased the mitochondrial membrane potential of infected cancer cells. Investigating immunogenic cell death markers in cancer cells infected with the recombinant VVs, we demonstrated that all tested recombinant VVs were efficient in calreticulin and HSP70 externalization, decrease of cellular HMGB1, and ATP secretion. The comparison of antitumor activity against advanced MDA-MB-231 tumors revealed that both recombinants, VV-GMCSF-Lact and VV-GMCSF-Apo, efficiently delay tumor growth. Our results demonstrate that the combination of GM-CSF and apoptosis-inducing proteins in the VV genome is a very efficient tool for specific killing of cancer cells and for activation of antitumor immunity. PMID:28951871
Liang, Xili; Sun, Chao; Chen, Bosheng; Du, Kaiqian; Yu, Ting; Luang-In, Vijitra; Lu, Xingmeng; Shao, Yongqi
2018-06-01
Insects constitute the most abundant and diverse animal class and act as hosts to an extraordinary variety of symbiotic microorganisms. These microbes living inside the insects play critical roles in host biology and are also valuable bioresources. Enterococcus mundtii EMB156, isolated from the larval gut (gut pH >10) of the model organism Bombyx mori (Lepidoptera: Bombycidae), efficiently produces lactic acid, an important metabolite for industrial production of bioplastic materials. E. mundtii EMB156 grows well under alkaline conditions and stably converts various carbon sources into lactic acid, offering advantages in downstream fermentative processes. High-yield lactic acid production can be achieved by strain EMB156 from renewable biomass substrates under alkaline pretreatments. Single-molecule real-time (SMRT) sequencing technology revealed its 3.01 Mbp whole genome sequence. A total of 2956 protein-coding sequences, 65 tRNA genes, and 6 rRNA operons were predicted in the EMB156 chromosome. Remarkable genomic features responsible for lactic acid fermentation include key enzymes of the pentose phosphate (PP) and glycolytic pathways, and an alpha-amylase and a xylose isomerase were characterized in EMB156. This genomic information coincides with the phenotype of E. mundtii EMB156, reflecting its metabolic flexibility in efficient lactate fermentation, and establishes a foundation for future biotechnological applications. Interestingly, amylase activity was quite stable in high-pH broths, indicating a possible mechanism for strong EMB156 growth in an alkaline environment, thereby facilitating lactic acid production. Together, these findings imply that valuable lactic acid-producing bacteria can be discovered efficiently by screening under extremely alkaline conditions, as exemplified by gut microbial symbionts of Lepidoptera insects.
Input-output relation and energy efficiency in the neuron with different spike threshold dynamics.
Yi, Guo-Sheng; Wang, Jiang; Tsang, Kai-Ming; Wei, Xi-Le; Deng, Bin
2015-01-01
Neurons encode and transmit information by generating sequences of output spikes, which is a highly energy-consuming process. The spike is initiated when membrane depolarization reaches a threshold voltage. In many neurons, the threshold is dynamic and depends on the rate of membrane depolarization (dV/dt) preceding a spike. Identifying the metabolic energy involved in neural coding and its relationship to threshold dynamics is critical to understanding neuronal function and evolution. Here, we use a modified Morris-Lecar model to investigate neuronal input-output properties and energy efficiency associated with different spike threshold dynamics. We find that neurons with a dynamic threshold sensitive to dV/dt generate a discontinuous frequency-current curve and a type II phase response curve (PRC) through a Hopf bifurcation, and weak noise can prohibit spiking when the bifurcation just occurs. A threshold that is insensitive to dV/dt, instead, results in a continuous frequency-current curve, a type I PRC and a saddle-node on invariant circle bifurcation, and in this case weak noise cannot inhibit spiking. It is also shown that the bifurcation, frequency-current curve and PRC type associated with different threshold dynamics arise from the distinct subthreshold interactions of membrane currents. Further, we observe that the energy consumption of the neuron is related to its firing characteristics. The depolarization of spike threshold improves neuronal energy efficiency by reducing the overlap of Na+ and K+ currents during an action potential. High energy efficiency is achieved at a more depolarized spike threshold and high stimulus current. These results provide a fundamental biophysical connection that links spike threshold dynamics, input-output relation, energetics and spike initiation, which could contribute to uncovering neural encoding mechanisms. PMID:26074810
Francis, Brian R.
2015-01-01
Although analysis of the genetic code has allowed explanations for its evolution to be proposed, little evidence exists in biochemistry and molecular biology to offer an explanation for the origin of the genetic code. In particular, two features of biology make the origin of the genetic code difficult to understand. First, nucleic acids are highly complicated polymers requiring numerous enzymes for biosynthesis. Secondly, proteins have a simple backbone with a set of 20 different amino acid side chains synthesized by a highly complicated ribosomal process in which mRNA sequences are read in triplets. Apparently, both nucleic acid and protein syntheses have extensive evolutionary histories. Supporting these processes is a complex metabolism and at the hub of metabolism are the carboxylic acid cycles. This paper advances the hypothesis that the earliest predecessor of the nucleic acids was a β-linked polyester made from malic acid, a highly conserved metabolite in the carboxylic acid cycles. In the β-linked polyester, the side chains are carboxylic acid groups capable of forming interstrand double hydrogen bonds. Evolution of the nucleic acids involved changes to the backbone and side chain of poly(β-d-malic acid). Conversion of the side chain carboxylic acid into a carboxamide or a longer side chain bearing a carboxamide group, allowed information polymers to form amide pairs between polyester chains. Aminoacylation of the hydroxyl groups of malic acid and its derivatives with simple amino acids such as glycine and alanine allowed coupling of polyester synthesis and protein synthesis. Use of polypeptides containing glycine and l-alanine for activation of two different monomers with either glycine or l-alanine allowed simple coded autocatalytic synthesis of polyesters and polypeptides and established the first genetic code. A primitive cell capable of supporting electron transport, thioester synthesis, reduction reactions, and synthesis of polyesters and polypeptides is proposed. The cell consists of an iron-sulfide particle enclosed by tholin, a heterogeneous organic material that is produced by Miller-Urey type experiments that simulate conditions on the early Earth. As the synthesis of nucleic acids evolved from β-linked polyesters, the singlet coding system for replication evolved into a four nucleotide/four amino acid process (AMP = aspartic acid, GMP = glycine, UMP = valine, CMP = alanine) and then into the triplet ribosomal process that permitted multiple copies of protein to be synthesized independent of replication. This hypothesis reconciles the “genetics first” and “metabolism first” approaches to the origin of life and explains why there are four bases in the genetic alphabet. PMID:25679748
Optimal Codes for the Burst Erasure Channel
NASA Technical Reports Server (NTRS)
Hamkins, Jon
2010-01-01
Deep space communications over noisy channels lead to certain packets that are not decodable. These packets leave gaps, or bursts of erasures, in the data stream. Burst erasure correcting codes overcome this problem. These are forward erasure correcting codes that allow one to recover the missing gaps of data. Much of the recent work on this topic concentrated on Low-Density Parity-Check (LDPC) codes. These are more complicated to encode and decode than Single Parity Check (SPC) codes or Reed-Solomon (RS) codes, and so far have not been able to achieve the theoretical limit for burst erasure protection. A block interleaved maximum distance separable (MDS) code (e.g., an SPC or RS code) offers near-optimal burst erasure protection, in the sense that no other scheme of equal total transmission length and code rate could improve the guaranteed correctible burst erasure length by more than one symbol. The optimality does not depend on the length of the code, i.e., a short MDS code block interleaved to a given length would perform as well as a longer MDS code interleaved to the same overall length. As a result, this approach offers lower decoding complexity with better burst erasure protection compared to other recent designs for the burst erasure channel (e.g., LDPC codes). A limitation of the design is its lack of robustness to channels that have impairments other than burst erasures (e.g., additive white Gaussian noise), making its application best suited for correcting data erasures in layers above the physical layer. The efficiency of a burst erasure code is the length of its burst erasure correction capability divided by the theoretical upper limit on this length. The inefficiency is one minus the efficiency. The illustration compares the inefficiency of interleaved RS codes to Quasi-Cyclic (QC) LDPC codes, Euclidean Geometry (EG) LDPC codes, extended Irregular Repeat Accumulate (eIRA) codes, array codes, and random LDPC codes previously proposed for burst erasure protection. As can be seen, the simple interleaved RS codes have substantially lower inefficiency over a wide range of transmission lengths.
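As a rough illustration of why block interleaving spreads a burst across codewords, the following sketch (Python; the codeword length, interleaving depth, and burst position are made-up example values, and real MDS encoding/decoding is omitted) writes codewords as rows and transmits column by column, so a burst of length b erases at most ceil(b/I) symbols in any single codeword.

```python
import numpy as np

def interleave(codewords):
    """Write codewords as rows of a matrix, then transmit column by column."""
    return np.asarray(codewords).T.reshape(-1)

def deinterleave(stream, depth, n):
    """Undo the block interleaver: depth = number of codewords, n = codeword length."""
    return stream.reshape(n, depth).T

depth, n = 8, 16                          # 8 codewords of length 16 (example values)
codewords = np.arange(depth * n).reshape(depth, n)
stream = interleave(codewords).astype(float)
stream[40:52] = np.nan                    # a burst of 12 consecutive erasures in the channel
hits = np.isnan(deinterleave(stream, depth, n)).sum(axis=1)
# Each codeword sees at most ceil(12 / 8) = 2 erased symbols, within reach of a short MDS code.
```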
Amino acid fermentation at the origin of the genetic code
2012-01-01
There is evidence that the genetic code was established prior to the existence of proteins, when metabolism was powered by ribozymes. Also, early proto-organisms had to rely on simple anaerobic bioenergetic processes. In this work I propose that amino acid fermentation powered metabolism in the RNA world, and that this was facilitated by proto-adapters, the precursors of the tRNAs. Amino acids were used as carbon sources rather than as catalytic or structural elements. In modern bacteria, amino acid fermentation is known as the Stickland reaction. This pathway involves two amino acids: the first undergoes oxidative deamination, and the second acts as an electron acceptor through reductive deamination. This redox reaction results in two keto acids that are employed to synthesise ATP via substrate-level phosphorylation. The Stickland reaction is the basic bioenergetic pathway of some bacteria of the genus Clostridium. Two other facts support Stickland fermentation in the RNA world. First, several Stickland amino acid pairs are synthesised in abiotic amino acid synthesis. This suggests that amino acids that could be used as an energy substrate were freely available. Second, anticodons that have complementary sequences often correspond to amino acids that form Stickland pairs. The main hypothesis of this paper is that pairs of complementary proto-adapters were assigned to Stickland amino acid pairs. There are signatures of this hypothesis in the genetic code. Furthermore, it is argued that the proto-adapters formed double strands that brought amino acid pairs into proximity to facilitate their mutual redox reaction, structurally constraining the anticodon pairs that are assigned to these amino acid pairs. Significance tests which randomise the code are performed to study the extent of the variability of the energetic (ATP) yield. Random assignments can lead to a substantial yield of ATP and maintain enough variability, thus selection can act and refine the assignments into a proto-code that optimises the energetic yield. Monte Carlo simulations are performed to evaluate the establishment of these simple proto-codes, based on amino acid substitutions and codon swapping. In all cases, donor amino acids are assigned to anticodons composed of U+G, and have low redundancy (1-2 codons), whereas acceptor amino acids are assigned to the remaining codons. These bioenergetic and structural constraints allow for a metabolic role for amino acids before their co-option as catalyst cofactors. Reviewers: this article was reviewed by Prof. William Martin, Prof. Eörs Szathmáry (nominated by Dr. Gáspár Jékely) and Dr. Ádám Kun (nominated by Dr. Sandor Pongor). PMID:22325238
Investigation of CSRZ code in FSO communication
NASA Astrophysics Data System (ADS)
Zhang, Zhike; Chang, Mingchao; Zhu, Ninghua; Liu, Yu
2018-02-01
A cost-effective carrier-suppressed return-to-zero (CSRZ) code generation scheme is proposed by employing a directly modulated laser (DML) module operated at 1.5 μm wavelength. Furthermore, the performance of the CSRZ signal over a free-space optical (FSO) link is studied by simulation. The results show that atmospheric turbulence degrades the transmission performance. However, owing to its lower average transmit power and higher spectral efficiency, the CSRZ signal achieves a better amplitude suppression ratio than the non-return-to-zero (NRZ) code.
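As a toy baseband illustration of what carrier suppression means (this is not a model of the DML transmitter in the paper; the pulse shape, bit count, and sampling are arbitrary choices), the sketch below builds an RZ and a CSRZ waveform and compares their spectra: the alternating 0/π phase of CSRZ removes the strong spectral line at the carrier.

```python
import numpy as np

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 256)            # random on-off data
sps = 16                                  # samples per bit
t = np.arange(sps) / sps
pulse = np.sin(np.pi * t)                 # simple RZ pulse shape over one bit slot

rz   = np.concatenate([b * pulse for b in bits])
csrz = np.concatenate([b * ((-1) ** k) * pulse for k, b in enumerate(bits)])

spec_rz   = np.fft.fftshift(np.abs(np.fft.fft(rz)))
spec_csrz = np.fft.fftshift(np.abs(np.fft.fft(csrz)))
# The RZ spectrum shows a strong line at DC (the carrier in this baseband picture);
# in the CSRZ spectrum that line is suppressed and the energy sits at +/- half the bit rate.
```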
Product code optimization for determinate state LDPC decoding in robust image transmission.
Thomos, Nikolaos; Boulgouris, Nikolaos V; Strintzis, Michael G
2006-08-01
We propose a novel scheme for error-resilient image transmission. The proposed scheme employs a product coder consisting of low-density parity check (LDPC) codes and Reed-Solomon codes in order to deal effectively with bit errors. The efficiency of the proposed scheme is based on the exploitation of determinate symbols in Tanner graph decoding of LDPC codes and a novel product code optimization technique based on error estimation. Experimental evaluation demonstrates the superiority of the proposed system in comparison to recent state-of-the-art techniques for image transmission.
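The paper's product coder combines LDPC and Reed-Solomon component codes; as a much simpler stand-in that shows the product structure itself, the sketch below (Python, a toy single-parity-check construction, not the authors' LDPC/RS design) appends a parity bit to every row and column so that a single bit error is located by the intersection of the failing row and column checks.

```python
import numpy as np

def spc_product_encode(data):
    """Toy product code: single parity check on every row and column (plus checks-on-checks)."""
    data = np.asarray(data) % 2
    rows = np.hstack([data, data.sum(axis=1, keepdims=True) % 2])
    return np.vstack([rows, rows.sum(axis=0, keepdims=True) % 2])

block = spc_product_encode(np.random.randint(0, 2, (4, 6)))
block[2, 3] ^= 1                                   # inject a single bit error
bad_row = np.nonzero(block.sum(axis=1) % 2)[0]     # failing row check
bad_col = np.nonzero(block.sum(axis=0) % 2)[0]     # failing column check -> error at (bad_row, bad_col)
```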
Hyperbolic and semi-hyperbolic surface codes for quantum storage
NASA Astrophysics Data System (ADS)
Breuckmann, Nikolas P.; Vuillot, Christophe; Campbell, Earl; Krishna, Anirudh; Terhal, Barbara M.
2017-09-01
We show how a hyperbolic surface code could be used for overhead-efficient quantum storage. We give numerical evidence for a noise threshold of 1.3% for the {4,5}-hyperbolic surface code in a phenomenological noise model (as compared with 2.9% for the toric code). In this code family, parity checks are of weight 4 and 5, while each qubit participates in four different parity checks. We introduce a family of semi-hyperbolic codes that interpolate between the toric code and the {4,5}-hyperbolic surface code in terms of encoding rate and threshold. We show how these hyperbolic codes outperform the toric code in terms of qubit overhead for a target logical error probability. We show how Dehn twists and lattice code surgery can be used to read and write individual qubits to this quantum storage medium.
A review of feed efficiency in swine: biology and application.
Patience, John F; Rossoni-Serão, Mariana C; Gutiérrez, Néstor A
2015-01-01
Feed efficiency represents the cumulative efficiency with which the pig utilizes dietary nutrients for maintenance, lean gain and lipid accretion. It is closely linked with energy metabolism, as the oxidation of carbon-containing components in the feed drives all metabolic processes. While much is known about nutrient utilization and tissue metabolism, blending these subjects into a discussion on feed efficiency has proven to be difficult. For example, while increasing dietary energy concentration will almost certainly increase feed efficiency, the correlation between dietary energy concentration and feed efficiency is surprisingly low. This is likely due to the plethora of non-dietary factors that impact feed efficiency, such as the environment and health as well as individual variation in maintenance requirements, body composition and body weight. Nonetheless, a deeper understanding of feed efficiency is critical at many levels. To individual farms, it impacts profitability. To the pork industry, it represents its competitive position against other protein sources. To food economists, it means less demand on global feed resources. There are environmental and other societal implications as well. Interestingly, feed efficiency is not always reported simply as a ratio of body weight gain to feed consumed. This review will explain why this arithmetic calculation, as simple as it initially seems, and as universally applied as it is in science and commerce, can often be misleading due to errors inherent in recording of both weight gain and feed intake. This review discusses the importance of feed efficiency, the manner in which it can be measured and reported, its basis in biology and approaches to its improvement. It concludes with a summary of findings and recommendations for future efforts.
NASA Technical Reports Server (NTRS)
Radhakrishnan, Krishnan; Bittker, David A.
1993-01-01
A general chemical kinetics and sensitivity analysis code for complex, homogeneous, gas-phase reactions is described. The main features of the code, LSENS, are its flexibility, efficiency and convenience in treating many different chemical reaction models. The models include static system, steady, one-dimensional, inviscid flow, shock initiated reaction, and a perfectly stirred reactor. In addition, equilibrium computations can be performed for several assigned states. An implicit numerical integration method, which works efficiently for the extremes of very fast and very slow reaction, is used for solving the 'stiff' differential equation systems that arise in chemical kinetics. For static reactions, sensitivity coefficients of all dependent variables and their temporal derivatives with respect to the initial values of dependent variables and/or the rate coefficient parameters can be computed. This paper presents descriptions of the code and its usage, and includes several illustrative example problems.
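LSENS itself is a Fortran code, but the role of an implicit integrator for stiff kinetics is easy to illustrate; the sketch below (Python/SciPy, using the classic Robertson test problem rather than any system from the paper) integrates a reaction network whose rate constants span many orders of magnitude with a BDF method, the kind of implicit scheme that handles both very fast and very slow reactions efficiently.

```python
import numpy as np
from scipy.integrate import solve_ivp

def robertson(t, y):
    """Robertson stiff kinetics: three species, rate constants spanning nine orders of magnitude."""
    y1, y2, y3 = y
    return [-0.04 * y1 + 1e4 * y2 * y3,
             0.04 * y1 - 1e4 * y2 * y3 - 3e7 * y2 ** 2,
             3e7 * y2 ** 2]

# An implicit (BDF) method takes large steps once the fast transients die out;
# an explicit solver would be forced to tiny steps over the whole interval.
sol = solve_ivp(robertson, (0.0, 1e5), [1.0, 0.0, 0.0], method="BDF", rtol=1e-6, atol=1e-10)
```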
Parallelization of ARC3D with Computer-Aided Tools
NASA Technical Reports Server (NTRS)
Jin, Haoqiang; Hribar, Michelle; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
A series of efforts have been devoted to investigating methods of porting and parallelizing applications quickly and efficiently for new architectures, such as the SGI Origin 2000 and Cray T3E. This report presents the parallelization of a CFD application, ARC3D, using the computer-aided tools CAPTools. The steps of parallelizing this code and the requirements for achieving better performance are discussed. The generated parallel version has achieved reasonably good performance, for example, a speedup of 30 on 36 Cray T3E processors. However, this performance could not be obtained without modification of the original serial code. It is suggested that in many cases improving the serial code and performing the necessary code transformations are important parts of the automated parallelization process, although user intervention in many of these parts is still necessary. Nevertheless, development and improvement of useful software tools, such as CAPTools, can help trim down many tedious parallelization details and improve the processing efficiency.
Coherent-state constellations and polar codes for thermal Gaussian channels
NASA Astrophysics Data System (ADS)
Lacerda, Felipe; Renes, Joseph M.; Scholz, Volkher B.
2017-06-01
Optical communication channels are ultimately quantum mechanical in nature, and we must therefore look beyond classical information theory to determine their communication capacity as well as to find efficient encoding and decoding schemes of the highest rates. Thermal channels, which arise from linear coupling of the field to a thermal environment, are of particular practical relevance; their classical capacity has been recently established, but their quantum capacity remains unknown. While the capacity sets the ultimate limit on reliable communication rates, it does not promise that such rates are achievable by practical means. Here we construct efficiently encodable codes for thermal channels which achieve the classical capacity and the so-called Gaussian coherent information for transmission of classical and quantum information, respectively. Our codes are based on combining polar codes with a discretization of the channel input into a finite "constellation" of coherent states. Encoding of classical information can be done using linear optics.
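The polar-coding side of this construction rests on Arıkan's recursive transform; the sketch below (Python) implements the basic encoder x = u·F^⊗n over GF(2), with the bit-reversal permutation omitted and the choice of frozen positions left open, since the reliable positions depend on the channel (here the thermal channel discretized onto the coherent-state constellation). The frozen set shown is purely illustrative.

```python
import numpy as np

def polar_encode(u):
    """Polar encoding x = u F^(tensor n) over GF(2), F = [[1,0],[1,1]] (no bit-reversal)."""
    x = np.array(u, dtype=int) % 2
    n = len(x)                        # must be a power of two
    step = 1
    while step < n:
        for i in range(0, n, 2 * step):
            x[i:i + step] ^= x[i + step:i + 2 * step]
        step *= 2
    return x

# For an (N, K) polar code, K information bits go on the most reliable synthetic channels
# and the rest are frozen to 0; the positions below are hypothetical, not channel-optimized.
N = 8
u = np.zeros(N, dtype=int)
u[[3, 5, 6, 7]] = [1, 0, 1, 1]        # example information bits on assumed reliable positions
codeword = polar_encode(u)
```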
NASA Astrophysics Data System (ADS)
Boumehrez, Farouk; Brai, Radhia; Doghmane, Noureddine; Mansouri, Khaled
2018-01-01
Recently, video streaming has attracted much attention and interest due to its capability to process and transmit large data. We propose a quality of experience (QoE) model relying on a high efficiency video coding (HEVC) encoder adaptation scheme, in turn based on multiple description coding (MDC), for video streaming. The main contributions of the paper are (1) a performance evaluation of the new and emerging video coding standard HEVC/H.265, based on the variation of quantization parameter (QP) values for different video contents to deduce their influence on the sequence to be transmitted; (2) an investigation of QoE support for multimedia applications in wireless networks, in which we inspect the impact of packet loss on the QoE of transmitted video sequences; and (3) an HEVC encoder parameter adaptation scheme based on MDC, modeled with the encoder parameter and an objective QoE model. A comparative study revealed that the proposed MDC approach is effective for improving transmission, with a peak signal-to-noise ratio (PSNR) gain of about 2 to 3 dB. Results show that a good choice of QP value can compensate for transmission channel effects and improve received video quality, although HEVC/H.265 is also sensitive to packet loss. The obtained results show the efficiency of the proposed method in terms of PSNR and mean opinion score.
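Since the reported gains are expressed in PSNR, a minimal reference implementation of that metric may help; the sketch below (Python, assuming 8-bit frames so the peak value is 255) computes PSNR between a reference and a reconstructed frame.

```python
import numpy as np

def psnr(ref, rec, peak=255.0):
    """Peak signal-to-noise ratio in dB between a reference frame and its reconstruction."""
    mse = np.mean((ref.astype(np.float64) - rec.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

# Example with synthetic 8-bit frames; real use would pass decoded video frames.
ref = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
rec = np.clip(ref + np.random.normal(0, 3, ref.shape), 0, 255).astype(np.uint8)
print(psnr(ref, rec))
```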
NASA Astrophysics Data System (ADS)
Gerber, Florian; Mösinger, Kaspar; Furrer, Reinhard
2017-07-01
Software packages for spatial data often implement a hybrid approach of interpreted and compiled programming languages. The compiled parts are usually written in C, C++, or Fortran, and are efficient in terms of computational speed and memory usage. Conversely, the interpreted part serves as a convenient user interface and calls the compiled code for computationally demanding operations. The price paid for the user friendliness of the interpreted component is, besides performance, the limited access to low level and optimized code. An example of such a restriction is the 64-bit vector support of the widely used statistical language R. On the R side, users do not need to change existing code and may not even notice the extension. On the other hand, interfacing 64-bit compiled code efficiently is challenging. Since many R packages for spatial data could benefit from 64-bit vectors, we investigate strategies to efficiently pass 64-bit vectors to compiled languages. More precisely, we show how to simply extend existing R packages using the foreign function interface to seamlessly support 64-bit vectors. This extension is shown with the sparse matrix algebra R package spam. The new capabilities are illustrated with an example of GIMMS NDVI3g data featuring a parametric modeling approach for a non-stationary covariance matrix.
FPGA-based LDPC-coded APSK for optical communication systems.
Zou, Ding; Lin, Changyu; Djordjevic, Ivan B
2017-02-20
In this paper, with the aid of mutual information and generalized mutual information (GMI) capacity analyses, it is shown that geometrically shaped APSK that mimics an optimal Gaussian distribution with equiprobable signaling, together with the corresponding Gray-mapping rules, can approach the Shannon limit more closely than conventional quadrature amplitude modulation (QAM) over a certain range of FEC overhead for both 16-APSK and 64-APSK. The field programmable gate array (FPGA) based LDPC-coded APSK emulation is conducted on block-interleaver-based and bit-interleaver-based systems; the results verify a significant improvement in hardware-efficient bit-interleaver-based systems. In bit-interleaver-based emulation, the LDPC-coded 64-APSK outperforms 64-QAM, in terms of symbol signal-to-noise ratio (SNR), by 0.1 dB, 0.2 dB, and 0.3 dB at spectral efficiencies of 4.8, 4.5, and 4.2 b/s/Hz, respectively. It is found by emulation that LDPC-coded 64-APSK for spectral efficiencies of 4.8, 4.5, and 4.2 b/s/Hz is 1.6 dB, 1.7 dB, and 2.2 dB away from the GMI capacity.
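To make the modulation side concrete, the sketch below (Python) builds a ring-based 16-APSK constellation normalized to unit average energy; the 4+12 ring split and the radius ratio of roughly 2.7 follow common DVB-S2-style conventions and are stand-ins for the geometrically shaped spacing optimized in the paper.

```python
import numpy as np

def apsk(points_per_ring, radii):
    """Ring-based APSK constellation, normalized to unit average symbol energy."""
    rings = []
    for m, r in zip(points_per_ring, radii):
        phases = 2 * np.pi * np.arange(m) / m + np.pi / m   # stagger successive rings
        rings.append(r * np.exp(1j * phases))
    c = np.concatenate(rings)
    return c / np.sqrt(np.mean(np.abs(c) ** 2))

# 16-APSK with 4 inner and 12 outer points; the ring ratio here is illustrative, not optimized.
const16 = apsk([4, 12], [1.0, 2.7])
```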
A comparison of the Cray-2 performance before and after the installation of memory pseudo-banking
NASA Technical Reports Server (NTRS)
Schmickley, Ronald D.; Bailey, David H.
1987-01-01
A suite of 13 large Fortran benchmark codes were run on a Cray-2 configured with memory pseudo-banking circuits, and floating point operation rates were measured for each under a variety of system load configurations. These were compared with similar flop measurements taken on the same system before installation of the pseudo-banking. A useful memory access efficiency parameter was defined and calculated for both sets of performance rates, allowing a crude quantitative measure of the improvement in efficiency due to pseudo-banking. Programs were categorized as either highly scalar (S) or highly vectorized (V) and either memory-intensive or register-intensive, giving 4 categories: S-memory, S-register, V-memory, and V-register. Using flop rates as a simple quantifier of these 4 categories, a scatter plot of efficiency gain vs Mflops roughly illustrates the improvement in floating point processing speed due to pseudo-banking. On the Cray-2 system tested this improvement ranged from 1 percent for S-memory codes to about 12 percent for V-memory codes. No significant gains were made for V-register codes, which was to be expected.
NASA Astrophysics Data System (ADS)
Chapoutier, Nicolas; Mollier, François; Nolin, Guillaume; Culioli, Matthieu; Mace, Jean-Reynald
2017-09-01
In the context of the rise of Monte Carlo transport calculations for all kinds of applications, AREVA recently improved its suite of engineering tools in order to produce an efficient Monte Carlo workflow. Monte Carlo codes, such as MCNP or TRIPOLI, are recognized as reference codes for dealing with a large range of radiation transport problems. However, the inherent drawbacks of these codes, namely laborious input file creation and long computation times, contrast with the maturity of their treatment of the physical phenomena. The goal of the recent AREVA developments was to reach efficiency similar to that of other mature engineering disciplines such as finite element analysis (e.g., structural or fluid dynamics). Among the main objectives, the creation of a graphical user interface offering CAD tools for geometry creation and other graphical features dedicated to the radiation field (source definition, tally definition) has been reached. Computation times are drastically reduced compared to a few years ago thanks to the use of massively parallel runs and, above all, the implementation of hybrid variance reduction techniques. Engineering teams are now able to deliver much more prompt support to any nuclear project dealing with reactors or fuel cycle facilities, from the conceptual phase to decommissioning.
Salar-García, María J; Bernal, Vicente; Pastor, José M; Salvador, Manuel; Argandoña, Montserrat; Nieto, Joaquín J; Vargas, Carmen; Cánovas, Manuel
2017-02-08
The halophilic bacterium Chromohalobacter salexigens has been proposed as a promising cell factory for the production of the compatible solutes ectoine and hydroxyectoine. This bacterium has evolved metabolic adaptations to efficiently grow under high salt concentrations by accumulating ectoines as compatible solutes. However, metabolic overflow, which is a major drawback for the efficient conversion of biological feedstocks, occurs as a result of metabolic unbalances during growth and ectoines production. Optimal production of ectoines is conditioned by the interplay of carbon and nitrogen metabolisms. In this work, we set out to determine how nitrogen supply affects the production of ectoines. Chromohalobacter salexigens was challenged to grow in media with an unbalanced carbon/nitrogen ratio. In C. salexigens, overflow metabolism and ectoines production are a function of medium composition. At low ammonium conditions, the growth rate decreased markedly, by up to 80%. Shifts in overflow metabolism were observed when changing the C/N ratio in the culture medium. 13C-NMR analysis of ectoine labelling revealed a high metabolic rigidity, with almost constant flux ratios in all conditions assayed. An unbalanced C/N ratio led to pyruvate accumulation, especially upon N-limitation. Analysis of an ect- mutant demonstrated the link between metabolic overflow and ectoine biosynthesis. Under non-ectoine-synthesizing conditions, glucose uptake and metabolic overflow decreased markedly. Finally, in fed-batch cultures, biomass yield was affected by the feeding scheme chosen. High growth (up to 42.4 g/L) and volumetric ectoine yields (up to 4.21 g/L) were obtained by minimizing metabolite overflow and nutrient accumulation in high density cultures in a low nitrogen fed-batch culture. Moreover, the yield coefficient calculated for the transformation of glucose into biomass was 30% higher in fed-batch than in the batch culture, demonstrating that the metabolic efficiency of C. salexigens can be improved by careful design of culture feeding schemes. Metabolic shifts observed at low ammonium concentrations were explained by a shift in the energy required for nitrogen assimilation. Carbon-limited fed-batch cultures with reduced ammonium supply were the best conditions for cultivation of C. salexigens, supporting high density growth and maintaining high ectoines production.
Choudhary, Alpa; Modak, Arnab; Apte, Shree K.
2017-01-01
The effective elimination of xenobiotic pollutants from the environment can be achieved by efficient degradation by microorganisms even in the presence of sugars or organic acids. Soil isolate Pseudomonas putida CSV86 displays a unique ability to utilize aromatic compounds prior to glucose. The draft genome and transcription analyses revealed that glucose uptake and benzoate transport and metabolism genes are clustered at the glc and ben loci, respectively, as two distinct operons. When grown on glucose plus benzoate, CSV86 displayed significantly higher expression of the ben locus in the first log phase and of the glc locus in the second log phase. Kinetics of substrate uptake and metabolism matched the transcription profiles. The inability of succinate to suppress benzoate transport and metabolism resulted in coutilization of succinate and benzoate. When challenged with succinate or benzoate, glucose-grown cells showed rapid reduction in glc locus transcription, glucose transport, and metabolic activity, with succinate being more effective at the functional level. Benzoate and succinate failed to interact with or inhibit the activities of glucose transport components or metabolic enzymes. The data suggest that succinate and benzoate suppress glucose transport and metabolism at the transcription level, enabling P. putida CSV86 to preferentially metabolize benzoate. This strain thus has the potential to be an ideal host to engineer diverse metabolic pathways for efficient bioremediation. IMPORTANCE Pseudomonas strains play an important role in carbon cycling in the environment and display a hierarchy in carbon utilization: organic acids first, followed by glucose, and aromatic substrates last. This limits their exploitation for bioremediation. This study demonstrates the substrate-dependent modulation of ben and glc operons in Pseudomonas putida CSV86, wherein benzoate suppresses glucose transport and metabolism at the transcription level, leading to preferential utilization of benzoate over glucose. Interestingly, succinate and benzoate are cometabolized. These properties are unique to this strain compared to other pseudomonads and open up avenues to unravel novel regulatory processes. Strain CSV86 can serve as an ideal host to engineer and facilitate efficient removal of recalcitrant pollutants even in the presence of simpler carbon sources. PMID:28733285
Coding tools investigation for next generation video coding based on HEVC
NASA Astrophysics Data System (ADS)
Chen, Jianle; Chen, Ying; Karczewicz, Marta; Li, Xiang; Liu, Hongbin; Zhang, Li; Zhao, Xin
2015-09-01
The new state-of-the-art video coding standard, H.265/HEVC, was finalized in 2013 and achieves roughly 50% bit rate savings compared to its predecessor, H.264/MPEG-4 AVC. This paper provides evidence that there is still potential for further coding efficiency improvements. A brief overview of HEVC is first given in the paper. Then, our improvements to each main module of HEVC are presented. For instance, the recursive quadtree block structure is extended to support larger coding units and transform units. The motion information prediction scheme is improved by advanced temporal motion vector prediction, which inherits the motion information of each small block within a large block from a temporal reference picture. Cross-component prediction with a linear prediction model improves intra prediction, and overlapped block motion compensation improves the efficiency of inter prediction. Furthermore, coding of both intra and inter prediction residuals is improved by an adaptive multiple transform technique. Finally, in addition to the deblocking filter and SAO, an adaptive loop filter is applied to further enhance the reconstructed picture quality. This paper describes the above-mentioned techniques in detail and evaluates their coding performance benefits based on the common test conditions used during HEVC development. The simulation results show that significant performance improvement over the HEVC standard can be achieved, especially for high resolution video material.
Development of the Brief Romantic Relationship Interaction Coding Scheme (BRRICS)
Humbad, Mikhila N.; Donnellan, M. Brent; Klump, Kelly L.; Burt, S. Alexandra
2012-01-01
Although observational studies of romantic relationships are common, many existing coding schemes require considerable amounts of time and resources to implement. The current study presents a new coding scheme, the Brief Romantic Relationship Interaction Coding Scheme (BRRICS), designed to assess various aspects of romantic relationships both quickly and efficiently. The BRRICS consists of four individual coding dimensions assessing positive and negative affect in each member of the dyad, as well as four codes assessing specific components of the dyadic interaction (i.e., positive reciprocity, demand-withdraw pattern, negative reciprocity, and overall satisfaction). Concurrent associations with measures of marital adjustment and conflict were evaluated in a sample of 118 married couples participating in the Michigan State University Twin Registry. Couples were asked to discuss common conflicts in their marriage while being videotaped, and undergraduate coders used the BRRICS to rate these interactions. The BRRICS scales were correlated in the expected directions with self-reports of marital adjustment, as well as children's perception of the severity and frequency of marital conflict. Based on these results, the BRRICS may be an efficient tool for researchers with large samples of observational data who are interested in coding global aspects of the relationship but do not have the resources to use labor-intensive schemes. PMID:21875192
Verma, Shefali S; Lucas, Anastasia M; Lavage, Daniel R; Leader, Joseph B; Metpally, Raghu; Krishnamurthy, Sarathbabu; Dewey, Frederick; Borecki, Ingrid; Lopez, Alexander; Overton, John; Penn, John; Reid, Jeffrey; Pendergrass, Sarah A; Breitwieser, Gerda; Ritchie, Marylyn D
2017-01-01
A wide range of patient health data is recorded in Electronic Health Records (EHR). This data includes diagnoses, surgical procedures, clinical laboratory measurements, and medication information. Together this information reflects the patient's medical history. Many studies have efficiently used data from the EHR to find clinically relevant associations, either by utilizing International Classification of Diseases, version 9 (ICD-9) codes or laboratory measurements, or by designing phenotype algorithms to extract case and control status from the EHR with accuracy. Here we developed a strategy to utilize longitudinal quantitative trait data from the EHR at Geisinger Health System, focusing on outpatient metabolic and complete blood panel data as a starting point. The Comprehensive Metabolic Panel (CMP) as well as Complete Blood Counts (CBC) are part of routine care and provide a comprehensive, high-level picture of patients' overall health and disease. We randomly split our data into two datasets to allow for discovery and replication. We first conducted a genome-wide association study (GWAS) with median values of 25 different clinical laboratory measurements to identify variants from Human Omni Express Exome beadchip data that are associated with these measurements. We identified 687 variants that were associated with the tested clinical measurements and replicated at p < 5×10^-8. Since longitudinal data from the EHR provides a record of a patient's medical history, we utilized this information to further investigate the ICD-9 codes that might be associated with differences in variability of the measurements in the longitudinal dataset. We identified low and high variance patients by looking at changes within their individual longitudinal EHR laboratory results for each of the 25 clinical lab values (thus creating 50 groups: a high variance and a low variance group for each lab variable). We then performed a PheWAS analysis with ICD-9 diagnosis codes, separately in the high variance group and the low variance group for each lab variable. We found 717 PheWAS associations that replicated at a p-value less than 0.001. Next, we evaluated the results of this study by comparing the association results between the high and low variance groups. For example, we found 39 SNPs (in multiple genes) associated with ICD-9 250.01 (type I diabetes) in patients with high variance of plasma glucose levels, but not in patients with low variance in plasma glucose levels. Another example is the association of 4 SNPs in UMOD with chronic kidney disease in patients with high variance for aspartate aminotransferase (discovery p-value: 8.71×10^-9; replication p-value: 2.03×10^-6). In general, we see a pattern of many more statistically significant associations from patients with high variance in the quantitative lab variables, in comparison with the low variance group, across all of the 25 laboratory measurements. This study is one of the first of its kind to utilize quantitative trait variance from longitudinal laboratory data to find associations among genetic variants and clinical phenotypes obtained from an EHR, integrating laboratory values and diagnosis codes to understand the genetic complexities of common diseases.
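The high/low variance split on longitudinal labs is straightforward to reproduce in outline; the sketch below (Python/pandas, with hypothetical column names and a median-split cutoff that the paper does not specify) computes the per-patient variance of one lab and labels patients as high or low variance.

```python
import numpy as np
import pandas as pd

def variance_groups(labs: pd.DataFrame, lab_name: str) -> pd.DataFrame:
    """Per-patient variance of one longitudinal lab, split into high/low variance groups.
    Expects long-format columns: patient_id, lab_name, value (hypothetical names)."""
    v = (labs.loc[labs["lab_name"] == lab_name]
             .groupby("patient_id")["value"]
             .var()
             .dropna())
    cutoff = v.median()   # median split is an assumption for illustration, not from the paper
    return pd.DataFrame({"patient_id": v.index,
                         "variance": v.values,
                         "group": np.where(v.values > cutoff, "high", "low")})
```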
Carlson, Ross; Srienc, Friedrich
2004-04-20
We have previously shown that the metabolism for most efficient cell growth can be realized by a combination of two types of elementary modes. One mode produces biomass while the second mode generates only energy. The identity of the four most efficient biomass and energy pathway pairs changes, depending on the degree of oxygen limitation. The identification of such pathway pairs for different growth conditions offers a pathway-based explanation of maintenance energy generation. For a given growth rate, experimental aerobic glucose consumption rates can be used to estimate the contribution of each pathway type to the overall metabolic flux pattern. All metabolic fluxes are then completely determined by the stoichiometries of involved pathways defining all nutrient consumption and metabolite secretion rates. We present here equations that permit computation of network fluxes on the basis of unique pathways for the case of optimal, glucose-limited Escherichia coli growth under varying levels of oxygen stress. Predicted glucose and oxygen uptake rates and some metabolite secretion rates are in remarkable agreement with experimental observations supporting the validity of the presented approach. The entire most efficient, steady-state, metabolic rate structure is explicitly defined by the developed equations without need for additional computer simulations. The approach should be generally useful for analyzing and interpreting genomic data by predicting concise, pathway-based metabolic rate structures. Copyright 2004 Wiley Periodicals, Inc.
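The flux decomposition described here reduces to a small linear solve once the two modes' stoichiometries are fixed; the sketch below (Python/NumPy, with made-up coefficients and measurements in place of the paper's elementary-mode stoichiometries) estimates the flux through a biomass-producing mode and an energy-only mode from a measured growth rate and glucose uptake rate, after which every other exchange rate follows from the mode stoichiometries.

```python
import numpy as np

# Columns: [biomass mode, energy-only mode]; rows: [glucose uptake, growth rate].
# These stoichiometric coefficients are illustrative placeholders, not values from the paper.
S = np.array([[7.0, 3.0],    # glucose consumed per unit flux through each mode (assumed)
              [1.0, 0.0]])   # only the biomass mode produces biomass (assumed)

measured = np.array([10.0, 1.2])          # measured glucose uptake and growth rate (example values)
v_biomass, v_energy = np.linalg.solve(S, measured)
# Any other rate (O2 uptake, acetate secretion, ...) is then the same linear combination
# of the two modes' stoichiometric coefficients for that metabolite.
```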
Systems Biology of Industrial Microorganisms
NASA Astrophysics Data System (ADS)
Papini, Marta; Salazar, Margarita; Nielsen, Jens
The field of industrial biotechnology is expanding rapidly as the chemical industry is looking towards more sustainable production of chemicals that can be used as fuels or building blocks for production of solvents and materials. In connection with the development of sustainable bioprocesses, it is a major challenge to design and develop efficient cell factories that can ensure cost-efficient conversion of the raw material into the chemical of interest. This is achieved through metabolic engineering, where the metabolism of the cell factory is engineered such that there is an efficient conversion of sugars, the typical raw materials in the fermentation industry, into the desired product. However, engineering of cellular metabolism is often challenging due to the complex regulation that has evolved in connection with adaptation of the different microorganisms to their ecological niches. In order to map these regulatory structures and further de-regulate them, as well as identify ingenious metabolic engineering strategies that fulfill mass balance constraints, tools from systems biology can be applied. This involves both high-throughput analysis tools like transcriptome, proteome and metabolome analysis, as well as the use of mathematical modeling to simulate the phenotypes resulting from the different metabolic engineering strategies. It is in fact expected that systems biology may substantially improve the process of cell factory development, and we therefore propose the term Industrial Systems Biology for how systems biology will enhance the development of industrial biotechnology for sustainable chemical production.
Handheld laser scanner automatic registration based on random coding
NASA Astrophysics Data System (ADS)
He, Lei; Yu, Chun-ping; Wang, Li
2011-06-01
Current research on laser scanners focuses mainly on static measurement. Little use has been made of dynamic measurement, which is appropriate for more problems and situations. In particular, a traditional laser scanner must be kept stable to scan and to measure the coordinate transformation parameters between different stations. In order to make scanning measurement intelligent and rapid, in this paper we develop a new registration algorithm for a handheld laser scanner based on the positions of targets, which realizes dynamic measurement with a handheld laser scanner without additional complex work. The two cameras on the laser scanner photograph artificial target points to obtain their three-dimensional coordinates; these points are designed by random coding. Then, a set of matched points is found among the control points to orient the scanner by a least-squares common-point transformation. After that, the two cameras can directly measure the laser point cloud on the surface of the object and obtain point cloud data in a unified coordinate system. There are three major contributions in the paper. First, a laser scanner based on binocular vision is designed with two cameras and one laser head, realizing real-time orientation of the laser scanner and improving efficiency. Second, coded markers are introduced to solve the data matching problem, and a random coding method is proposed. Compared with other coding methods, markers produced with this method are simple to match and avoid shading of the object. Finally, a recognition method for the coded markers based on distance recognition is proposed, which is more efficient. The method presented here can be used widely in measurements of objects from small to very large, such as vehicles and airplanes, strengthening intelligence and efficiency. Theoretical analysis and experiments demonstrate that the proposed method realizes dynamic measurement with a handheld laser scanner and is reasonable and efficient.
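The least-squares common-point transformation used for orientation is the classical rigid registration problem; the sketch below (Python/NumPy, a generic Kabsch/SVD solution rather than the authors' exact implementation) recovers the rotation R and translation t that best map one set of matched target points onto the other.

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares R, t with R @ p_i + t ~= q_i for matched 3D point sets P, Q (N x 3)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                    # cross-covariance of the centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against an improper rotation (reflection)
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t
```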
Error-correcting pairs for a public-key cryptosystem
NASA Astrophysics Data System (ADS)
Pellikaan, Ruud; Márquez-Corbella, Irene
2017-06-01
Code-based cryptography (CBC) is a powerful and promising alternative for quantum-resistant cryptography. Indeed, together with lattice-based, multivariate, and hash-based cryptography, it is one of the principal available techniques for post-quantum cryptography. CBC was first introduced by McEliece, who designed one of the most efficient public-key encryption schemes, with exceptionally strong security guarantees and other desirable properties, that still resists attacks based on the Quantum Fourier Transform and amplitude amplification. The original proposal, which remains unbroken, was based on binary Goppa codes. Later, several families of codes have been proposed in order to reduce the key size, and some of these alternatives have already been broken. One of the main requirements of a code-based cryptosystem is having high-performance t-bounded decoding algorithms, which is achieved when the code has a t-error-correcting pair (ECP). Indeed, those McEliece schemes that use GRS, BCH, Goppa, and algebraic geometry codes are in fact using an error-correcting pair as a secret key. That is, the security of these public-key cryptosystems is based not only on the inherent intractability of bounded distance decoding but also on the assumption that it is difficult to retrieve an error-correcting pair efficiently. In this paper, the class of codes with a t-ECP is proposed for the McEliece cryptosystem. Moreover, we study the hardness of distinguishing arbitrary codes from those having a t-error-correcting pair.
Christ, Jacob P; Falcone, Tommaso
2018-03-02
To characterize the impact of bariatric surgery on reproductive and metabolic features common to polycystic ovary syndrome (PCOS) and to assess the relevance of preoperative evaluations in predicting the likelihood of benefit from surgery, a retrospective chart review of records from 930 women who had undergone bariatric surgery at the Cleveland Clinic Foundation from 2009 to 2014 was completed. Cases of PCOS were identified from ICD coding, and healthy women with pelvic ultrasound evaluations were identified using Healthcare Common Procedure Coding System coding. Pre- and postoperative anthropometric evaluations, menstrual cyclicity, ovarian volume (OV), and markers of hyperandrogenism, dyslipidemia, and dysglycemia were evaluated. Forty-four women with PCOS and 65 controls were evaluated. Both the PCOS and non-PCOS groups had significant reductions in body mass index (BMI) and markers of dyslipidemia postoperatively (p < 0.05). The PCOS group had significant reductions in androgen levels (p < 0.05) and in the percentages meeting criteria for hyperandrogenism and irregular menses (p < 0.05). OV did not significantly decline in either group postoperatively. Among women with PCOS, independent of preoperative BMI and age, preoperative OV was associated with change in hemoglobin A1c (β (95% confidence interval) 0.202 (0.011-0.393), p = 0.04) and change in triglycerides (6.681 (1.028-12.334), p = 0.03), and preoperative free testosterone was associated with change in total cholesterol (3.744 (0.906-6.583), p = 0.02) and change in non-HDL-C (3.125 (0.453-5.796), p = 0.03). Bariatric surgery improves key diagnostic features seen in women with PCOS, and ovarian volume and free testosterone may have utility in predicting the likelihood of metabolic benefit from surgery.
Gaupels, Frank; Sarioglu, Hakan; Beckmann, Manfred; Hause, Bettina; Spannagl, Manuel; Draper, John; Lindermayr, Christian; Durner, Jörg
2012-01-01
In cucurbits, phloem latex exudes from cut sieve tubes of the extrafascicular phloem (EFP), serving in defense against herbivores. We analyzed inducible defense mechanisms in the EFP of pumpkin (Cucurbita maxima) after leaf damage. As an early systemic response, wounding elicited transient accumulation of jasmonates and a decrease in exudation probably due to partial sieve tube occlusion by callose. The energy status of the EFP was enhanced as indicated by increased levels of ATP, phosphate, and intermediates of the citric acid cycle. Gas chromatography coupled to mass spectrometry also revealed that sucrose transport, gluconeogenesis/glycolysis, and amino acid metabolism were up-regulated after wounding. Combining ProteoMiner technology for the enrichment of low-abundance proteins with stable isotope-coded protein labeling, we identified 51 wound-regulated phloem proteins. Two Sucrose-Nonfermenting1-related protein kinases and a 32-kD 14-3-3 protein are candidate central regulators of stress metabolism in the EFP. Other proteins, such as the Silverleaf Whitefly-Induced Protein1, Mitogen Activated Protein Kinase6, and Heat Shock Protein81, have known defensive functions. Isotope-coded protein labeling and western-blot analyses indicated that Cyclophilin18 is a reliable marker for stress responses of the EFP. As a hint toward the induction of redox signaling, we have observed delayed oxidation-triggered polymerization of the major Phloem Protein1 (PP1) and PP2, which correlated with a decline in carbonylation of PP2. In sum, wounding triggered transient sieve tube occlusion, enhanced energy metabolism, and accumulation of defense-related proteins in the pumpkin EFP. The systemic wound response was mediated by jasmonate and redox signaling. PMID:23085839
Greif, Gonzalo; Rodriguez, Matias; Alvarez-Valin, Fernando
2017-01-01
American trypanosomiasis is a chronic and endemic disease which affects millions of people. Trypanosoma cruzi, its causative agent, has a life cycle that involves complex morphological and functional transitions, as well as a variety of environmental conditions. This requires tight regulation of gene expression, which is achieved mainly by post-transcriptional regulation. In this work we conducted an RNAseq analysis of the three major life cycle stages of T. cruzi: amastigotes, epimastigotes and trypomastigotes. This analysis allowed us to delineate specific transcriptomic profiles for each stage and to identify the biological processes of major relevance in each stage. Stage-specific expression profiling evidenced the plasticity of T. cruzi to adapt quickly to different conditions, with particular focus on membrane remodeling and metabolic shifts along the life cycle. Epimastigotes, which replicate in the gut of insect vectors, showed higher expression of genes related to energy metabolism, mainly Krebs cycle, respiratory chain and oxidative phosphorylation related genes, and anabolism-related genes associated with nucleotide and steroid biosynthesis; a general down-regulation of surface glycoprotein coding genes was also seen at this stage. Trypomastigotes, living extracellularly in the bloodstream of mammals, express a plethora of surface proteins and signaling genes involved in invasion and evasion of the immune response. Amastigotes mostly express membrane transporters and genes involved in regulation of the cell cycle, and also express a specific subset of surface glycoprotein coding genes. In addition, these results allowed us to improve the annotation of the Dm28c genome, identifying new ORFs, and set the stage for the construction of co-expression networks, which can give clues about the functions of uncharacterized coded proteins. PMID:28286708
From Verified Models to Verifiable Code
NASA Technical Reports Server (NTRS)
Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.
2009-01-01
Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.
Spratlen, Miranda Jones; Gamble, Mary V; Grau-Perez, Maria; Kuo, Chin-Chi; Best, Lyle G; Yracheta, Joseph; Francesconi, Kevin; Goessler, Walter; Mossavar-Rahmani, Yasmin; Hall, Meghan; Umans, Jason G; Fretts, Amanda; Navas-Acien, Ana
2017-07-01
B-vitamins involved in one-carbon metabolism (OCM) can affect arsenic metabolism efficiency in highly arsenic-exposed, undernourished populations. We evaluated whether dietary intake of OCM nutrients (including vitamins B2, B6, folate (B9), and B12) was associated with arsenic metabolism in a more nourished population exposed to lower arsenic than previously studied. Dietary intake of OCM nutrients and urine arsenic was evaluated in 405 participants from the Strong Heart Study. Arsenic exposure was measured as the sum of inorganic arsenic (iAs), monomethylarsonate (MMA) and dimethylarsenate (DMA) in urine. Arsenic metabolism was measured as the individual percentages of each metabolite over their sum (iAs%, MMA%, DMA%). In adjusted models, increasing intake of vitamins B2 and B6 was associated with modest but significant decreases in iAs% and MMA% and increases in DMA%. A significant interaction was found between high folate and high B6 intake, which together were associated with enhanced arsenic metabolism efficiency. Our findings suggest OCM nutrients may influence arsenic metabolism in populations with moderate arsenic exposure. Stronger and independent associations were observed with B2 and B6, vitamins previously understudied in relation to arsenic. Research is needed to evaluate whether targeting B-vitamin intake can serve as a strategy for the prevention of arsenic-related health effects at low-moderate arsenic exposure. Copyright © 2017 Elsevier Ltd. All rights reserved.
Dynamic metabolic modeling of heterotrophic and mixotrophic microalgal growth on fermentative wastes
Baroukh, Caroline; Turon, Violette; Bernard, Olivier
2017-01-01
Microalgae are promising microorganisms for the production of numerous molecules of interest, such as pigments, proteins or triglycerides that can be turned into biofuels. Heterotrophic or mixotrophic growth on fermentative wastes represents an interesting approach to achieving higher biomass concentrations, while reducing cost and improving the environmental footprint. Fermentative wastes generally consist of a blend of diverse molecules and it is thus crucial to understand microalgal metabolism in such conditions, where switching between substrates might occur. Metabolic modeling has proven to be an efficient tool for understanding metabolism and guiding the optimization of biomass or target molecule production. Here, we focused on the metabolism of Chlorella sorokiniana growing heterotrophically and mixotrophically on acetate and butyrate. The metabolism was represented by 172 metabolic reactions. The DRUM modeling framework with a mildly relaxed quasi-steady-state assumption was used to account for the switching between substrates and the presence of light. Nine experiments were used to calibrate the model and nine experiments for the validation. The model efficiently predicted the experimental data, including the transient behavior during heterotrophic, autotrophic, mixotrophic and diauxic growth. It shows that an accurate model of metabolism can now be constructed, even in dynamic conditions, with the presence of several carbon substrates. It also opens new perspectives for the heterotrophic and mixotrophic use of microalgae, especially for biofuel production from wastes. PMID:28582469
Throughput of Coded Optical CDMA Systems with AND Detectors
NASA Astrophysics Data System (ADS)
Memon, Kehkashan A.; Umrani, Fahim A.; Umrani, A. W.; Umrani, Naveed A.
2012-09-01
Conventional detection techniques used in optical code-division multiple access (OCDMA) systems are not optimal and result in poor bit error rate performance. This paper analyzes the coded performance of optical CDMA systems with AND detectors for enhanced throughput efficiencies and improved error rate performance. The results show that the use of AND detectors significantly improves the performance of an optical channel.
NASA Technical Reports Server (NTRS)
Lou, John; Ferraro, Robert; Farrara, John; Mechoso, Carlos
1996-01-01
An analysis is presented of several factors influencing the performance of a parallel implementation of the UCLA atmospheric general circulation model (AGCM) on massively parallel computer systems. Several modifications to the original parallel AGCM code, aimed at improving its numerical efficiency, interprocessor communication cost, load balance, and single-node code performance, are discussed.
Performance measures for transform data coding.
NASA Technical Reports Server (NTRS)
Pearl, J.; Andrews, H. C.; Pratt, W. K.
1972-01-01
This paper develops performance criteria for evaluating transform data coding schemes under computational constraints. Computational constraints that conform with the proposed basis-restricted model give rise to suboptimal coding efficiency characterized by a rate-distortion relation R(D) similar in form to the theoretical rate-distortion function. Numerical examples of this performance measure are presented for Fourier, Walsh, Haar, and Karhunen-Loeve transforms.
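As a rough, hedged illustration of how transform choice affects coding efficiency for a correlated source, the sketch below computes the classical transform coding gain of an orthonormal DCT for a first-order Gauss-Markov model; the correlation coefficient, block size and the DCT itself are illustrative assumptions, not the basis-restricted model or the specific transforms evaluated in the paper.

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II basis; rows are basis vectors.
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    t = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    t[0, :] = 1.0 / np.sqrt(n)
    return t

def coding_gain(rho=0.95, n=8):
    # Covariance of an AR(1) (Gauss-Markov) source: R[i, j] = rho**|i - j|.
    idx = np.arange(n)
    r = rho ** np.abs(idx[:, None] - idx[None, :])
    t = dct_matrix(n)
    var = np.diag(t @ r @ t.T)                   # transform-coefficient variances
    # Transform coding gain: arithmetic mean over geometric mean of the variances.
    return var.mean() / np.exp(np.log(var).mean())

print("DCT coding gain (rho=0.95, n=8): %.2f" % coding_gain())
```

Swapping in a Walsh or Haar basis matrix gives the kind of transform-to-transform comparison the abstract describes.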
Electromagnetic reprogrammable coding-metasurface holograms.
Li, Lianlin; Jun Cui, Tie; Ji, Wei; Liu, Shuo; Ding, Jun; Wan, Xiang; Bo Li, Yun; Jiang, Menghua; Qiu, Cheng-Wei; Zhang, Shuang
2017-08-04
Metasurfaces have enabled a plethora of emerging functions within an ultrathin dimension, paving the way towards flat and highly integrated photonic devices. Despite the rapid progress in this area, simultaneous realization of reconfigurability, high efficiency, and full control over the phase and amplitude of scattered light poses a great challenge. Here, we try to tackle this challenge by introducing the concept of a reprogrammable hologram based on 1-bit coding metasurfaces. The state of each unit cell of the coding metasurface can be switched between '1' and '0' by electrically controlling the loaded diodes. Our proof-of-concept experiments show that multiple desired holographic images can be realized in real time with only a single coding metasurface. The proposed reprogrammable hologram may be a key in enabling future intelligent devices with reconfigurable and programmable functionalities that may lead to advances in a variety of applications such as microscopy, display, security, data storage, and information processing. Realizing metasurfaces with reconfigurability, high efficiency, and control over phase and amplitude is a challenge. Here, Li et al. introduce a reprogrammable hologram based on a 1-bit coding metasurface, where the state of each unit cell of the coding metasurface can be switched electrically.
Analysis of PANDA Passive Containment Cooling Steady-State Tests with the Spectra Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stempniewicz, Marek M
2000-07-15
Results of post test simulation of the PANDA passive containment cooling (PCC) steady-state tests (S-series tests), performed at the PANDA facility at the Paul Scherrer Institute, Switzerland, are presented. The simulation has been performed using the computer code SPECTRA, a thermal-hydraulic code, designed specifically for analyzing containment behavior of nuclear power plants. Results of the present calculations are compared to the measurement data as well as the results obtained earlier with the codes MELCOR, TRAC-BF1, and TRACG. The calculated PCC efficiencies are somewhat lower than the measured values. Similar underestimation of PCC efficiencies had been obtained in the past with the other computer codes. To explain this difference, it is postulated that condensate coming into the tubes forms a stream of liquid in one or two tubes, leaving most of the tubes unaffected. The condensate entering the water box is assumed to fall down in the form of droplets. With these assumptions, the results calculated with SPECTRA are close to the experimental data. It is concluded that the SPECTRA code is a suitable tool for analyzing containments of advanced reactors, equipped with passive containment cooling systems.
Zebra: An advanced PWR lattice code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cao, L.; Wu, H.; Zheng, Y.
2012-07-01
This paper presents an overview of an advanced PWR lattice code ZEBRA developed at the NECP laboratory at Xi'an Jiaotong University. The multi-group cross-section library is generated from the ENDF/B-VII library by NJOY and the 361-group SHEM structure is employed. The resonance calculation module is developed based on the sub-group method. The transport solver is the Auto-MOC code, a self-developed code based on the Method of Characteristics and the customization of AutoCAD software. The whole code is well organized in a modular software structure. Some numerical results obtained during the validation of the code demonstrate that this code has a good precision and a high efficiency. (authors)
HERCULES: A Pattern Driven Code Transformation System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kartsaklis, Christos; Hernandez, Oscar R; Hsu, Chung-Hsing
2012-01-01
New parallel computers are emerging, but developing efficient scientific code for them remains difficult. A scientist must manage not only the science-domain complexity but also the performance-optimization complexity. HERCULES is a code transformation system designed to help the scientist separate the two concerns, which improves code maintenance and facilitates performance optimization. The system combines three technologies (code patterns, transformation scripts and compiler plugins) to provide the scientist with an environment to quickly implement code transformations that suit his needs. Unlike existing code optimization tools, HERCULES is unique in its focus on user-level accessibility. In this paper we discuss the design, implementation and an initial evaluation of HERCULES.
Exploring metabolic pathways in genome-scale networks via generating flux modes.
Rezola, A; de Figueiredo, L F; Brock, M; Pey, J; Podhorski, A; Wittmann, C; Schuster, S; Bockmayr, A; Planes, F J
2011-02-15
The reconstruction of metabolic networks at the genome scale has allowed the analysis of metabolic pathways at an unprecedented level of complexity. Elementary flux modes (EFMs) are an appropriate concept for such analysis. However, their number grows in a combinatorial fashion as the size of the metabolic network increases, which renders the application of EFMs approach to large metabolic networks difficult. Novel methods are expected to deal with such complexity. In this article, we present a novel optimization-based method for determining a minimal generating set of EFMs, i.e. a convex basis. We show that a subset of elements of this convex basis can be effectively computed even in large metabolic networks. Our method was applied to examine the structure of pathways producing lysine in Escherichia coli. We obtained a more varied and informative set of pathways in comparison with existing methods. In addition, an alternative pathway to produce lysine was identified using a detour via propionyl-CoA, which shows the predictive power of our novel approach. The source code in C++ is available upon request.
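The optimization-based flavour of computing generating flux vectors can be illustrated, under strong simplifying assumptions, with a toy linear program over a three-reaction network; the network, the normalization constraint and the use of scipy.optimize.linprog are hypothetical stand-ins, not the authors' actual formulation for minimal generating sets of EFMs.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: R1: -> A, R2: A -> B, R3: B ->  (internal metabolites A, B at steady state).
S = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0]])

n_rxn = S.shape[1]
# Steady state S v = 0, plus the normalization v1 = 1 to pick one ray of the flux cone.
A_eq = np.vstack([S, np.eye(1, n_rxn)])
b_eq = np.append(np.zeros(S.shape[0]), 1.0)

# Minimize total flux to favour a support-minimal (EFM-like) solution.
res = linprog(c=np.ones(n_rxn), A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * n_rxn, method="highs")
print("flux mode:", res.x)   # expected: [1, 1, 1]
```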
The emerging High Efficiency Video Coding standard (HEVC)
NASA Astrophysics Data System (ADS)
Raja, Gulistan; Khan, Awais
2013-12-01
High definition video (HDV) is becoming popular day by day. This paper describes a performance analysis of the latest video coding standard, known as High Efficiency Video Coding (HEVC). HEVC is designed to fulfil all the requirements for future high definition videos. In this paper, three configurations (intra only, low delay and random access) of HEVC are analyzed using various 480p, 720p and 1080p high definition test video sequences. Simulation results show the superior objective and subjective quality of HEVC.
Pattern-based integer sample motion search strategies in the context of HEVC
NASA Astrophysics Data System (ADS)
Maier, Georg; Bross, Benjamin; Grois, Dan; Marpe, Detlev; Schwarz, Heiko; Veltkamp, Remco C.; Wiegand, Thomas
2015-09-01
The H.265/MPEG-H High Efficiency Video Coding (HEVC) standard provides a significant increase in coding efficiency compared to its predecessor, the H.264/MPEG-4 Advanced Video Coding (AVC) standard, which however comes at the cost of a high computational burden for a compliant encoder. Motion estimation (ME), which is a part of the inter-picture prediction process, typically consumes a high amount of computational resources, while significantly increasing the coding efficiency. In spite of the fact that both the H.265/MPEG-H HEVC and H.264/MPEG-4 AVC standards allow processing motion information on a fractional sample level, motion search algorithms operating at the integer sample level remain an integral part of ME. In this paper, a flexible integer sample ME framework is proposed, allowing a significant reduction of ME computation time to be traded off against a coding efficiency penalty in terms of bit rate overhead. As a result, through extensive experimentation, an integer sample ME algorithm that provides a good trade-off is derived, incorporating a combination and optimization of known predictive, pattern-based and early termination techniques. The proposed ME framework is implemented on the basis of the HEVC Test Model (HM) reference software and compared to the state-of-the-art fast search algorithm that is a native part of HM. It is observed that for high resolution sequences, the integer sample ME process can be sped up by factors varying from 3.2 to 7.6, resulting in a bit-rate overhead of 1.5% and 0.6% for Random Access (RA) and Low Delay P (LDP) configurations, respectively. In addition, a similar speed-up is observed for sequences with mainly Computer-Generated Imagery (CGI) content, at a bit rate overhead of up to 5.2%.
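For readers unfamiliar with pattern-based integer-sample motion search, the following sketch shows a small-diamond-pattern search that greedily minimizes the sum of absolute differences (SAD); the block size, pattern and termination rule are illustrative simplifications, not the HM algorithm or the framework proposed in the paper.

```python
import numpy as np

def sad(cur, ref, x, y, bs):
    # Sum of absolute differences between the current block and a candidate reference block.
    return np.abs(cur.astype(int) - ref[y:y + bs, x:x + bs].astype(int)).sum()

def small_diamond_search(cur_block, ref, x0, y0, bs=16):
    """Integer-sample ME: move the search centre along a small diamond until no improvement."""
    h, w = ref.shape
    best = (x0, y0)
    best_cost = sad(cur_block, ref, x0, y0, bs)
    while True:
        improved = False
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):   # small diamond pattern
            x, y = best[0] + dx, best[1] + dy
            if 0 <= x <= w - bs and 0 <= y <= h - bs:
                c = sad(cur_block, ref, x, y, bs)
                if c < best_cost:
                    best, best_cost, improved = (x, y), c, True
        if not improved:
            return best, best_cost

# Illustrative usage with synthetic frames.
ref = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
cur_block = ref[10:26, 12:28].copy()          # block taken from position (12, 10) in the reference
print(small_diamond_search(cur_block, ref, 8, 8))
```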
Molecular dissection of nutrient exchange at the insect-microbial interface.
Douglas, Angela E
2014-10-01
Genome research is transforming our understanding of nutrient exchange between insects and intracellular bacteria. A key characteristic of these bacteria is their small genome size and gene content. Their fastidious and inflexible nutritional requirements are met by multiple metabolites from the insect host cell. Although the bacteria have generally retained genes coding the synthesis of nutrients required by the insect, some apparently critical genes have been lost, and compensated for by shared metabolic pathways with the insect host or supplementary bacteria with complementary metabolic capabilities. Copyright © 2014 Elsevier Inc. All rights reserved.
New Bandwidth Efficient Parallel Concatenated Coding Schemes
NASA Technical Reports Server (NTRS)
Benedetto, S.; Divsalar, D.; Montorsi, G.; Pollara, F.
1996-01-01
We propose a new solution to parallel concatenation of trellis codes with multilevel amplitude/phase modulations and a suitable iterative decoding structure. Examples are given for a throughput of 2 bits/sec/Hz with 8PSK and 16QAM signal constellations.
Implementation of Premixed Equilibrium Chemistry Capability in OVERFLOW
NASA Technical Reports Server (NTRS)
Olsen, M. E.; Liu, Y.; Vinokur, M.; Olsen, T.
2003-01-01
An implementation of premixed equilibrium chemistry has been completed for the OVERFLOW code, a chimera capable, complex geometry flow code widely used to predict transonic flowfields. The implementation builds on the computational efficiency and geometric generality of the solver.
Implementation of Premixed Equilibrium Chemistry Capability in OVERFLOW
NASA Technical Reports Server (NTRS)
Olsen, Mike E.; Liu, Yen; Vinokur, M.; Olsen, Tom
2004-01-01
An implementation of premixed equilibrium chemistry has been completed for the OVERFLOW code, a chimera capable, complex geometry flow code widely used to predict transonic flowfields. The implementation builds on the computational efficiency and geometric generality of the solver.
NASA Astrophysics Data System (ADS)
Huang, Han-Xiong; Ruan, Xi-Chao; Chen, Guo-Chang; Zhou, Zu-Ying; Li, Xia; Bao, Jie; Nie, Yang-Bo; Zhong, Qi-Ping
2009-08-01
The light output function of a φ50.8 mm × 50.8 mm BC501A scintillation detector was measured in the neutron energy region of 1 to 30 MeV by fitting the pulse height (PH) spectra for neutrons with the simulations from the NRESP code at the edge range. Using the new light output function, the neutron detection efficiency was determined with two Monte-Carlo codes, NEFF and SCINFUL. The calculated efficiency was corrected by comparing the simulated PH spectra with the measured ones. The determined efficiency was verified in the near-threshold region and normalized with a Proton-Recoil-Telescope (PRT) in the 8-14 MeV energy region.
Igamberdiev, A U
1999-04-01
Biological organization is based on the coherent energy transfer allowing for macromolecules to operate with high efficiency and realize computation. Computation is executed with virtually 100% efficiency via the coherent operation of molecular machines in which low-energy recognitions trigger energy-driven non-equilibrium dynamic processes. The recognition process is of quantum mechanical nature being a non-demolition measurement. It underlies the enzymatic conversion of a substrate into the product (an elementary metabolic phenomenon); the switching via separation of the direct and reverse routes in futile cycles provides the generation and complication of metabolic networks (coherence within cycles is maintained by the supramolecular organization of enzymes); the genetic level corresponding to the appearance of digital information is based on reflective arrows (catalysts realize their own self-reproduction) and operation of hypercycles. Every metabolic cycle via reciprocal regulation of both its halves can generate rhythms and spatial structures (resulting from the temporally organized depositions from the cycles). Via coherent events which percolate from the elementary submolecular level to organismic entities, self-assembly based on the molecular complementarity is realized and the dynamic informational field operating within the metabolic network is generated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harvey, R. W.; Petrov, Yu. V.
2013-12-03
Within the US Department of Energy/Office of Fusion Energy magnetic fusion research program, there is an important whole-plasma-modeling need for a radio-frequency/neutral-beam-injection (RF/NBI) transport-oriented finite-difference Fokker-Planck (FP) code with combined capabilities for 4D (2R2V) geometry near the fusion plasma periphery, and computationally less demanding 3D (1R2V) bounce-averaged capabilities for plasma in the core of fusion devices. Proof-of-principle achievement of this goal has been demonstrated in research carried out under Phase I of the SBIR award. Two DOE-sponsored codes, the CQL3D bounce-average Fokker-Planck code in which CompX has specialized, and the COGENT 4D plasma edge-oriented Fokker-Planck code, which has been constructed by Lawrence Livermore National Laboratory and Lawrence Berkeley Laboratory scientists, were coupled. Coupling was achieved by using CQL3D-calculated velocity distributions, including an energetic tail resulting from NBI, as boundary conditions for the COGENT code over the two-dimensional velocity space on a spatial interface (flux) surface at a given radius near the plasma periphery. The finite-orbit-width fast ions from the CQL3D distributions penetrated into the peripheral plasma modeled by the COGENT code. This combined code demonstrates the feasibility of the proposed 3D/4D code. By combining these codes, the greatest computational efficiency is achieved subject to present modeling needs in toroidally symmetric magnetic fusion devices. The more efficient 3D code can be used in its regions of applicability, coupled to the more computationally demanding 4D code in higher collisionality edge plasma regions where that extended capability is necessary for accurate representation of the plasma. More efficient code leads to greater use and utility of the model. An ancillary aim of the project is to make the combined 3D/4D code user friendly. Achievement of full coupling of these two Fokker-Planck codes will advance computational modeling of plasma devices important to the USDOE magnetic fusion energy program, in particular the DIII-D tokamak at General Atomics, San Diego, the NSTX spherical tokamak at Princeton, New Jersey, and the MST reversed-field pinch in Madison, Wisconsin. The validation studies of the code against the experiments will improve understanding of physics important for magnetic fusion, and will increase our design capabilities for achieving the goals of the International Tokamak Experimental Reactor (ITER) project in which the US is a participant and which seeks to demonstrate at least a factor of five in fusion power production divided by input power.
The Marriage of Residential Energy Codes and Rating Systems: Conflict Resolution or Just Conflict?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, Zachary T.; Mendon, Vrushali V.
2014-08-21
After three decades of coexistence at a distance, model residential energy codes and residential energy rating systems have come together in the 2015 International Energy Conservation Code. At the October, 2013, International Code Council’s Public Comment Hearing, a new compliance path based on an Energy Rating Index was added to the IECC. Although not specifically named in the code, RESNET’s HERS rating system is the likely candidate Index for most jurisdictions. While HERS has been a mainstay in various beyond-code programs for many years, its direct incorporation into the most popular model energy code raises questions about the equivalence of a HERS-based compliance path and the traditional IECC performance compliance path, especially because the two approaches use different efficiency metrics, are governed by different simulation rules, and have different scopes with regard to energy impacting house features. A detailed simulation analysis of more than 15,000 house configurations reveals a very large range of HERS Index values that achieve equivalence with the IECC’s performance path. This paper summarizes the results of that analysis and evaluates those results against the specific Energy Rating Index values required by the 2015 IECC. Based on the home characteristics most likely to result in disparities between HERS-based compliance and performance path compliance, potential impacts on the compliance process, state and local adoption of the new code, energy efficiency in the next generation of homes subject to this new code, and future evolution of model code formats are discussed.
Ghorbaniaghdam, Atefeh; Chen, Jingkui; Henry, Olivier; Jolicoeur, Mario
2014-01-01
Monoclonal antibody producing Chinese hamster ovary (CHO) cells have been shown to undergo metabolic changes when engineered to produce high titers of recombinant proteins. In this work, we have studied the distinct metabolism of CHO cell clones harboring an efficient inducible expression system, based on the cumate gene switch, and displaying different expression levels, high and low productivities, compared to that of the parental cells from which they were derived. A kinetic model for CHO cell metabolism was further developed to include metabolic regulation. Model calibration was performed using intracellular and extracellular metabolite profiles obtained from shake flask batch cultures. Model simulations of intracellular fluxes and ratios known as biomarkers revealed significant changes correlated with clonal variation but not to the recombinant protein expression level. Metabolic flux distribution mostly differs in the reactions involving pyruvate metabolism, with an increased net flux of pyruvate into the tricarboxylic acid (TCA) cycle in the high-producer clone, either being induced or non-induced with cumate. More specifically, CHO cell metabolism in this clone was characterized by an efficient utilization of glucose and a high pyruvate dehydrogenase flux. Moreover, the high-producer clone shows a high rate of anaplerosis from pyruvate to oxaloacetate, through pyruvate carboxylase and from glutamate to α-ketoglutarate, through glutamate dehydrogenase, and a reduced rate of cataplerosis from malate to pyruvate, through malic enzyme. Indeed, the increase of flux through pyruvate carboxylase was not driven by an increased anabolic demand. It is in fact linked to an increase of the TCA cycle global flux, which allows better regulation of higher redox and more efficient metabolic states. To the best of our knowledge, this is the first time a dynamic in silico platform is proposed to analyze and compare the metabolomic behavior of different CHO clones. PMID:24632968
Nasr Esfahani, Maryam; Kusano, Miyako; Nguyen, Kien Huu; Watanabe, Yasuko; Ha, Chien Van; Saito, Kazuki; Sulieman, Saad; Herrera-Estrella, Luis; Tran, L S
2016-08-09
Low inorganic phosphate (Pi) availability is a major constraint for efficient nitrogen fixation in legumes, including chickpea. To elucidate the mechanisms involved in nodule acclimation to low Pi availability, two Mesorhizobium-chickpea associations exhibiting differential symbiotic performances, Mesorhizobium ciceri CP-31 (McCP-31)-chickpea and Mesorhizobium mediterranum SWRI9 (MmSWRI9)-chickpea, were comprehensively studied under both control and low Pi conditions. MmSWRI9-chickpea showed a lower symbiotic efficiency under low Pi availability than McCP-31-chickpea, as evidenced by reduced growth parameters and down-regulation of nifD and nifK. These differences can be attributed to a decline in Pi level in MmSWRI9-induced nodules under low Pi stress, which coincided with up-regulation of several key Pi starvation-responsive genes and with accumulation of asparagine in nodules; the levels of identified amino acids in Pi-deficient leaves of MmSWRI9-inoculated plants exceeded the shoot nitrogen requirement during Pi starvation, indicative of nitrogen feedback inhibition. Conversely, Pi levels increased in nodules of Pi-stressed McCP-31-inoculated plants, because these plants evolved various metabolic and biochemical strategies to maintain nodular Pi homeostasis under Pi deficiency. These adaptations involve the activation of alternative pathways of carbon metabolism, enhanced production and exudation of organic acids from roots into the rhizosphere, and the ability to protect nodule metabolism against Pi deficiency-induced oxidative stress. Collectively, the adaptation of symbiotic efficiency under Pi deficiency resulted from highly coordinated processes with an extensive reprogramming of whole-plant metabolism. The findings of this study will enable us to design effective breeding and genetic engineering strategies to enhance symbiotic efficiency in legume crops.
The Sensitivity of Adverse Event Cost Estimates to Diagnostic Coding Error
Wardle, Gavin; Wodchis, Walter P; Laporte, Audrey; Anderson, Geoffrey M; Baker, Ross G
2012-01-01
Objective: To examine the impact of diagnostic coding error on estimates of hospital costs attributable to adverse events. Data Sources: Original and reabstracted medical records of 9,670 complex medical and surgical admissions at 11 hospital corporations in Ontario from 2002 to 2004. Patient-specific costs, not including physician payments, were retrieved from the Ontario Case Costing Initiative database. Study Design: Adverse events were identified among the original and reabstracted records using ICD10-CA (Canadian adaptation of ICD10) codes flagged as postadmission complications. Propensity score matching and multivariate regression analysis were used to estimate the cost of the adverse events and to determine the sensitivity of cost estimates to diagnostic coding error. Principal Findings: Estimates of the cost of the adverse events ranged from $16,008 (metabolic derangement) to $30,176 (upper gastrointestinal bleeding). Coding errors caused the total cost attributable to the adverse events to be underestimated by 16 percent. The impact of coding error on adverse event cost estimates was highly variable at the organizational level. Conclusions: Estimates of adverse event costs are highly sensitive to coding error. Adverse event costs may be significantly underestimated if the likelihood of error is ignored. PMID:22091908
Metagenomics reveals flavour metabolic network of cereal vinegar microbiota.
Wu, Lin-Huan; Lu, Zhen-Ming; Zhang, Xiao-Juan; Wang, Zong-Min; Yu, Yong-Jian; Shi, Jin-Song; Xu, Zheng-Hong
2017-04-01
Multispecies microbial community formed through centuries of repeated batch acetic acid fermentation (AAF) is crucial for the flavour quality of traditional vinegar produced from cereals. However, the metabolism to generate and/or formulate the essential flavours by the multispecies microbial community is hardly understood. Here we used metagenomic approach to clarify in situ metabolic network of key microbes responsible for flavour synthesis of a typical cereal vinegar, Zhenjiang aromatic vinegar, produced by solid-state fermentation. First, we identified 3 organic acids, 7 amino acids, and 20 volatiles as dominant vinegar metabolites. Second, we revealed taxonomic and functional composition of the microbiota by metagenomic shotgun sequencing. A total of 86 201 predicted protein-coding genes from 35 phyla (951 genera) were involved in Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways of Metabolism (42.3%), Genetic Information Processing (28.3%), and Environmental Information Processing (10.1%). Furthermore, a metabolic network for substrate breakdown and dominant flavour formation in vinegar microbiota was constructed, and microbial distribution discrepancy in different metabolic pathways was charted. This study helps elucidating different metabolic roles of microbes during flavour formation in vinegar microbiota. Copyright © 2016 Elsevier Ltd. All rights reserved.
Probing soil C metabolism in response to temperature: results from experiments and modeling
NASA Astrophysics Data System (ADS)
Dijkstra, P.; Dalder, J.; Blankinship, J.; Selmants, P. C.; Schwartz, E.; Koch, G. W.; Hart, S.; Hungate, B. A.
2010-12-01
C use efficiency (CUE) is one of the least understood aspects of soil C cycling, has a very large effect on soil respiration and C sequestration, and decreases with elevated temperature. CUE is directly related to substrate partitioning over energy production and biosynthesis. The production of energy and metabolic precursors occurs in well-known processes such as glycolysis and Krebs cycle. We have developed a new stable isotope approach using position-specific 13C-labeled metabolic tracers to measure these fundamental metabolic processes in intact soil communities (1). We use this new approach, combined with models of soil metabolic flux patterns, to analyze the response of microbial energy production, biosynthesis, and CUE to temperature. The method consists of adding small but precise amounts of position-specific 13C -labeled metabolic tracers to parallel soil incubations, in this case 1-13C and 2,3-13C pyruvate and 1-13C and U-13C glucose. The measurement of CO2 released from the labeled tracers is used to calculate the C flux rates through various metabolic pathways. A simplified metabolic model consisting of 23 reactions is iteratively solved using results of the metabolic tracer experiments and information on microbial precursor demand under different temperatures. This new method enables direct study of fundamental aspects of microbial energy production, C use efficiency, and soil organic matter formation in response to temperature. (1) Dijkstra P, Blankinship JC, Selmants PC, Hart SC, Koch GW, Schwarz E and Hungate BA. Probing metabolic flux patterns of soil microbial communities using parallel position-specific tracer labeling. Soil Biology and Biochemistry (accepted)
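The carbon use efficiency quantity at the centre of this approach reduces to a partitioning ratio between biosynthesis and respiration; a minimal sketch with entirely hypothetical flux values is shown below.

```python
def carbon_use_efficiency(c_biosynthesis, c_respired):
    """CUE = C allocated to biosynthesis / total C taken up (biosynthesis + respired CO2)."""
    return c_biosynthesis / (c_biosynthesis + c_respired)

# Hypothetical tracer-derived C fluxes (e.g. umol C per g soil per hour), not data from the study.
print(carbon_use_efficiency(c_biosynthesis=0.8, c_respired=1.2))   # -> 0.4
```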
Cloning and molecular evolution of the aldehyde dehydrogenase 2 gene (Aldh2) in bats (Chiroptera).
Chen, Yao; Shen, Bin; Zhang, Junpeng; Jones, Gareth; He, Guimei
2013-02-01
Old World fruit bats (Pteropodidae) and New World fruit bats (Phyllostomidae) ingest significant quantities of ethanol while foraging. Mitochondrial aldehyde dehydrogenase (ALDH2, encoded by the Aldh2 gene) plays an important role in ethanol metabolism. To test whether the Aldh2 gene has undergone adaptive evolution in frugivorous and nectarivorous bats in relation to ethanol elimination, we sequenced part of the coding region of the gene (1,143 bp, ~73 % coverage) in 14 bat species, including three Old World fruit bats and two New World fruit bats. Our results showed that the Aldh2 coding sequences are highly conserved across all bat species we examined, and no evidence of positive selection was detected in the ancestral branches leading to Old World fruit bats and New World fruit bats. Further research is needed to determine whether other genes involved in ethanol metabolism have been the targets of positive selection in frugivorous and nectarivorous bats.
Hanafy, Radwa A; Couger, M B; Baker, Kristina; Murphy, Chelsea; O'Kane, Shannon D; Budd, Connie; French, Donald P; Hoff, Wouter D; Youssef, Noha
2016-09-01
Micrococcus luteus is a predominant member of skin microbiome. We here report on the genomic analysis of Micrococcus luteus strain O'Kane that was isolated from an elevator. The partial genome assembly of Micrococcus luteus strain O'Kane is 2.5 Mb with 2256 protein-coding genes and 62 RNA genes. Genomic analysis revealed metabolic versatility with genes involved in the metabolism and transport of glucose, galactose, fructose, mannose, alanine, aspartate, asparagine, glutamate, glutamine, glycine, serine, cysteine, methionine, arginine, proline, histidine, phenylalanine, and fatty acids. Genomic comparison to other M. luteus representatives identified the potential to degrade polyhydroxybutyrates, as well as several antibiotic resistance genes absent from other genomes.
Transversal Clifford gates on folded surface codes
Moussa, Jonathan E.
2016-10-12
Surface and color codes are two forms of topological quantum error correction in two spatial dimensions with complementary properties. Surface codes have lower-depth error detection circuits and well-developed decoders to interpret and correct errors, while color codes have transversal Clifford gates and better code efficiency in the number of physical qubits needed to achieve a given code distance. A formal equivalence exists between color codes and folded surface codes, but it does not guarantee the transferability of any of these favorable properties. However, the equivalence does imply the existence of constant-depth circuit implementations of logical Clifford gates on folded surface codes. We achieve and improve this result by constructing two families of folded surface codes with transversal Clifford gates. This construction is presented generally for qudits of any dimension. Lastly, the specific application of these codes to universal quantum computation based on qubit fusion is also discussed.
Evaluation of the efficiency and reliability of software generated by code generators
NASA Technical Reports Server (NTRS)
Schreur, Barbara
1994-01-01
There are numerous studies which show that CASE Tools greatly facilitate software development. As a result of these advantages, an increasing amount of software development is done with CASE Tools. As more software engineers become proficient with these tools, their experience and feedback lead to further development with the tools themselves. What has not been widely studied, however, is the reliability and efficiency of the actual code produced by the CASE Tools. This investigation considered these matters. Three segments of code generated by MATRIXx, one of many commercially available CASE Tools, were chosen for analysis: ETOFLIGHT, a portion of the Earth to Orbit Flight software, and ECLSS and PFMC, modules for Environmental Control and Life Support System and Pump Fan Motor Control, respectively.
Global magnetosphere simulations using constrained-transport Hall-MHD with CWENO reconstruction
NASA Astrophysics Data System (ADS)
Lin, L.; Germaschewski, K.; Maynard, K. M.; Abbott, S.; Bhattacharjee, A.; Raeder, J.
2013-12-01
We present a new CWENO (Centrally-Weighted Essentially Non-Oscillatory) reconstruction based MHD solver for the OpenGGCM global magnetosphere code. The solver was built using libMRC, a library for creating efficient parallel PDE solvers on structured grids. The use of libMRC gives us access to its core functionality of providing an automated code generation framework which takes a user provided PDE right hand side in symbolic form to generate an efficient, computer architecture specific, parallel code. libMRC also supports block-structured adaptive mesh refinement and implicit-time stepping through integration with the PETSc library. We validate the new CWENO Hall-MHD solver against existing solvers both in standard test problems as well as in global magnetosphere simulations.
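A flavour of weighted essentially non-oscillatory reconstruction, in the spirit of (though simpler than) the CWENO scheme named above, is the standard third-order WENO interface reconstruction sketched below; the coefficients are the textbook WENO3 choices and are assumed here purely for illustration, not taken from the OpenGGCM solver.

```python
import numpy as np

def weno3_left(u, eps=1e-6):
    """Third-order WENO reconstruction of the left state at interface i+1/2 from cell averages u."""
    um, u0, up = u[:-2], u[1:-1], u[2:]            # u[i-1], u[i], u[i+1]
    p0 = -0.5 * um + 1.5 * u0                      # candidate from stencil {i-1, i}
    p1 = 0.5 * u0 + 0.5 * up                       # candidate from stencil {i, i+1}
    b0 = (u0 - um) ** 2                            # smoothness indicators
    b1 = (up - u0) ** 2
    a0 = (1.0 / 3.0) / (eps + b0) ** 2             # ideal (linear) weights 1/3 and 2/3
    a1 = (2.0 / 3.0) / (eps + b1) ** 2
    w0 = a0 / (a0 + a1)
    return w0 * p0 + (1.0 - w0) * p1

# On smooth data the nonlinear weights stay close to the ideal linear combination.
x = np.linspace(0, 1, 11)
print(weno3_left(np.sin(2 * np.pi * x)))
```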
Star adaptation for two algorithms used on serial computers
NASA Technical Reports Server (NTRS)
Howser, L. M.; Lambiotte, J. J., Jr.
1974-01-01
Two representative algorithms used on a serial computer and presently executed on the Control Data Corporation 6000 computer were adapted to execute efficiently on the Control Data STAR-100 computer. Gaussian elimination for the solution of simultaneous linear equations and the Gauss-Legendre quadrature formula for the approximation of an integral are the two algorithms discussed. A description is given of how the programs were adapted for STAR and why these adaptations were necessary to obtain an efficient STAR program. Some points to consider when adapting an algorithm for STAR are discussed. Program listings of the 6000 version coded in 6000 FORTRAN, the adapted STAR version coded in 6000 FORTRAN, and the STAR version coded in STAR FORTRAN are presented in the appendices.
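Of the two algorithms discussed, Gauss-Legendre quadrature is easy to express in the vector-friendly form that a STAR-like machine favours; the NumPy routine and the example integrand below are assumptions for illustration, not the original FORTRAN listings from the appendices.

```python
import numpy as np

def gauss_legendre(f, a, b, n=8):
    """Approximate the integral of f over [a, b] with an n-point Gauss-Legendre rule."""
    nodes, weights = np.polynomial.legendre.leggauss(n)   # nodes/weights on [-1, 1]
    x = 0.5 * (b - a) * nodes + 0.5 * (a + b)             # map nodes to [a, b]
    return 0.5 * (b - a) * np.dot(weights, f(x))          # one vectorized dot product

print(gauss_legendre(np.sin, 0.0, np.pi))   # exact value is 2
```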
A Network Coding Based Hybrid ARQ Protocol for Underwater Acoustic Sensor Networks
Wang, Hao; Wang, Shilian; Zhang, Eryang; Zou, Jianbin
2016-01-01
Underwater Acoustic Sensor Networks (UASNs) have attracted increasing interest in recent years due to their extensive commercial and military applications. However, the harsh underwater channel causes many challenges for the design of reliable underwater data transport protocol. In this paper, we propose an energy efficient data transport protocol based on network coding and hybrid automatic repeat request (NCHARQ) to ensure reliability, efficiency and availability in UASNs. Moreover, an adaptive window length estimation algorithm is designed to optimize the throughput and energy consumption tradeoff. The algorithm can adaptively change the code rate and can be insensitive to the environment change. Extensive simulations and analysis show that NCHARQ significantly reduces energy consumption with short end-to-end delay. PMID:27618044
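A minimal sketch of the network-coding idea underlying an NC-HARQ scheme: rather than retransmitting each lost packet, the sender transmits an XOR of the packets in a window, from which a receiver missing any single packet can recover it. The window size and packet contents are illustrative assumptions, not the protocol specified in the paper.

```python
from functools import reduce

def xor_packets(packets):
    # Bitwise XOR of equal-length packets yields one coded (parity) packet.
    return bytes(reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), packets))

# Sender: a window of 3 data packets plus one coded packet.
window = [b"pkt-A", b"pkt-B", b"pkt-C"]
coded = xor_packets(window)

# Receiver: packet B was lost; recover it from the coded packet and the packets received.
received = [window[0], window[2]]
recovered = xor_packets(received + [coded])
assert recovered == b"pkt-B"
print(recovered)
```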
Development of an Automatic Differentiation Version of the FPX Rotor Code
NASA Technical Reports Server (NTRS)
Hu, Hong
1996-01-01
The ADIFOR2.0 automatic differentiator is applied to the FPX rotor code along with the grid generator GRGN3. The FPX is an eXtended Full-Potential CFD code for rotor calculations. The automatic differentiation version of the code is obtained, which provides both non-geometry and geometry sensitivity derivatives. The sensitivity derivatives via automatic differentiation are presented and compared with divided difference generated derivatives. The study shows that automatic differentiation method gives accurate derivative values in an efficient manner.
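ADIFOR-style source transformation of Fortran is hard to show in a few lines, but the underlying forward-mode idea of propagating derivatives alongside values can be sketched with dual numbers; this is an illustrative Python analogue, not the ADIFOR2.0 mechanism or the FPX rotor code itself.

```python
class Dual:
    """Value plus derivative, propagated through arithmetic (forward-mode AD)."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1      # example function; f'(x) = 6x + 2

x = Dual(2.0, 1.0)                    # seed derivative dx/dx = 1
y = f(x)
print(y.val, y.der)                   # 17.0 14.0
```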
Recent update of the RPLUS2D/3D codes
NASA Technical Reports Server (NTRS)
Tsai, Y.-L. Peter
1991-01-01
The development of the RPLUS2D/3D codes is summarized. These codes utilize LU algorithms to solve chemical non-equilibrium flows in a body-fitted coordinate system. The motivation behind the development of these codes is the need to numerically predict chemical non-equilibrium flows for the National AeroSpace Plane Program. Recent improvements include vectorization method, blocking algorithms for geometric flexibility, out-of-core storage for large-size problems, and an LU-SW/UP combination for CPU-time efficiency and solution quality.
Coding for Efficient Image Transmission
NASA Technical Reports Server (NTRS)
Rice, R. F.; Lee, J. J.
1986-01-01
NASA publication second in series on data-coding techniques for noiseless channels. Techniques used even in noisy channels, provided data further processed with Reed-Solomon or other error-correcting code. Techniques discussed in context of transmission of monochrome imagery from Voyager II spacecraft but applicable to other streams of data. Objective of this type coding to "compress" data; that is, to transmit using as few bits as possible by omitting as much as possible of portion of information repeated in subsequent samples (or picture elements).
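A minimal sketch of the Rice/Golomb style of noiseless coding referred to above: each non-negative residual is split into a unary-coded quotient and a k-bit remainder, so small (frequent) values receive short codewords. The parameter k and the toy residuals are illustrative assumptions, not the Voyager coding parameters.

```python
def rice_encode(value, k):
    """Golomb-Rice codeword for a non-negative integer: unary quotient + k-bit remainder."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + format(r, "0{}b".format(k))

# Small prediction residuals compress well; large ones remain representable.
for v in (0, 1, 2, 5, 13):
    print(v, rice_encode(v, k=2))
```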
Russ, Daniel E; Ho, Kwan-Yuet; Colt, Joanne S; Armenti, Karla R; Baris, Dalsu; Chow, Wong-Ho; Davis, Faith; Johnson, Alison; Purdue, Mark P; Karagas, Margaret R; Schwartz, Kendra; Schwenn, Molly; Silverman, Debra T; Johnson, Calvin A; Friesen, Melissa C
2016-06-01
Mapping job titles to standardised occupation classification (SOC) codes is an important step in identifying occupational risk factors in epidemiological studies. Because manual coding is time-consuming and has moderate reliability, we developed an algorithm called SOCcer (Standardized Occupation Coding for Computer-assisted Epidemiologic Research) to assign SOC-2010 codes based on free-text job description components. Job title and task-based classifiers were developed by comparing job descriptions to multiple sources linking job and task descriptions to SOC codes. An industry-based classifier was developed based on the SOC prevalence within an industry. These classifiers were used in a logistic model trained using 14 983 jobs with expert-assigned SOC codes to obtain empirical weights for an algorithm that scored each SOC/job description. We assigned the highest scoring SOC code to each job. SOCcer was validated in 2 occupational data sources by comparing SOC codes obtained from SOCcer to expert assigned SOC codes and lead exposure estimates obtained by linking SOC codes to a job-exposure matrix. For 11 991 case-control study jobs, SOCcer-assigned codes agreed with 44.5% and 76.3% of manually assigned codes at the 6-digit and 2-digit level, respectively. Agreement increased with the score, providing a mechanism to identify assignments needing review. Good agreement was observed between lead estimates based on SOCcer and manual SOC assignments (κ 0.6-0.8). Poorer performance was observed for inspection job descriptions, which included abbreviations and worksite-specific terminology. Although some manual coding will remain necessary, using SOCcer may improve the efficiency of incorporating occupation into large-scale epidemiological studies. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
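The scoring step described above, combining job-title, task and industry classifier outputs in a logistic model and assigning the highest-scoring SOC code, might be sketched as follows; the weights, intercept, feature values and candidate codes are hypothetical, not SOCcer's trained parameters.

```python
import math

WEIGHTS = {"title": 2.0, "task": 1.2, "industry": 0.8}   # hypothetical trained weights
INTERCEPT = -3.0

def soc_score(features):
    """Logistic score for one (job description, SOC code) pair from classifier outputs in [0, 1]."""
    z = INTERCEPT + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Candidate SOC codes with hypothetical classifier outputs for a single job description.
candidates = {
    "47-2061": {"title": 0.9, "task": 0.7, "industry": 0.6},
    "53-7062": {"title": 0.4, "task": 0.5, "industry": 0.6},
}
scored = {soc: soc_score(f) for soc, f in candidates.items()}
best = max(scored, key=scored.get)
print(best, round(scored[best], 3))   # the highest-scoring code is assigned to the job
```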
Compress compound images in H.264/MPGE-4 AVC by exploiting spatial correlation.
Lan, Cuiling; Shi, Guangming; Wu, Feng
2010-04-01
Compound images are a combination of text, graphics and natural image. They present strong anisotropic features, especially on the text and graphics parts. These anisotropic features often render conventional compression inefficient. Thus, this paper proposes a novel coding scheme from the H.264 intraframe coding. In the scheme, two new intramodes are developed to better exploit spatial correlation in compound images. The first is the residual scalar quantization (RSQ) mode, where intrapredicted residues are directly quantized and coded without transform. The second is the base colors and index map (BCIM) mode that can be viewed as an adaptive color quantization. In this mode, an image block is represented by several representative colors, referred to as base colors, and an index map to compress. Every block selects its coding mode from two new modes and the previous intramodes in H.264 by rate-distortion optimization (RDO). Experimental results show that the proposed scheme improves the coding efficiency even more than 10 dB at most bit rates for compound images and keeps a comparable efficient performance to H.264 for natural images.
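A minimal sketch of the base-colors-and-index-map (BCIM) idea, treated here as a small k-means palette quantization of an image block; the number of base colors, the iteration count and the random test block are illustrative assumptions rather than the H.264 mode design.

```python
import numpy as np

def base_colors_index_map(block, n_colors=4, iters=10, seed=0):
    """Represent an HxWx3 block by n_colors base colors plus an HxW index map (naive k-means)."""
    h, w, _ = block.shape
    pixels = block.reshape(-1, 3).astype(float)
    rng = np.random.default_rng(seed)
    centers = pixels[rng.choice(len(pixels), n_colors, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        idx = d.argmin(axis=1)                       # index map: nearest base color per pixel
        for c in range(n_colors):
            if np.any(idx == c):
                centers[c] = pixels[idx == c].mean(axis=0)
    return centers.round().astype(np.uint8), idx.reshape(h, w)

block = np.random.randint(0, 256, (16, 16, 3), dtype=np.uint8)   # stand-in for a text/graphics block
base_colors, index_map = base_colors_index_map(block)
print(base_colors.shape, index_map.shape)   # (4, 3) (16, 16)
```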
A seismic data compression system using subband coding
NASA Technical Reports Server (NTRS)
Kiely, A. B.; Pollara, F.
1995-01-01
This article presents a study of seismic data compression techniques and a compression algorithm based on subband coding. The algorithm includes three stages: a decorrelation stage, a quantization stage that introduces a controlled amount of distortion to allow for high compression ratios, and a lossless entropy coding stage based on a simple but efficient arithmetic coding method. Subband coding methods are particularly suited to the decorrelation of nonstationary processes such as seismic events. Adaptivity to the nonstationary behavior of the waveform is achieved by dividing the data into separate blocks that are encoded separately with an adaptive arithmetic encoder. This is done with high efficiency due to the low overhead introduced by the arithmetic encoder in specifying its parameters. The technique could be used as a progressive transmission system, where successive refinements of the data can be requested by the user. This allows seismologists to first examine a coarse version of waveforms with minimal usage of the channel and then decide where refinements are required. Rate-distortion performance results are presented and comparisons are made with two block transform methods.
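The three-stage pipeline can be sketched, under simplifying assumptions, as a one-level Haar subband split, uniform quantization of the detail band, and a first-order entropy estimate standing in for the adaptive arithmetic coder; none of the filter, quantizer or entropy-coder choices below are those of the actual system.

```python
import numpy as np

def haar_split(x):
    # One-level subband decomposition: low-pass (averages) and high-pass (differences).
    x = x[: len(x) // 2 * 2]
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def quantize(band, step):
    return np.round(band / step).astype(int)      # controlled distortion via the step size

def entropy_bits(symbols):
    # First-order entropy in bits/symbol, a proxy for arithmetic-coder output length.
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

x = np.cumsum(np.random.randn(1024))              # synthetic nonstationary "seismic" trace
low, high = haar_split(x)
q = quantize(high, step=0.5)
print("detail-band entropy: %.2f bits/sample" % entropy_bits(q))
```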
Epoch of Reionization: An Investigation of the Semi-Analytic 21CMMC Code
NASA Astrophysics Data System (ADS)
Miller, Michelle
2018-01-01
After the Big Bang the universe was filled with neutral hydrogen that began to cool and collapse into the first structures. These first stars and galaxies began to emit radiation that eventually ionized all of the neutral hydrogen in the universe. 21CMMC is a semi-numerical code that takes simulated boxes of this ionized universe from another code called 21cmFAST. Mock measurements are taken from the simulated boxes in 21cmFAST. These measurements are fed into 21CMMC to constrain three major parameters of this simulated universe: virial temperature, mean free path, and ionization efficiency. My project tests the robustness of 21CMMC on universe simulations other than 21cmFAST to see whether 21CMMC can properly reconstruct early universe parameters given a mock “measurement” in the form of power spectra. We determine that while two of the three EoR parameters (virial temperature and ionization efficiency) can be reconstructed reasonably well, the mean free path parameter in the code is the least robust. This points to a need for further development of the 21CMMC code.
Mosheiff, Noga; Agmon, Haggai; Moriel, Avraham; Burak, Yoram
2017-06-01
Grid cells in the entorhinal cortex encode the position of an animal in its environment with spatially periodic tuning curves with different periodicities. Recent experiments established that these cells are functionally organized in discrete modules with uniform grid spacing. Here we develop a theory for efficient coding of position, which takes into account the temporal statistics of the animal's motion. The theory predicts a sharp decrease of module population sizes with grid spacing, in agreement with the trend seen in the experimental data. We identify a simple scheme for readout of the grid cell code by neural circuitry, that can match in accuracy the optimal Bayesian decoder. This readout scheme requires persistence over different timescales, depending on the grid cell module. Thus, we propose that the brain may employ an efficient representation of position which takes advantage of the spatiotemporal statistics of the encoded variable, in similarity to the principles that govern early sensory processing.
An Energy-Efficient Compressive Image Coding for Green Internet of Things (IoT).
Li, Ran; Duan, Xiaomeng; Li, Xu; He, Wei; Li, Yanling
2018-04-17
Aimed at a low-energy consumption of Green Internet of Things (IoT), this paper presents an energy-efficient compressive image coding scheme, which provides compressive encoder and real-time decoder according to Compressive Sensing (CS) theory. The compressive encoder adaptively measures each image block based on the block-based gradient field, which models the distribution of block sparse degree, and the real-time decoder linearly reconstructs each image block through a projection matrix, which is learned by Minimum Mean Square Error (MMSE) criterion. Both the encoder and decoder have a low computational complexity, so that they only consume a small amount of energy. Experimental results show that the proposed scheme not only has a low encoding and decoding complexity when compared with traditional methods, but it also provides good objective and subjective reconstruction qualities. In particular, it presents better time-distortion performance than JPEG. Therefore, the proposed compressive image coding is a potential energy-efficient scheme for Green IoT.
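A minimal sketch of the encoder/decoder split described above: each block is measured by a random matrix (low-complexity encoder) and reconstructed by a fixed linear projection at the decoder. The Gaussian measurement matrix, the pseudo-inverse reconstruction and the block size are illustrative assumptions; the paper learns its projection by an MMSE criterion and adapts the measurement rate per block.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 64, 32                                   # 8x8 block, measurement rate 0.5

phi = rng.standard_normal((m, n)) / np.sqrt(m)  # measurement matrix used by the encoder
proj = np.linalg.pinv(phi)                      # fixed linear reconstruction matrix at the decoder

block = rng.standard_normal(n)                  # stand-in for a vectorized image block
y = phi @ block                                 # low-complexity encoding: one matrix-vector product
x_hat = proj @ y                                # real-time linear decoding

# Error is large for this non-sparse toy block; real blocks compress far better.
print("relative error: %.3f" % (np.linalg.norm(block - x_hat) / np.linalg.norm(block)))
```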
Mosheiff, Noga; Agmon, Haggai; Moriel, Avraham
2017-01-01
Grid cells in the entorhinal cortex encode the position of an animal in its environment with spatially periodic tuning curves with different periodicities. Recent experiments established that these cells are functionally organized in discrete modules with uniform grid spacing. Here we develop a theory for efficient coding of position, which takes into account the temporal statistics of the animal’s motion. The theory predicts a sharp decrease of module population sizes with grid spacing, in agreement with the trend seen in the experimental data. We identify a simple scheme for readout of the grid cell code by neural circuitry, that can match in accuracy the optimal Bayesian decoder. This readout scheme requires persistence over different timescales, depending on the grid cell module. Thus, we propose that the brain may employ an efficient representation of position which takes advantage of the spatiotemporal statistics of the encoded variable, in similarity to the principles that govern early sensory processing. PMID:28628647
Tang, Shiming; Zhang, Yimeng; Li, Zhihao; Li, Ming; Liu, Fang; Jiang, Hongfei; Lee, Tai Sing
2018-04-26
One general principle of sensory information processing is that the brain must optimize efficiency by reducing the number of neurons that process the same information. The sparseness of the sensory representations in a population of neurons reflects the efficiency of the neural code. Here, we employ large-scale two-photon calcium imaging to examine the responses of a large population of neurons within the superficial layers of area V1 with single-cell resolution, while simultaneously presenting a large set of natural visual stimuli, to provide the first direct measure of the population sparseness in awake primates. The results show that only 0.5% of neurons respond strongly to any given natural image - indicating a ten-fold increase in the inferred sparseness over previous measurements. These population activities are nevertheless necessary and sufficient to discriminate visual stimuli with high accuracy, suggesting that the neural code in the primary visual cortex is both super-sparse and highly efficient. © 2018, Tang et al.
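The population-sparseness figure quoted above, the fraction of neurons responding strongly to a given image, can be computed in one line from a stimulus-by-neuron response matrix; the synthetic responses and threshold below are illustrative stand-ins for calcium-imaging data.

```python
import numpy as np

def strongly_responsive_fraction(responses, threshold):
    """Mean fraction of neurons whose response exceeds threshold, averaged over stimuli.

    responses: array of shape (n_stimuli, n_neurons).
    """
    return float((responses > threshold).mean())

rng = np.random.default_rng(0)
responses = rng.exponential(scale=1.0, size=(500, 1000))   # synthetic stimulus x neuron responses
print(strongly_responsive_fraction(responses, threshold=5.0))
```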
The development of efficient coding for an electronic mail system
NASA Technical Reports Server (NTRS)
Rice, R. F.
1983-01-01
Techniques for efficiently representing scanned electronic documents were investigated. Major results include the definition and preliminary performance results of a Universal System for Efficient Electronic Mail (USEEM), offering a potential order of magnitude improvement over standard facsimile techniques for representing textual material.
Early evolution of efficient enzymes and genome organization
2012-01-01
Background Cellular life with complex metabolism probably evolved during the reign of RNA, when it served as both information carrier and enzyme. Jensen proposed that enzymes of primordial cells possessed broad specificities: they were generalist. When and under what conditions could primordial metabolism run by generalist enzymes evolve to contemporary-type metabolism run by specific enzymes? Results Here we show by numerical simulation of an enzyme-catalyzed reaction chain that specialist enzymes spread after the invention of the chromosome because protocells harbouring unlinked genes maintain largely non-specific enzymes to reduce their assortment load. When genes are linked on chromosomes, high enzyme specificity evolves because it increases biomass production, also by reducing taxation by side reactions. Conclusion The constitution of the genetic system has a profound influence on the limits of metabolic efficiency. The major evolutionary transition to chromosomes is thus proven to be a prerequisite for a complex metabolism. Furthermore, the appearance of specific enzymes opens the door for the evolution of their regulation. Reviewers This article was reviewed by Sándor Pongor, Gáspár Jékely, and Rob Knight. PMID:23114029
Shifts in growth strategies reflect tradeoffs in cellular economics
Molenaar, Douwe; van Berlo, Rogier; de Ridder, Dick; Teusink, Bas
2009-01-01
The growth rate-dependent regulation of cell size, ribosomal content, and metabolic efficiency follows a common pattern in unicellular organisms: with increasing growth rates, cell size and ribosomal content increase and a shift to energetically inefficient metabolism takes place. The latter two phenomena are also observed in fast growing tumour cells and cell lines. These patterns suggest a fundamental principle of design. In biology such designs can often be understood as the result of the optimization of fitness. Here we show that in basic models of self-replicating systems these patterns are the consequence of maximizing the growth rate. Whereas most models of cellular growth consider a part of physiology, for instance only metabolism, the approach presented here integrates several subsystems to a complete self-replicating system. Such models can yield fundamentally different optimal strategies. In particular, it is shown how the shift in metabolic efficiency originates from a tradeoff between investments in enzyme synthesis and metabolic yields for alternative catabolic pathways. The models elucidate how the optimization of growth by natural selection shapes growth strategies. PMID:19888218
Snf1-related kinase improves cardiac mitochondrial efficiency and decreases mitochondrial uncoupling
Rines, Amy K.; Chang, Hsiang-Chun; Wu, Rongxue; Sato, Tatsuya; Khechaduri, Arineh; Kouzu, Hidemichi; Shapiro, Jason; Shang, Meng; Burke, Michael A.; Abdelwahid, Eltyeb; Jiang, Xinghang; Chen, Chunlei; Rawlings, Tenley A.; Lopaschuk, Gary D.; Schumacker, Paul T.; Abel, E. Dale; Ardehali, Hossein
2017-01-01
Ischaemic heart disease limits oxygen and metabolic substrate availability to the heart, resulting in tissue death. Here, we demonstrate that the AMP-activated protein kinase (AMPK)-related protein Snf1-related kinase (SNRK) decreases cardiac metabolic substrate usage and mitochondrial uncoupling, and protects against ischaemia/reperfusion. Hearts from transgenic mice overexpressing SNRK have decreased glucose and palmitate metabolism and oxygen consumption, but maintained power and function. They also exhibit decreased uncoupling protein 3 (UCP3) and mitochondrial uncoupling. Conversely, Snrk knockout mouse hearts have increased glucose and palmitate oxidation and UCP3. SNRK knockdown in cardiac cells decreases mitochondrial efficiency, which is abolished with UCP3 knockdown. We show that Tribbles homologue 3 (Trib3) binds to SNRK, and downregulates UCP3 through PPARα. Finally, SNRK is increased in cardiomyopathy patients, and SNRK reduces infarct size after ischaemia/reperfusion. SNRK also decreases cardiac cell death in a UCP3-dependent manner. Our results suggest that SNRK improves cardiac mitochondrial efficiency and ischaemic protection. PMID:28117339
NASA Technical Reports Server (NTRS)
Van Dalsem, W. R.; Steger, J. L.
1983-01-01
A new, fast, direct-inverse, finite-difference boundary-layer code has been developed and coupled with a full-potential transonic airfoil analysis code via new inviscid-viscous interaction algorithms. The resulting code has been used to calculate transonic separated flows. The results are in good agreement with Navier-Stokes calculations and experimental data. Solutions are obtained in considerably less computer time than Navier-Stokes solutions of equal resolution. Because efficient inviscid and viscous algorithms are used, it is expected this code will also compare favorably with other codes of its type as they become available.
Simultaneous chromatic dispersion and PMD compensation by using coded-OFDM and girth-10 LDPC codes.
Djordjevic, Ivan B; Xu, Lei; Wang, Ting
2008-07-07
Low-density parity-check (LDPC)-coded orthogonal frequency division multiplexing (OFDM) is studied as an efficient coded modulation scheme suitable for simultaneous chromatic dispersion and polarization mode dispersion (PMD) compensation. We show that, for an aggregate rate of 10 Gb/s, accumulated dispersion over 6500 km of SMF and a differential group delay of 100 ps can be simultaneously compensated with a penalty within 1.5 dB (with respect to the back-to-back configuration) when training sequence based channel estimation and girth-10 LDPC codes of rate 0.8 are employed.
Overview of the H.264/AVC video coding standard
NASA Astrophysics Data System (ADS)
Luthra, Ajay; Topiwala, Pankaj N.
2003-11-01
H.264/MPEG-4 AVC is the latest coding standard jointly developed by the Video Coding Experts Group (VCEG) of ITU-T and Moving Picture Experts Group (MPEG) of ISO/IEC. It uses state of the art coding tools and provides enhanced coding efficiency for a wide range of applications including video telephony, video conferencing, TV, storage (DVD and/or hard disk based), streaming video, digital video creation, digital cinema and others. In this paper an overview of this standard is provided. Some comparisons with the existing standards, MPEG-2 and MPEG-4 Part 2, are also provided.
Finite element analysis of inviscid subsonic boattail flow
NASA Technical Reports Server (NTRS)
Chima, R. V.; Gerhart, P. M.
1981-01-01
A finite element code for analysis of inviscid subsonic flows over arbitrary nonlifting planar or axisymmetric bodies is described. The code solves a novel primitive variable formulation of the coupled irrotationality and compressible continuity equations. Results for flow over a cylinder, a sphere, and a NACA 0012 airfoil verify the code. Computed subcritical flows over an axisymmetric boattailed afterbody compare well with finite difference results and experimental data. Iterative coupling with an integral turbulent boundary layer code shows strong viscous effects on the inviscid flow. Improvements in code efficiency and extensions to transonic flows are discussed.
Compliance Verification Paths for Residential and Commercial Energy Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conover, David R.; Makela, Eric J.; Fannin, Jerica D.
2011-10-10
This report looks at different ways to verify energy code compliance and to ensure that the energy efficiency goals of an adopted document are achieved. Conformity assessment is the body of work that ensures compliance, including activities that can ensure residential and commercial buildings satisfy energy codes and standards. This report identifies and discusses conformity-assessment activities and provides guidance for conducting assessments.
Considerations of MCNP Monte Carlo code to be used as a radiotherapy treatment planning tool.
Juste, B; Miro, R; Gallardo, S; Verdu, G; Santos, A
2005-01-01
The present work has simulated the photon and electron transport in a Theratron 780® (MDS Nordion) 60Co radiotherapy unit, using the Monte Carlo transport code MCNP (Monte Carlo N-Particle). This project mainly explains the different methodologies carried out to speed up calculations in order to apply this code efficiently in radiotherapy treatment planning.
Efficient computation of kinship and identity coefficients on large pedigrees.
Cheng, En; Elliott, Brendan; Ozsoyoglu, Z Meral
2009-06-01
With the rapidly expanding field of medical genetics and genetic counseling, genealogy information is becoming increasingly abundant. An important computation on pedigree data is the calculation of identity coefficients, which provide a complete description of the degree of relatedness of a pair of individuals. The areas of application of identity coefficients are numerous and diverse, from genetic counseling to disease tracking, and thus, the computation of identity coefficients merits special attention. However, the computation of identity coefficients is not done directly, but rather as the final step after computing a set of generalized kinship coefficients. In this paper, we first propose a novel Path-Counting Formula for calculating generalized kinship coefficients, which is motivated by Wright's path-counting method for computing the inbreeding coefficient. We then present an efficient and scalable scheme for calculating generalized kinship coefficients on large pedigrees using NodeCodes, a special encoding scheme for expediting the evaluation of queries on pedigree graph structures. Furthermore, we propose an improved scheme using Family NodeCodes for the computation of generalized kinship coefficients, which is motivated by the significant improvement of using Family NodeCodes for the inbreeding coefficient over the use of NodeCodes. We also perform experiments for evaluating the efficiency of our method, and compare it with the performance of the traditional recursive algorithm for three individuals. Experimental results demonstrate that the resulting scheme is more scalable and efficient than the traditional recursive methods for computing generalized kinship coefficients.
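The Path-Counting Formula above builds on Wright's classical path-counting rule for the inbreeding coefficient. As a point of reference only (the NodeCodes and Family NodeCodes schemes themselves are not reproduced here), a minimal Python sketch of Wright's rule is shown below; the example pedigree is hypothetical.

```python
# A minimal sketch of Wright's path-counting formula for the inbreeding
# coefficient: F_X = sum over common ancestors A and non-overlapping loops of
# (1/2)**(n1 + n2 + 1) * (1 + F_A), where n1 and n2 count the generations from
# the sire and dam up to the common ancestor A.

def inbreeding(paths):
    """paths: list of (n1, n2, F_A) tuples, one per loop through a common ancestor."""
    return sum(0.5 ** (n1 + n2 + 1) * (1.0 + f_a) for n1, n2, f_a in paths)

# Offspring of a half-sib mating: one common ancestor (the shared parent),
# one generation on each side, ancestor itself non-inbred.
print(inbreeding([(1, 1, 0.0)]))  # 0.125, the textbook value
```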
Zheng, Y.
2013-01-01
Temporal sound cues are essential for sound recognition, pitch, rhythm, and timbre perception, yet how auditory neurons encode such cues is the subject of ongoing debate. Rate coding theories propose that temporal sound features are represented by rate-tuned modulation filters. However, overwhelming evidence also suggests that precise spike timing is an essential attribute of the neural code. Here we demonstrate that single neurons in the auditory midbrain employ a proportional code in which spike-timing precision and firing reliability covary with the sound envelope cues to provide an efficient representation of the stimulus. Spike-timing precision varied systematically with the timescale and shape of the sound envelope and yet was largely independent of the sound modulation frequency, a prominent cue for pitch. In contrast, spike-count reliability was strongly affected by the modulation frequency. Spike-timing precision extends from sub-millisecond values for brief transient sounds up to tens of milliseconds for sounds with slowly varying envelopes. Information theoretic analysis further confirms that spike-timing precision depends strongly on the sound envelope shape, while firing reliability was strongly affected by the sound modulation frequency. Both the information efficiency and total information were limited by the firing reliability and spike-timing precision in a manner that reflected the sound structure. This result supports a temporal coding strategy in the auditory midbrain where proportional changes in spike-timing precision and firing reliability can efficiently signal shape and periodicity temporal cues. PMID:23636724
Volumetric Medical Image Coding: An Object-based, Lossy-to-lossless and Fully Scalable Approach
Danyali, Habibiollah; Mertins, Alfred
2011-01-01
In this article, an object-based, highly scalable, lossy-to-lossless 3D wavelet coding approach for volumetric medical image data (e.g., magnetic resonance (MR) and computed tomography (CT)) is proposed. The new method, called 3DOBHS-SPIHT, is based on the well-known set partitioning in hierarchical trees (SPIHT) algorithm and supports both quality and resolution scalability. The 3D input data is grouped into groups of slices (GOS) and each GOS is encoded and decoded as a separate unit. The symmetric tree definition of the original 3DSPIHT is improved by introducing a new asymmetric tree structure. While preserving the compression efficiency, the new tree structure allows for a small size of each GOS, which not only reduces memory consumption during the encoding and decoding processes, but also facilitates more efficient random access to certain segments of slices. To achieve more compression efficiency, the algorithm only encodes the main object of interest in each 3D data set, which can have any arbitrary shape, and ignores the unnecessary background. The experimental results on some MR data sets show the good performance of the 3DOBHS-SPIHT algorithm for multi-resolution lossy-to-lossless coding. The compression efficiency, full scalability, and object-based features of the proposed approach, besides its lossy-to-lossless coding support, make it a very attractive candidate for volumetric medical image information archiving and transmission applications. PMID:22606653
A Wideband Satcom Based Avionics Network with CDMA Uplink and TDM Downlink
NASA Technical Reports Server (NTRS)
Agrawal, D.; Johnson, B. S.; Madhow, U.; Ramchandran, K.; Chun, K. S.
2000-01-01
The purpose of this paper is to describe some key technical ideas behind our vision of a future satcom based digital communication network for avionics applications. The key features of our design are as follows: (a) Packetized transmission to permit efficient use of system resources for multimedia traffic; (b) A time division multiplexed (TDM) satellite downlink whose physical layer is designed to operate the satellite link at maximum power efficiency. We show how powerful turbo codes (invented originally for linear modulation) can be used with nonlinear constant envelope modulation, thus permitting the satellite amplifier to operate in a power efficient nonlinear regime; (c) A code division multiple access (CDMA) satellite uplink, which permits efficient access to the satellite from multiple asynchronous users. Closed loop power control is difficult for bursty packetized traffic, especially given the large round trip delay to the satellite. We show how adaptive interference suppression techniques can be used to deal with the ensuing near-far problem; (d) Joint source-channel coding techniques are required both at the physical and the data transport layer to optimize the end-to-end performance. We describe a novel approach to multiple description image encoding at the data transport layer in this paper.
Voltage-dependent K+ channels improve the energy efficiency of signalling in blowfly photoreceptors
2017-01-01
Voltage-dependent conductances in many spiking neurons are tuned to reduce action potential energy consumption, so improving the energy efficiency of spike coding. However, the contribution of voltage-dependent conductances to the energy efficiency of analogue coding, by graded potentials in dendrites and non-spiking neurons, remains unclear. We investigate the contribution of voltage-dependent conductances to the energy efficiency of analogue coding by modelling blowfly R1-6 photoreceptor membrane. Two voltage-dependent delayed rectifier K+ conductances (DRs) shape the membrane's voltage response and contribute to light adaptation. They make two types of energy saving. By reducing membrane resistance upon depolarization they convert the cheap, low bandwidth membrane needed in dim light to the expensive high bandwidth membrane needed in bright light. This investment of energy in bandwidth according to functional requirements can halve daily energy consumption. Second, DRs produce negative feedback that reduces membrane impedance and increases bandwidth. This negative feedback allows an active membrane with DRs to consume at least 30% less energy than a passive membrane with the same capacitance and bandwidth. Voltage-dependent conductances in other non-spiking neurons, and in dendrites, might be organized to make similar savings. PMID:28381642
Voltage-dependent K+ channels improve the energy efficiency of signalling in blowfly photoreceptors.
Heras, Francisco J H; Anderson, John; Laughlin, Simon B; Niven, Jeremy E
2017-04-01
Voltage-dependent conductances in many spiking neurons are tuned to reduce action potential energy consumption, so improving the energy efficiency of spike coding. However, the contribution of voltage-dependent conductances to the energy efficiency of analogue coding, by graded potentials in dendrites and non-spiking neurons, remains unclear. We investigate the contribution of voltage-dependent conductances to the energy efficiency of analogue coding by modelling blowfly R1-6 photoreceptor membrane. Two voltage-dependent delayed rectifier K+ conductances (DRs) shape the membrane's voltage response and contribute to light adaptation. They make two types of energy saving. By reducing membrane resistance upon depolarization they convert the cheap, low bandwidth membrane needed in dim light to the expensive high bandwidth membrane needed in bright light. This investment of energy in bandwidth according to functional requirements can halve daily energy consumption. Second, DRs produce negative feedback that reduces membrane impedance and increases bandwidth. This negative feedback allows an active membrane with DRs to consume at least 30% less energy than a passive membrane with the same capacitance and bandwidth. Voltage-dependent conductances in other non-spiking neurons, and in dendrites, might be organized to make similar savings. © 2017 The Author(s).
Aerobic and Strength Training in Concomitant Metabolic Syndrome and Type 2 Diabetes
Earnest, Conrad P.; Johannsen, Neil M.; Swift, Damon L.; Gillison, Fiona B.; Mikus, Catherine R.; Lucia, Alejandro; Kramer, Kimberly; Lavie, Carl J.; Church, Timothy S.
2014-01-01
Purpose Concomitant type 2 diabetes (T2D) and metabolic syndrome exacerbates mortality risk; yet, few studies have examined the effect of combining (AER+RES) aerobic (AER) and resistance (RES) training for individuals with T2D and metabolic syndrome. Methods We examined AER, RES, and AER+RES training (9-months) commensurate with physical activity guidelines in individuals with T2D (N=262, 63% female, 44% black). Primary outcomes were change in, and prevalence of, metabolic syndrome score at follow-up (mean, 95%CI). Secondary outcomes included maximal cardiorespiratory fitness (VO2peak) and estimated METs from time-to-exhaustion (TTE), and exercise efficiency calculated as the slope of the line between ventilatory threshold, respiratory compensation, and maximal fitness. General linear models and bootstrapped Spearman correlations were used to examine changes in metabolic syndrome associated with training for primary and secondary outcome variables. Results We observed a significant decrease in metabolic syndrome scores (P-for-trend, 0.003) for AER (−0.59, 95%CI, −1.00, −0.21) and AER+RES (−0.79, 95%CI, −1.40, −0.35), both being significant (P < 0.02) vs. Control (0.26, 95%CI, −0.58, 0.40) and RES (−0.13, 95%CI, −1.00, 0.24). This led to a reduction in metabolic syndrome prevalence for the AER (56% vs. 43%) and AER+RES (55% vs. 46%) groups between baseline and follow-up. The observed decrease in metabolic syndrome was mediated by significant improvements in exercise efficiency for the AER and AER+RES training groups (P<0.05), which was more strongly related to TTE (25–30%; r= −0.38; 95% CI: −0.55, −0.19) than VO2peak (5–6%; r= −0.24; 95% CI: −0.45, −0.01). Conclusion Aerobic and AER+RES training significantly improves metabolic syndrome scores and prevalence in patients with T2D. These improvements appear to be associated with improved exercise efficiency and are more strongly related to improved TTE versus VO2peak. PMID:24389523
de Porcellinis, Alice J; Klähn, Stephan; Rosgaard, Lisa; Kirsch, Rebekka; Gutekunst, Kirstin; Georg, Jens; Hess, Wolfgang R; Sakuragi, Yumiko
2016-10-01
Carbohydrate metabolism is a tightly regulated process in photosynthetic organisms. In the cyanobacterium Synechocystis sp. PCC 6803, the photomixotrophic growth protein A (PmgA) is involved in the regulation of glucose and storage carbohydrate (i.e. glycogen) metabolism, while its biochemical activity and possible factors acting downstream of PmgA are unknown. Here, a genome-wide microarray analysis of a ΔpmgA strain identified the expression of 36 protein-coding genes and 42 non-coding transcripts as significantly altered. From these, the non-coding RNA Ncr0700 was identified as the transcript most strongly reduced in abundance. Ncr0700 is widely conserved among cyanobacteria. In Synechocystis its expression is inversely correlated with light intensity. Similarly to a ΔpmgA mutant, a Δncr0700 deletion strain showed an approximately 2-fold increase in glycogen content under photoautotrophic conditions and wild-type-like growth. Moreover, its growth was arrested by 38 h after a shift to photomixotrophic conditions. Ectopic expression of Ncr0700 in Δncr0700 and ΔpmgA restored the glycogen content and photomixotrophic growth to wild-type levels. These results indicate that Ncr0700 is required for photomixotrophic growth and the regulation of glycogen accumulation, and acts downstream of PmgA. Hence Ncr0700 is renamed here as PmgR1 for photomixotrophic growth RNA 1. © The Author 2016. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Katsuragi, Shinji; Parer, Julian T; Noda, Shunichi; Onishi, Junji; Kikuchi, Hitomi; Ikeda, Tomoaki
2015-09-01
Objective: We have reported a 7-fold reduction in newborn umbilical arterial (UA) metabolic acidemia after adoption of a rule-based 5-category color-coded fetal heart rate (FHR) management framework. We sought evidence for the relationship being causal by detailed analysis of FHR characteristics and acid-base status before and after training. UA pH and base excess (BE) values were determined over a 5-year period in a single Japanese hospital, serving mainly low-risk patients, with 3907 deliveries. We compared results in the 2 years before and after a 6-month training period in the FHR management system. We used a previously published classification schema, which was linked to management guidelines. After the training period, there was an increase in the percentage of normal patterns (23%), and a decrease in variable decelerations (14%), late decelerations (8%) and prolonged decelerations (12%) in the last 60 min of labor compared to the pre-training period. There was also a significant reduction in mean UA pH and BE in the groups with decelerations after introduction of the FHR management framework. The adoption of this FHR management system was associated with a reduction of decelerations and metabolic acidemia, without a change in cesarean or vacuum delivery rates. These results suggest that the obstetrical providers were able to better select for intervention those patients destined to develop more severe acidemia, demonstrating a possible causal relationship between the management system and reduced decelerations and metabolic acidemia.
Designing an efficient LT-code with unequal error protection for image transmission
NASA Astrophysics Data System (ADS)
S. Marques, F.; Schwartz, C.; Pinho, M. S.; Finamore, W. A.
2015-10-01
The use of images from earth observation satellites is spread over different applications, such as car navigation systems and disaster monitoring. In general, those images are captured by on board imaging devices and must be transmitted to the Earth using a communication system. Even though a high resolution image can produce a better Quality of Service, it leads to transmitters with high bit rates, which require a large bandwidth and expend a large amount of energy. Therefore, it is very important to design efficient communication systems. From communication theory, it is well known that a source encoder is crucial in an efficient system. In a remote sensing satellite image transmission, this efficiency is achieved by using an image compressor, to reduce the amount of data which must be transmitted. The Consultative Committee for Space Data Systems (CCSDS), a multinational forum for the development of communications and data system standards for space flight, establishes a recommended standard for a data compression algorithm for images from space systems. Unfortunately, in the satellite communication channel, the transmitted signal is corrupted by the presence of noise, interference signals, etc. Therefore, the receiver of a digital communication system may fail to recover the transmitted bit. Actually, a channel code can be used to reduce the effect of this failure. In 2002, the Luby Transform code (LT-code) was introduced and it was shown that it was very efficient when the binary erasure channel model was used. Since the effect of the bit recovery failure depends on the position of the bit in the compressed image stream, in the last decade many efforts have been made to develop LT-codes with unequal error protection. In 2012, Arslan et al. showed improvements when LT-codes with unequal error protection were used in images compressed by the SPIHT algorithm. The techniques presented by Arslan et al. can be adapted to work with the algorithm for image compression recommended by CCSDS. In fact, to design an LT-code with unequal error protection, the bit stream produced by the algorithm recommended by CCSDS must be partitioned in M disjoint sets of bits. Using the weighted approach, the LT-code produces M different failure probabilities for each set of bits, p1, ..., pM, leading to a total probability of failure, p, which is an average of p1, ..., pM. In general, the parameters of the LT-code with unequal error protection are chosen using a heuristic procedure. In this work, we analyze the problem of choosing the LT-code parameters to optimize two figures of merit: (a) the probability of achieving a minimum acceptable PSNR, and (b) the mean PSNR, given that the minimum acceptable PSNR has been achieved. Given the rate-distortion curve achieved by the CCSDS recommended algorithm, this work establishes a closed-form expression for the mean PSNR (given that the minimum acceptable PSNR has been achieved) as a function of p1, ..., pM. The main contribution of this work is the study of a criterion to select the parameters p1, ..., pM to optimize the performance of image transmission.
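To make the bookkeeping concrete, the sketch below (with hypothetical set sizes and failure probabilities, not values from the paper) shows the size-weighted averaging of p1, ..., pM into the overall failure probability p described above.

```python
# A minimal sketch of unequal-error-protection bookkeeping: the compressed
# stream is split into M sets of sizes n_1..n_M with per-set decoding-failure
# probabilities p_1..p_M; the overall failure probability is their
# size-weighted average. Numbers below are hypothetical.

def average_failure(sizes, probs):
    total = sum(sizes)
    return sum(n * p for n, p in zip(sizes, probs)) / total

# Three protection classes: headers/most-significant bits get the strongest code.
sizes = [1000, 4000, 15000]           # bits per class (hypothetical)
probs = [1e-5, 1e-3, 1e-2]            # failure probability per class (hypothetical)
print(average_failure(sizes, probs))  # overall p, roughly 0.0077 here
```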
Yu, Lianchun; Shen, Zhou; Wang, Chen; Yu, Yuguo
2018-01-01
Selective pressure may drive neural systems to process as much information as possible with the lowest energy cost. Recent experimental evidence revealed that the ratio between synaptic excitation and inhibition (E/I) in local cortex is generally maintained at a certain value which may influence the efficiency of energy consumption and information transmission of neural networks. To understand this issue deeply, we constructed a typical recurrent Hodgkin-Huxley network model and studied the general principles that govern the relationship among the E/I synaptic current ratio, the energy cost and the total amount of information transmission. We observed in such a network that there exists an optimal E/I synaptic current ratio in the network by which the information transmission achieves the maximum with relatively low energy cost. The coding energy efficiency, which is defined as the mutual information divided by the energy cost, achieved its maximum with the balanced synaptic current. Although background noise degrades information transmission and imposes an additional energy cost, we find an optimal noise intensity that yields the largest information transmission and energy efficiency at this optimal E/I synaptic transmission ratio. The maximization of energy efficiency also requires a certain part of the energy cost associated with spontaneous spiking and synaptic activities. We further proved this finding with an analytical solution based on the response function of bistable neurons, and demonstrated that optimal net synaptic currents are capable of maximizing both the mutual information and energy efficiency. These results revealed that the development of E/I synaptic current balance could lead a cortical network to operate at a highly efficient information transmission rate at a relatively low energy cost. The generality of neuronal models and the recurrent network configuration used here suggest that the existence of an optimal E/I cell ratio for highly efficient energy costs and information maximization is a potential principle for cortical circuit networks. Summary: We conducted numerical simulations and mathematical analysis to examine the energy efficiency of neural information transmission in a recurrent network as a function of the ratio of excitatory and inhibitory synaptic connections. We obtained a general solution showing that there exists an optimal E/I synaptic ratio in a recurrent network at which the information transmission as well as the energy efficiency of this network achieves a global maximum. These results reflect general mechanisms for sensory coding processes, which may give insight into the energy efficiency of neural communication and coding. PMID:29773979
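The coding energy efficiency used above is simply mutual information divided by energy cost. A toy Python sketch of that quantity (with a made-up stimulus/response joint distribution and an arbitrary energy cost, not the Hodgkin-Huxley network of the study) is:

```python
import numpy as np

# A minimal sketch: energy efficiency defined as mutual information between
# stimulus and response divided by the metabolic/energy cost of the response.
# The joint distribution and energy cost below are hypothetical toy values.

def mutual_information(joint):
    """joint: 2-D array of joint probabilities P(stimulus, response)."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

joint = np.array([[0.30, 0.05],       # hypothetical P(stimulus, spike-count bin)
                  [0.05, 0.60]])
energy_cost = 2.5                     # arbitrary energy units per trial
print(mutual_information(joint) / energy_cost)   # bits per energy unit
```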
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lian, Jiazhang; Mishra, Shekhar; Zhao, Huimin
Metabolic engineering aims to develop efficient cell factories by rewiring cellular metabolism. As one of the most commonly used cell factories, Saccharomyces cerevisiae has been extensively engineered to produce a wide variety of products at high levels from various feedstocks. In this paper, we summarize the recent development of metabolic engineering approaches to modulate yeast metabolism with representative examples. Particularly, we highlight new tools for biosynthetic pathway optimization (i.e. combinatorial transcriptional engineering and dynamic metabolic flux control) and genome engineering (i.e. clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR associated (Cas) system based genome engineering and RNA interference assisted genome evolution) to advance metabolic engineering in yeast. Lastly, we also discuss the challenges and perspectives for high throughput metabolic engineering.
Lian, Jiazhang; Mishra, Shekhar; Zhao, Huimin
2018-04-25
Metabolic engineering aims to develop efficient cell factories by rewiring cellular metabolism. As one of the most commonly used cell factories, Saccharomyces cerevisiae has been extensively engineered to produce a wide variety of products at high levels from various feedstocks. In this paper, we summarize the recent development of metabolic engineering approaches to modulate yeast metabolism with representative examples. Particularly, we highlight new tools for biosynthetic pathway optimization (i.e. combinatorial transcriptional engineering and dynamic metabolic flux control) and genome engineering (i.e. clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR associated (Cas) system based genome engineering and RNA interference assisted genome evolution) to advance metabolic engineering in yeast. Lastly, we also discuss the challenges and perspectives for high throughput metabolic engineering.
Robertson, Benjamin D; Vadakkeveedu, Siddarth; Sawicki, Gregory S
2017-05-24
We present a novel biorobotic framework comprising a biological muscle-tendon unit (MTU) mechanically coupled to a feedback controlled robotic environment simulation that mimics in vivo inertial/gravitational loading and mechanical assistance from a parallel elastic exoskeleton. Using this system, we applied select combinations of biological muscle activation (modulated with rate-coded direct neural stimulation) and parallel elastic assistance (applied via closed-loop mechanical environment simulation) hypothesized to mimic human behavior based on previously published modeling studies. These conditions resulted in constant system-level force-length dynamics (i.e., stiffness), reduced biological loads, increased muscle excursion, and constant muscle average positive power output, all consistent with laboratory experiments on intact humans during exoskeleton assisted hopping. Mechanical assistance led to reduced estimated metabolic cost and MTU apparent efficiency, but increased apparent efficiency for the MTU+Exo system as a whole. Findings from this study suggest that the increased natural resonant frequency of the artificially stiffened MTU+Exo system, along with invariant movement frequencies, may underlie observed limits on the benefits of exoskeleton assistance. Our novel approach demonstrates that it is possible to capture the salient features of human locomotion with exoskeleton assistance in an isolated muscle-tendon preparation, and introduces a powerful new tool for detailed, direct examination of how assistive devices affect muscle-level neuromechanics and energetics. Copyright © 2017 Elsevier Ltd. All rights reserved.
Implementation of Finite Rate Chemistry Capability in OVERFLOW
NASA Technical Reports Server (NTRS)
Olsen, M. E.; Venkateswaran, S.; Prabhu, D. K.
2004-01-01
An implementation of both finite rate and equilibrium chemistry has been completed for the OVERFLOW code, a chimera-capable, complex-geometry flow code widely used to predict transonic flow fields. The implementation builds on the computational efficiency and geometric generality of the solver.
Multimodal Discriminative Binary Embedding for Large-Scale Cross-Modal Retrieval.
Wang, Di; Gao, Xinbo; Wang, Xiumei; He, Lihuo; Yuan, Bo
2016-10-01
Multimodal hashing, which conducts effective and efficient nearest neighbor search across heterogeneous data on large-scale multimedia databases, has been attracting increasing interest, given the explosive growth of multimedia content on the Internet. Recent multimodal hashing research mainly aims at learning compact binary codes that preserve semantic information given by labels. The overwhelming majority of these methods are similarity preserving approaches which approximate the pairwise similarity matrix with Hamming distances between the to-be-learnt binary hash codes. However, these methods ignore the discriminative property in the hash learning process, which leaves hash codes from different classes indistinguishable and therefore reduces the accuracy and robustness of the nearest neighbor search. To this end, we present a novel multimodal hashing method, named multimodal discriminative binary embedding (MDBE), which focuses on learning discriminative hash codes. First, the proposed method formulates the hash function learning in terms of classification, where the binary codes generated by the learned hash functions are expected to be discriminative. Then, it exploits the label information to discover the shared structures inside heterogeneous data. Finally, the learned structures are preserved for hash codes to produce similar binary codes in the same class. Hence, the proposed MDBE can preserve both discriminability and similarity for hash codes, and will enhance retrieval accuracy. Thorough experiments on benchmark data sets demonstrate that the proposed method achieves excellent accuracy and competitive computational efficiency compared with the state-of-the-art methods for large-scale cross-modal retrieval tasks.
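As context for the retrieval step (not the MDBE learning algorithm itself), here is a minimal Python sketch of Hamming-distance ranking over binary codes; the code length and database contents are hypothetical.

```python
import numpy as np

# A minimal sketch: once binary hash codes exist, cross-modal retrieval reduces
# to ranking database codes by Hamming distance to the query code.

def hamming_rank(query_code, db_codes):
    """query_code: (n_bits,) 0/1 array; db_codes: (n_items, n_bits) 0/1 array."""
    dists = np.count_nonzero(db_codes != query_code, axis=1)
    return np.argsort(dists), dists

rng = np.random.default_rng(0)
db = rng.integers(0, 2, size=(1000, 64))   # hypothetical 64-bit codes
order, dists = hamming_rank(db[17], db)
print(order[0], dists[order[0]])           # item 17 is retrieved at distance 0
```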
NASA Astrophysics Data System (ADS)
Tolba, Khaled Ibrahim; Morgenthal, Guido
2018-01-01
This paper presents an analysis of the scalability and efficiency of a simulation framework based on the vortex particle method. The code is applied for the numerical aerodynamic analysis of line-like structures. The numerical code runs on multicore CPU and GPU architectures using the OpenCL framework. The focus of this paper is the analysis of the parallel efficiency and scalability of the method being applied to an engineering test case, specifically the aeroelastic response of a long-span bridge girder at the construction stage. The target is to assess the optimal configuration and the required computer architecture, such that it becomes feasible to efficiently utilise the method within the computational resources available for a regular engineering office. The simulations and the scalability analysis are performed on a regular gaming type computer.
Comparison of Space Shuttle Hot Gas Manifold analysis to air flow data
NASA Technical Reports Server (NTRS)
Mcconnaughey, P. K.
1988-01-01
This paper summarizes several recent analyses of the Space Shuttle Main Engine Hot Gas Manifold and compares predicted flow environments to air flow data. Codes used in these analyses include INS3D, PAGE, PHOENICS, and VAST. Both laminar (Re = 250, M = 0.30) and turbulent (Re = 1.9 million, M = 0.30) results are discussed, with the latter being compared to data for system losses, outer wall static pressures, and manifold exit Mach number profiles. Comparison of predicted results for the turbulent case to air flow data shows that the analysis using INS3D predicted system losses within 1 percent error, while the PHOENICS, PAGE, and VAST codes erred by 31, 35, and 47 percent, respectively. The INS3D, PHOENICS, and PAGE codes did a reasonable job of predicting outer wall static pressure, while the PHOENICS code predicted exit Mach number profiles with acceptable accuracy. INS3D was approximately an order of magnitude more efficient than the other codes in terms of code speed and memory requirements. In general, it is seen that complex internal flows in manifold-like geometries can be predicted with a limited degree of confidence, and further development is necessary to improve both efficiency and accuracy of codes if they are to be used as design tools for complex three-dimensional geometries.
Video coding for 3D-HEVC based on saliency information
NASA Astrophysics Data System (ADS)
Yu, Fang; An, Ping; Yang, Chao; You, Zhixiang; Shen, Liquan
2016-11-01
As an extension of High Efficiency Video Coding (HEVC), 3D-HEVC has been widely researched under the impetus of the new generation coding standard in recent years. Compared with H.264/AVC, its compression efficiency is doubled while keeping the same video quality. However, its higher encoding complexity and longer encoding time are not negligible. To reduce the computational complexity and guarantee the subjective quality of virtual views, this paper presents a novel video coding method for 3D-HEVC based on the saliency information, which is an important part of the Human Visual System (HVS). First of all, the relationship between the current coding unit and its adjacent units is used to adjust the maximum depth of each largest coding unit (LCU) and determine the SKIP mode reasonably. Then, according to the saliency information of each frame image, the texture and its corresponding depth map will be divided into three regions, that is, salient area, middle area and non-salient area. Afterwards, different quantization parameters will be assigned to different regions to conduct low complexity coding. Finally, the compressed video will generate new view point videos through the renderer tool. As shown in our experiments, the proposed method saves more bit rate than other approaches and achieves up to a 38% encoding time reduction without subjective quality loss in compression or rendering.
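A minimal sketch of the region-dependent quantisation idea follows; the saliency thresholds and QP offsets are hypothetical placeholders, not the values used by the authors.

```python
# A minimal sketch: classify each largest coding unit (LCU) as salient /
# middle / non-salient from its mean saliency and give less salient regions a
# coarser quantisation parameter (QP). Thresholds and offsets are hypothetical.

def qp_for_lcu(mean_saliency, base_qp=32):
    if mean_saliency >= 0.6:      # salient region: keep base quality
        return base_qp
    elif mean_saliency >= 0.3:    # middle region: slightly coarser
        return base_qp + 3
    else:                         # non-salient region: coarsest
        return base_qp + 6

print([qp_for_lcu(s) for s in (0.8, 0.45, 0.1)])   # [32, 35, 38]
```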
Visual search asymmetries within color-coded and intensity-coded displays.
Yamani, Yusuke; McCarley, Jason S
2010-06-01
Color and intensity coding provide perceptual cues to segregate categories of objects within a visual display, allowing operators to search more efficiently for needed information. Even within a perceptually distinct subset of display elements, however, it may often be useful to prioritize items representing urgent or task-critical information. The design of symbology to produce search asymmetries (Treisman & Souther, 1985) offers a potential technique for doing this, but it is not obvious from existing models of search that an asymmetry observed in the absence of extraneous visual stimuli will persist within a complex color- or intensity-coded display. To address this issue, in the current study we measured the strength of a visual search asymmetry within displays containing color- or intensity-coded extraneous items. The asymmetry persisted strongly in the presence of extraneous items that were drawn in a different color (Experiment 1) or a lower contrast (Experiment 2) than the search-relevant items, with the targets favored by the search asymmetry producing highly efficient search. The asymmetry was attenuated but not eliminated when extraneous items were drawn in a higher contrast than search-relevant items (Experiment 3). Results imply that the coding of symbology to exploit visual search asymmetries can facilitate visual search for high-priority items even within color- or intensity-coded displays. PsycINFO Database Record (c) 2010 APA, all rights reserved.
Lin, Xue; Yu, Ai-Qun; Zhang, Cui-Ying; Pi, Li; Bai, Xiao-Wen; Xiao, Dong-Guang
2017-11-09
Tup1 is a general transcriptional repressor of diverse gene families coordinately controlled by glucose repression, mating type, and other mechanisms in Saccharomyces cerevisiae. Several functional domains of Tup1 have been identified, each of which has differing effects on transcriptional repression. In this study, we aim to investigate the role of Tup1 and its domains in maltose metabolism of industrial baker's yeast. To this end, a battery of in-frame truncations in the TUP1 gene coding region was constructed in industrial baker's yeasts with different genetic backgrounds, and the maltose metabolism, leavening ability, MAL gene expression levels, and growth characteristics were investigated. The results suggest that the TUP1 gene is essential to maltose metabolism in industrial baker's yeast. Importantly, different domains of Tup1 play different roles in glucose repression and maltose metabolism of industrial baker's yeast cells. The Ssn6 interaction, N-terminal repression and C-terminal repression domains might play roles in the regulation of MAL transcription by Tup1 for maltose metabolism of baker's yeast. The WD region lacking the first repeat could influence the regulation of maltose metabolism directly, rather than indirectly through glucose repression. These findings lay a foundation for the optimization of industrial baker's yeast strains for accelerated maltose metabolism and facilitate future research on glucose repression in the metabolism of other sugars.
[Waist-to-height ratio is an indicator of metabolic risk in children].
Valle-Leal, Jaime; Abundis-Castro, Leticia; Hernández-Escareño, Juan; Flores-Rubio, Salvador
2016-01-01
Abdominal fat, particularly visceral fat, is associated with a high risk of metabolic complications. The waist-to-height ratio (WHtR) is used to assess abdominal fat in individuals of all ages. The aim was to determine the ability of the waist-to-height ratio to detect metabolic risk in Mexican schoolchildren. A study was conducted on children between 6 and 12 years of age. Obesity was diagnosed as a body mass index (BMI) ≥ 85th percentile, and a WHtR ≥ 0.5 was considered abdominal obesity. Blood levels of glucose, cholesterol and triglycerides were measured. The sensitivity, specificity, positive and negative predictive values, area under the curve, positive likelihood ratio and negative likelihood ratio of the WHtR and BMI were calculated in order to identify metabolic alterations. WHtR and BMI were compared to determine which had the best diagnostic efficiency. Of the 223 children included in the study, 51 had hypertriglyceridaemia, 27 hypercholesterolaemia, and 9 hyperglycaemia. On comparing the diagnostic efficiency of WHtR with that of BMI, there was a sensitivity of 100% vs. 56% for hyperglycaemia, 93% vs. 70% for hypercholesterolaemia, and 76% vs. 59% for hypertriglyceridaemia. The specificity, negative predictive value, positive predictive value, positive likelihood ratio, negative likelihood ratio, and area under the curve were also higher for WHtR. The WHtR is a more efficient indicator than BMI for identifying metabolic risk in Mexican school-age children. Copyright © 2015 Sociedad Chilena de Pediatría. Published by Elsevier España, S.L.U. All rights reserved.
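For reference, the screening statistics compared above can all be computed from a 2x2 table; a minimal Python sketch with hypothetical counts (not the study's data) is:

```python
# A minimal sketch: diagnostic metrics of an indicator (e.g. WHtR >= 0.5)
# against a metabolic alteration, from a 2x2 table of true/false
# positives/negatives. Counts below are hypothetical.

def screening_metrics(tp, fp, fn, tn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    lr_pos = sens / (1 - spec)            # positive likelihood ratio
    lr_neg = (1 - sens) / spec            # negative likelihood ratio
    return dict(sensitivity=sens, specificity=spec, PPV=ppv, NPV=npv,
                LR_plus=lr_pos, LR_minus=lr_neg)

print(screening_metrics(tp=24, fp=40, fn=3, tn=156))
```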
Utility of QR codes in biological collections
Diazgranados, Mauricio; Funk, Vicki A.
2013-01-01
Abstract The popularity of QR codes for encoding information such as URIs has increased exponentially in step with the technological advances and availability of smartphones, digital tablets, and other electronic devices. We propose using QR codes on specimens in biological collections to facilitate linking vouchers’ electronic information with their associated collections. QR codes can efficiently provide such links for connecting collections, photographs, maps, ecosystem notes, citations, and even GenBank sequences. QR codes have numerous advantages over barcodes, including their small size, superior security mechanisms, increased complexity and quantity of information, and low implementation cost. The scope of this paper is to initiate an academic discussion about using QR codes on specimens in biological collections. PMID:24198709
Utility of QR codes in biological collections.
Diazgranados, Mauricio; Funk, Vicki A
2013-01-01
The popularity of QR codes for encoding information such as URIs has increased exponentially in step with the technological advances and availability of smartphones, digital tablets, and other electronic devices. We propose using QR codes on specimens in biological collections to facilitate linking vouchers' electronic information with their associated collections. QR codes can efficiently provide such links for connecting collections, photographs, maps, ecosystem notes, citations, and even GenBank sequences. QR codes have numerous advantages over barcodes, including their small size, superior security mechanisms, increased complexity and quantity of information, and low implementation cost. The scope of this paper is to initiate an academic discussion about using QR codes on specimens in biological collections.
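As an illustration of the proposal in the two records above, the following sketch generates a printable QR symbol encoding a voucher's URI. It assumes the third-party Python packages qrcode and Pillow are installed, and the URI is hypothetical.

```python
# A minimal sketch: encode a specimen voucher's URI in a QR code that can be
# printed on its label, linking the physical specimen to its electronic record.

import qrcode

voucher_uri = "http://collections.example.org/specimen/USNM-1234567"  # hypothetical
img = qrcode.make(voucher_uri)          # build the QR symbol
img.save("specimen_label_qr.png")       # print-ready image for the label
```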
Error-trellis Syndrome Decoding Techniques for Convolutional Codes
NASA Technical Reports Server (NTRS)
Reed, I. S.; Truong, T. K.
1984-01-01
An error-trellis syndrome decoding technique for convolutional codes is developed. This algorithm is then applied to the entire class of systematic convolutional codes and to the high-rate, Wyner-Ash convolutional codes. A special example of the one-error-correcting Wyner-Ash code, a rate 3/4 code, is treated. The error-trellis syndrome decoding method applied to this example shows in detail how much more efficient syndrome decoding is than Viterbi decoding if applied to the same problem. For standard Viterbi decoding, 64 states are required, whereas in the example only 7 states are needed. Also, within the 7 states required for decoding, many fewer transitions are needed between the states.
Error-trellis syndrome decoding techniques for convolutional codes
NASA Technical Reports Server (NTRS)
Reed, I. S.; Truong, T. K.
1985-01-01
An error-trellis syndrome decoding technique for convolutional codes is developed. This algorithm is then applied to the entire class of systematic convolutional codes and to the high-rate, Wyner-Ash convolutional codes. A special example of the one-error-correcting Wyner-Ash code, a rate 3/4 code, is treated. The error-trellis syndrome decoding method applied to this example shows in detail how much more efficient syndrome decoding is than Viterbi decoding if applied to the same problem. For standard Viterbi decoding, 64 states are required, whereas in the example only 7 states are needed. Also, within the 7 states required for decoding, many fewer transitions are needed between the states.
Verification testing of the compression performance of the HEVC screen content coding extensions
NASA Astrophysics Data System (ADS)
Sullivan, Gary J.; Baroncini, Vittorio A.; Yu, Haoping; Joshi, Rajan L.; Liu, Shan; Xiu, Xiaoyu; Xu, Jizheng
2017-09-01
This paper reports on verification testing of the coding performance of the screen content coding (SCC) extensions of the High Efficiency Video Coding (HEVC) standard (Rec. ITU-T H.265 | ISO/IEC 23008-2 MPEG-H Part 2). The coding performance of the HEVC screen content model (SCM) reference software is compared with that of the HEVC test model (HM) without the SCC extensions, as well as with the Advanced Video Coding (AVC) joint model (JM) reference software, for both lossy and mathematically lossless compression using All-Intra (AI), Random Access (RA), and Low-delay B (LB) encoding structures and using similar encoding techniques. Video test sequences in 1920×1080 RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:0 colour sampling formats with 8 bits per sample are tested in two categories: "text and graphics with motion" (TGM) and "mixed" content. For lossless coding, the encodings are evaluated in terms of relative bit-rate savings. For lossy compression, subjective testing was conducted at 4 quality levels for each coding case, and the test results are presented through mean opinion score (MOS) curves. The relative coding performance is also evaluated in terms of Bjøntegaard-delta (BD) bit-rate savings for equal PSNR quality. The perceptual tests and objective metric measurements show a very substantial benefit in coding efficiency for the SCC extensions, and provided consistent results with a high degree of confidence. For TGM video, the estimated bit-rate savings ranged from 60-90% relative to the JM and 40-80% relative to the HM, depending on the AI/RA/LB configuration category and colour sampling format.
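The Bjøntegaard-delta bit-rate metric mentioned above has a standard recipe: fit log-rate as a cubic in PSNR for each codec, integrate over the shared PSNR range, and convert the mean gap back to a percentage. A minimal Python sketch with hypothetical rate/PSNR points (not the paper's data) is:

```python
import numpy as np

# A minimal sketch of the Bjoentegaard-delta (BD) bit-rate computation.

def bd_rate(rates_ref, psnr_ref, rates_test, psnr_test):
    lr1, lr2 = np.log10(rates_ref), np.log10(rates_test)
    p1 = np.polyfit(psnr_ref, lr1, 3)            # cubic fit: log-rate vs PSNR
    p2 = np.polyfit(psnr_test, lr2, 3)
    lo = max(min(psnr_ref), min(psnr_test))      # overlapping PSNR interval
    hi = min(max(psnr_ref), max(psnr_test))
    int1 = np.polyval(np.polyint(p1), hi) - np.polyval(np.polyint(p1), lo)
    int2 = np.polyval(np.polyint(p2), hi) - np.polyval(np.polyint(p2), lo)
    avg_diff = (int2 - int1) / (hi - lo)
    return (10 ** avg_diff - 1) * 100.0          # negative = test codec saves bit rate

ref  = ([1000, 2000, 4000, 8000], [34.0, 36.5, 39.0, 41.5])   # kbps, dB (hypothetical)
test = ([600, 1200, 2400, 4800], [34.2, 36.8, 39.3, 41.8])
print(bd_rate(*ref, *test))
```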
Ionisation Feedback in Star and Cluster Formation Simulations
NASA Astrophysics Data System (ADS)
Ercolano, Barbara; Gritschneder, Matthias
2011-04-01
Feedback from photoionisation may dominate on parsec scales in massive star-forming regions. Such feedback may inhibit or enhance the star formation efficiency and sustain or even drive turbulence in the parent molecular cloud. Photoionisation feedback may also provide a mechanism for the rapid expulsion of gas from young clusters' potentials, often invoked as the main cause of 'infant mortality'. There is currently no agreement, however, with regard to the efficiency of this process and how environment may affect the direction (positive or negative) in which it proceeds. The study of the photoionisation process as part of hydrodynamical simulations is key to understanding these issues; however, due to the computational demand of the problem, crude approximations for the radiation transfer are often employed. We will briefly review some of the most commonly used approximations and discuss their major drawbacks. We will then present the results of detailed tests carried out using the detailed photoionisation code mocassin and the SPH+ionisation code iVINE, aimed at understanding the error introduced by the simplified photoionisation algorithms. This is particularly relevant as a number of new codes have recently been developed along those lines. We will finally propose a new approach that should allow the photoionisation problem to be treated efficiently and self-consistently for complex radiation and density fields.
Onboard Image Processing System for Hyperspectral Sensor
Hihara, Hiroki; Moritani, Kotaro; Inoue, Masao; Hoshi, Yoshihiro; Iwasaki, Akira; Takada, Jun; Inada, Hitomi; Suzuki, Makoto; Seki, Taeko; Ichikawa, Satoshi; Tanii, Jun
2015-01-01
Onboard image processing systems for a hyperspectral sensor have been developed in order to maximize image data transmission efficiency for large volume and high speed data downlink capacity. Since more than 100 channels are required for hyperspectral sensors on Earth observation satellites, fast and small-footprint lossless image compression capability is essential for reducing the size and weight of a sensor system. A fast lossless image compression algorithm has been developed, and is implemented in the onboard correction circuitry of sensitivity and linearity of Complementary Metal Oxide Semiconductor (CMOS) sensors in order to maximize the compression ratio. The employed image compression method is based on the Fast, Efficient, Lossless Image compression System (FELICS), which is a hierarchical predictive coding method with resolution scaling. To improve FELICS's performance of image decorrelation and entropy coding, we apply a two-dimensional interpolation prediction and adaptive Golomb-Rice coding. It supports progressive decompression using resolution scaling while still maintaining superior performance measured as speed and complexity. Coding efficiency and compression speed enlarge the effective capacity of signal transmission channels, which leads to reducing onboard hardware by multiplexing sensor signals into a reduced number of compression circuits. The circuitry is embedded into the data formatter of the sensor system without adding size, weight, power consumption, and fabrication cost. PMID:26404281
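Golomb-Rice coding, mentioned above as the entropy-coding stage, splits each non-negative residual into a unary quotient and a fixed-width remainder. A minimal Python sketch follows (the adaptive selection of the Rice parameter k used in the flight system is not shown; the example value is arbitrary).

```python
# A minimal sketch of Golomb-Rice coding: a non-negative residual n is encoded
# as a unary quotient (n >> k) followed by the k low-order bits of n.

def rice_encode(n, k):
    q, r = n >> k, n & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b") if k else "1" * q + "0"

def rice_decode(bits, k):
    q = bits.index("0")                          # unary part ends at first 0
    r = int(bits[q + 1:q + 1 + k], 2) if k else 0
    return (q << k) | r

print(rice_encode(9, 2))                  # '110' + '01' -> '11001'
print(rice_decode(rice_encode(9, 2), 2))  # 9
```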
The efficiency of geophysical adjoint codes generated by automatic differentiation tools
NASA Astrophysics Data System (ADS)
Vlasenko, A. V.; Köhl, A.; Stammer, D.
2016-02-01
The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, the efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which are new elements that either combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing handling even large geophysical data assimilation problems. However, they can only be applied to numerical models written in earlier generations of programming languages. Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the continuous use of AD tools for solving geophysical problems on modern computer architectures.
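As a reminder of what the operator-overloading family of AD tools does under the hood (the adjoint codes evaluated in the study are reverse-mode and are generated by far more elaborate machinery), a minimal forward-mode dual-number sketch in Python is:

```python
# A minimal sketch of operator-overloading automatic differentiation using
# forward-mode dual numbers; only + and * are overloaded, enough for the toy
# function below. This is an illustration, not any of the tools named above.

class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1      # model "code" to be differentiated

x = Dual(2.0, 1.0)                    # seed dx/dx = 1
print(f(x).val, f(x).dot)             # 17.0 and df/dx = 6*x + 2 = 14.0
```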
Shaw, Rahul; Kundu, Sudip
2015-01-01
More than 20% of the total caloric intake of the human population comes from rice. The expression of rice genes, and hence the concentration of enzymatic proteins, might vary due to several biotic and abiotic stresses. This, in turn, can influence the overall metabolism and survivability of the rice plant. Thus, understanding rice cellular metabolism, its plasticity and potential readjustments under different perturbations can help rice biotechnologists to design efficient rice cultivars. Here, using the flux balance analysis (FBA) method, with the help of an in-silico reaction deletion strategy, we study the metabolic plasticity of a genome-scale metabolic model of rice leaf. A set of 131 reactions, essential for the production of primary biomass precursors, is identified; deletion of any of them can inhibit the overall biomass production. The Usability Index (IU) for the rest of the reactions is estimated and, based on this parameter, they are classified into three categories: maximally-favourable, quasi-favourable and unfavourable for primary biomass production. A lower value of 1 - IU for a reaction suggests that the cell cannot easily bypass it for biomass production. While some of the alternative paths are energetically equally efficient, others demand a higher photon input. The variations in (i) ATP/NADPH ratio, (ii) exchange of metabolites through chloroplastic transporters and (iii) total biomass production are also presented here. Mutual metabolic dependencies of different cellular compartments are also demonstrated.
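The core of the analysis, FBA with in-silico reaction deletion, can be sketched on a toy network; the two-metabolite model below is hypothetical and bears no relation to the genome-scale rice leaf model.

```python
import numpy as np
from scipy.optimize import linprog

# A minimal sketch of flux balance analysis with a reaction deletion: maximise
# the biomass flux subject to steady state S.v = 0 and flux bounds; deleting a
# reaction means pinning its flux to zero and re-solving.

# columns: v0 uptake (-> A), v1 A -> B, v2 B -> biomass (objective), v3 A -> B (bypass)
S = np.array([[ 1, -1,  0, -1],       # metabolite A balance
              [ 0,  1, -1,  1]])      # metabolite B balance
bounds = [(0, 10), (0, 10), (0, 10), (0, 10)]
c = np.zeros(4)
c[2] = -1.0                           # linprog minimises, so negate the biomass flux

def max_biomass(deleted=None):
    b = list(bounds)
    if deleted is not None:
        b[deleted] = (0, 0)           # knock the reaction out
    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=b, method="highs")
    return -res.fun

print(max_biomass())                  # full network
print(max_biomass(deleted=1))         # bypassed via v3: biomass unchanged
print(max_biomass(deleted=2))         # essential reaction: biomass drops to 0
```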
Johannsen, Darcy L.; Galgani, Jose E.; Johannsen, Neil M.; Zhang, Zhengyu; Covington, Jeffrey D.; Ravussin, Eric
2012-01-01
The physiologic effects of triiodothyronine (T3) on metabolic rate are well-documented; however, the effects of thyroxine (T4) are less clear despite its wide-spread use to treat thyroid-related disorders and other non-thyroidal conditions. Here, we investigated the effects of acute (3-day) T4 supplementation on energy expenditure at rest and during incremental exercise. Furthermore, we used a combination of in situ and in vitro approaches to measure skeletal muscle metabolism before and after T4 treatment. Ten healthy, euthyroid males were given 200 µg T4 (levothyroxine) per day for 3 days. Energy expenditure was measured at rest and during exercise by indirect calorimetry, and skeletal muscle mitochondrial function was assessed by in situ ATP flux (31P MRS) and in vitro respiratory control ratio (RCR, state 3/state 4 rate of oxygen uptake using a Clark-type electrode) before and after acute T4 treatment. Thyroxine had a subtle effect on resting metabolic rate, increasing it by 4% (p = 0.059) without a change in resting ATP demand (i.e., ATP flux) of the vastus lateralis. Exercise efficiency did not change with T4 treatment. The maximal capacity to produce ATP (state 3 respiration) and the coupled state of the mitochondria (RCR) were reduced by approximately 30% with T4 (p = 0.057 and p = 0.04, respectively). Together, the results suggest that T4, although less metabolically active than T3, reduces skeletal muscle efficiency and modestly increases resting metabolism even after short-term supplementation. Our findings may be clinically relevant given the expanding application of T4 to treat non-thyroidal conditions such as obesity and weight loss. PMID:22844412
78 FR 33838 - DOE Participation in Development of the International Energy Conservation Code
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-05
... DEPARTMENT OF ENERGY Office of Energy Efficiency and Renewable Energy [Docket No. EERE-2012-BT-BC... Energy Efficiency and Renewable Energy, Department of Energy. ACTION: Notice and request for comment... Efficiency and Renewable Energy, Building Technologies Office, Mailstop EE-2J, 1000 Independence Avenue SW...
78 FR 55245 - Activities and Methodology for Assessing Compliance With Building Energy Codes
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-10
... DEPARTMENT OF ENERGY Office of Energy Efficiency and Renewable Energy [Docket No. EERE-2013-BT-BC... Energy Efficiency and Renewable Energy, Department of Energy. ACTION: Notice of reopening of public..., Office of Energy Efficiency and Renewable Energy, Building Technologies Program, Mailstop EE-2J, 1000...
Turbofan Duct Propagation Model
NASA Technical Reports Server (NTRS)
Lan, Justin H.; Posey, Joe W. (Technical Monitor)
2001-01-01
The CDUCT code utilizes a parabolic approximation to the convected Helmholtz equation in order to efficiently model acoustic propagation in acoustically treated, complex-shaped ducts. The parabolic approximation solves for one-way wave propagation with a marching method that neglects backward-reflected waves. The derivation of the parabolic approximation is presented. Several code validation cases are given. An acoustic lining design process for an example aft fan duct is discussed. It is noted that the method can efficiently model realistic three-dimensional effects, acoustic lining, and flow within the computational capabilities of a typical computer workstation.
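For orientation, the essential idea behind such a parabolic (one-way) approximation can be sketched in its simplest setting. The derivation below assumes no mean flow and rigid walls and is therefore only schematic of the full CDUCT formulation, which additionally carries mean flow and lining impedance.

```latex
% Schematic derivation (no mean flow, rigid walls); the CDUCT formulation
% additionally accounts for mean flow and wall lining impedance.
\begin{align*}
  &\text{Helmholtz equation:}
    & \partial_x^2 p + \nabla_\perp^2 p + k^2 p &= 0, \qquad k = \tfrac{\omega}{c_0} \\
  &\text{one-way factorisation:}
    & \bigl(\partial_x + i\sqrt{k^2 + \nabla_\perp^2}\,\bigr)
      \bigl(\partial_x - i\sqrt{k^2 + \nabla_\perp^2}\,\bigr)\, p &\approx 0 \\
  &\text{outgoing wave, } p = \psi\, e^{ikx}\text{:}
    & \partial_x \psi &\approx \frac{i}{2k}\, \nabla_\perp^2 \psi
\end{align*}
```

Keeping only the outgoing factor and expanding the square-root operator as \(\sqrt{k^2 + \nabla_\perp^2} \approx k + \nabla_\perp^2/(2k)\) yields an equation that is first order in the axial coordinate, so it can be integrated by marching from the fan face toward the duct exit. This is precisely why backward-reflected waves are neglected and why the scheme fits within the resources of a typical workstation.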
Inhibited Carnitine Synthesis Causes Systemic Alteration of Nutrient Metabolism in Zebrafish
Li, Jia-Min; Li, Ling-Yu; Qin, Xuan; Degrace, Pascal; Demizieux, Laurent; Limbu, Samwel M.; Wang, Xin; Zhang, Mei-Ling; Li, Dong-Liang; Du, Zhen-Yu
2018-01-01
Impaired mitochondrial fatty acid β-oxidation has been correlated with many metabolic syndromes, and the metabolic characteristics of mammalian models of mitochondrial dysfunction have been intensively studied. However, the effects of impaired mitochondrial fatty acid β-oxidation on systemic metabolism in teleosts have never been investigated. In the present study, we established a low-carnitine zebrafish model by feeding fish mildronate, a specific carnitine synthesis inhibitor [0.05% body weight (BW)/d], for 7 weeks, and measured the resulting systemic changes in nutrient metabolism, including carnitine and triglyceride (TG) concentrations, fatty acid (FA) β-oxidation capability, and other molecular and biochemical markers of lipid, glucose, and protein metabolism. The results indicated that mildronate markedly decreased hepatic carnitine concentrations while having no effect in muscle. Liver TG concentrations increased by more than 50% in mildronate-treated fish. Mildronate decreased the efficiency of liver mitochondrial β-oxidation, increased the hepatic mRNA expression of genes related to FA β-oxidation and lipolysis, and decreased the expression of lipogenesis genes. Mildronate decreased whole-body glycogen content, increased the glucose metabolism rate, and upregulated the expression of glucose uptake and glycolysis genes. Mildronate also increased whole-body protein content and hepatic mRNA expression of mechanistic target of rapamycin (mtor), and decreased the expression of a protein catabolism-related gene. Liver, rather than muscle, was the primary organ targeted by mildronate. In short, mildronate-induced inhibition of hepatic carnitine synthesis in zebrafish decreased mitochondrial FA β-oxidation efficiency and led to greater lipid accumulation and altered glucose and protein metabolism. This reveals the key roles of mitochondrial fatty acid β-oxidation in nutrient metabolism in fish, and this low-carnitine zebrafish model could also be used as a novel fish model for future metabolism studies. PMID:29867554