Sample records for evolutionary computation (EC)

  1. Evolutionary Technologies: Fundamentals and Applications to Information/Communication Systems and Manufacturing/Logistics Systems

    NASA Astrophysics Data System (ADS)

    Gen, Mitsuo; Kawakami, Hiroshi; Tsujimura, Yasuhiro; Handa, Hisashi; Lin, Lin; Okamoto, Azuma

    As efficient utilization of computational resources becomes increasingly important, evolutionary technology based on the Genetic Algorithm (GA), Genetic Programming (GP), Evolution Strategy (ES) and other Evolutionary Computations (ECs) is making rapid progress, and both its social recognition and the demand for it as an applied technology are growing. Compared with approaches based on conventional theories, EC offers greater robustness for knowledge information processing systems, intelligent production and logistics systems, state-of-the-art production scheduling, and various other real-world problems, and it remains flexibly applicable and useful in unknown system environments, even when accurate mathematical modeling of the problem fails. In this paper, we provide a comprehensive survey of the current state of the art in the fundamentals and applications of evolutionary technologies.
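
    Across GA, GP, and ES, the shared skeleton is a generational loop of selection, variation, and replacement. As a point of reference for the surveys collected here, the following is a minimal, self-contained sketch of that loop in Python; the OneMax objective and all parameter values are illustrative assumptions, not taken from any of these papers.

    ```python
    import random

    def evolve(fitness, n_bits=20, pop_size=50, generations=100,
               p_crossover=0.9, p_mutation=0.01):
        """Minimal generational GA: truncation selection, one-point
        crossover, bit-flip mutation. All constants are illustrative."""
        pop = [[random.randint(0, 1) for _ in range(n_bits)]
               for _ in range(pop_size)]
        for _ in range(generations):
            # Truncation selection: keep the better half as parents.
            parents = sorted(pop, key=fitness, reverse=True)[:pop_size // 2]
            children = []
            while len(children) < pop_size:
                a, b = random.sample(parents, 2)
                if random.random() < p_crossover:        # one-point crossover
                    cut = random.randrange(1, n_bits)
                    child = a[:cut] + b[cut:]
                else:
                    child = a[:]
                # Bit-flip mutation: each bit flips with probability p_mutation.
                children.append([bit ^ (random.random() < p_mutation)
                                 for bit in child])
            pop = children
        return max(pop, key=fitness)

    best = evolve(fitness=sum)        # OneMax: maximise the number of ones
    print(sum(best), "ones out of 20")
    ```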

  2. Development of X-TOOLSS: Preliminary Design of Space Systems Using Evolutionary Computation

    NASA Technical Reports Server (NTRS)

    Schnell, Andrew R.; Hull, Patrick V.; Turner, Mike L.; Dozier, Gerry; Alverson, Lauren; Garrett, Aaron; Reneau, Jarred

    2008-01-01

    Evolutionary computational (EC) techniques such as genetic algorithms (GA) have been identified as promising methods for exploring the design space of mechanical and electrical systems at the earliest stages of design. In this paper the authors summarize their research on the use of evolutionary computation to develop preliminary designs for various space systems. An evolutionary computational solver developed over the course of the research, X-TOOLSS (Exploration Toolset for the Optimization of Launch and Space Systems), is discussed. Following the success of early, low-fidelity example problems, an outline of work involving more computationally complex models is presented.

  3. Towards a Population Dynamics Theory for Evolutionary Computing: Learning from Biological Population Dynamics in Nature

    NASA Astrophysics Data System (ADS)

    Ma, Zhanshan (Sam)

    In evolutionary computing (EC), population size is one of the critical parameters that a researcher has to deal with. Hence, it was no surprise that the pioneers of EC, such as De Jong (1975) and Holland (1975), studied population sizing from the very beginning of EC. What is perhaps surprising is that, more than three decades later, we still largely depend on experience or an ad hoc trial-and-error approach to set the population size. For example, in a recent monograph, Eiben and Smith (2003) indicated: "In almost all EC applications, the population size is constant and does not change during the evolutionary search." Despite enormous research on this issue in recent years, we still lack a well-accepted theory of population sizing. In this paper, I propose to develop a population dynamics theory for EC inspired by the population dynamics theory of biological populations in nature. Essentially, the EC population is considered as a dynamic system over time (generations) and space (search space or fitness landscape), similar to the spatial and temporal dynamics of biological populations in nature. With this conceptual mapping, I propose to 'transplant' biological population dynamics theory to EC via three steps: (i) experimentally test the feasibility: whether or not emulating natural population dynamics improves EC performance; (ii) comparatively study the underlying mechanisms (why there are improvements), primarily via statistical modeling analysis; (iii) conduct theoretical analysis with theoretical models, such as percolation theory and extended evolutionary game theory, that are generally applicable to both EC and natural populations. This article is a summary of a series of studies we have performed to achieve this general goal [27], [30]-[32]. In the following, I start with an extremely brief introduction to the theory and models of natural population dynamics (Sections 1 & 2). In Sections 4 to 6, I briefly discuss three categories of population dynamics models: deterministic modeling with the logistic chaos map as an example, stochastic modeling with spatial distribution patterns as an example, and survival analysis together with extended evolutionary game theory (EEGT) modeling. Sample experimental results with genetic algorithms (GA) are presented to demonstrate the applications of these models. The proposed EC population dynamics approach also makes survival selection largely unnecessary, or much simplified, since the individuals are naturally selected (controlled) by the mathematical models of EC population dynamics.
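
    As one concrete reading of the deterministic (logistic chaos map) modeling category mentioned in this abstract, a population-sizing schedule can be driven by the logistic map. The sketch below is a hypothetical illustration only; the map parameter r, the seed, and the size bounds are arbitrary assumptions, not the paper's settings.

    ```python
    def logistic_population_sizes(n_generations, r=3.9, x0=0.5,
                                  min_size=20, max_size=200):
        """Schedule GA population sizes from the chaotic logistic map
        x' = r * x * (1 - x), rescaled into [min_size, max_size]."""
        sizes, x = [], x0
        for _ in range(n_generations):
            x = r * x * (1 - x)                  # chaotic for r near 3.9
            sizes.append(min_size + int(x * (max_size - min_size)))
        return sizes

    # Five generations' worth of chaotically varying population sizes.
    print(logistic_population_sizes(5))
    ```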

  4. Evolving Agents as a Metaphor for the Developing Child

    ERIC Educational Resources Information Center

    Schlesinger, Matthew

    2004-01-01

    The emerging field of Evolutionary Computation (EC), inspired by neo-Darwinian principles (e.g. natural selection, mutation, etc.), offers developmental psychologists a wide array of mathematical tools for simulating ontogenetic processes. In this brief review, I begin by highlighting three of the approaches that EC researchers employ (Artificial…

  5. Using evolutionary computations to understand the design and evolution of gene and cell regulatory networks.

    PubMed

    Spirov, Alexander; Holloway, David

    2013-07-15

    This paper surveys modeling approaches for studying the evolution of gene regulatory networks (GRNs). Modeling of the design or 'wiring' of GRNs has become increasingly common in developmental and medical biology as a means of quantifying gene-gene interactions, the response to perturbations, and the overall dynamic motifs of networks. Drawing from developments in GRN 'design' modeling, a number of groups are now using simulations to study how GRNs evolve, both for comparative genomics and to uncover general principles of evolutionary processes. Such work can generally be termed evolution in silico. Complementary to these biologically focused approaches, a now well-established field of computer science is Evolutionary Computation (EC), in which highly efficient optimization techniques are inspired by evolutionary principles. In surveying biological simulation approaches, we discuss the considerations that must be taken with respect to: (a) the precision and completeness of the data (e.g. whether the simulations aim for very close matches to anatomical data or for more general exploration of evolutionary principles); (b) the level of detail to model (we proceed from 'coarse-grained' evolution of simple gene-gene interactions to 'fine-grained' evolution at the DNA sequence level); (c) the degree to which it is important to include the genome's cellular context; and (d) the efficiency of computation. With respect to the latter, we argue that developments in computer science EC offer the means to perform more complete simulation searches and will lead to more comprehensive biological predictions. Copyright © 2013 Elsevier Inc. All rights reserved.
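
    The 'coarse-grained' end of the modeling spectrum described above is often formalized as evolving a gene-gene interaction matrix whose dynamics must reproduce a target expression pattern (a Wagner-style model). The sketch below is a generic illustration of that setup, not the specific simulations surveyed in the paper; the tanh dynamics, fitness measure, and all parameters are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def develop(W, steps=20):
        """Iterate coarse-grained GRN dynamics s' = tanh(W @ s)
        from a uniform initial state to an expression phenotype."""
        s = np.ones(W.shape[0])
        for _ in range(steps):
            s = np.tanh(W @ s)
        return s

    def evolve_grn(target, n_genes=5, pop_size=60, generations=200, sigma=0.1):
        """Evolve interaction matrices toward a target expression pattern."""
        pop = [rng.normal(0, 1, (n_genes, n_genes)) for _ in range(pop_size)]
        for _ in range(generations):
            # Fitness = squared distance of developed phenotype from target.
            pop.sort(key=lambda W: np.sum((develop(W) - target) ** 2))
            survivors = pop[:pop_size // 2]       # truncation selection
            pop = survivors + [W + rng.normal(0, sigma, W.shape)
                               for W in survivors]
        return pop[0]

    target = np.array([1.0, -1.0, 1.0, -1.0, 1.0])
    best = evolve_grn(target)
    print(np.round(develop(best), 2))             # should approach the target
    ```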

  6. Characterizing Conformational Dynamics of Proteins Using Evolutionary Couplings.

    PubMed

    Feng, Jiangyan; Shukla, Diwakar

    2018-01-25

    Understanding protein conformational dynamics is essential for elucidating the molecular origins of protein structure-function relationships. Traditionally, reaction coordinates, i.e., functions of protein atom positions and velocities, have been used to interpret the complex dynamics of proteins obtained from experimental and computational approaches such as molecular dynamics simulations. However, it is nontrivial to identify reaction coordinates a priori, even for small proteins. Here, we evaluate the power of evolutionary couplings (ECs) to capture protein dynamics by exploring their use as reaction coordinates, which can efficiently guide the sampling of a conformational free energy landscape. We have analyzed 10 diverse proteins and shown that a few ECs are sufficient to characterize the complex conformational dynamics of proteins involved in folding and conformational change processes. With the rapid strides in sequencing technology, we expect that ECs could help identify reaction coordinates a priori and enhance the sampling of the slow dynamical processes associated with protein folding and conformational change.

  7. Hybrid Architectures for Evolutionary Computing Algorithms

    DTIC Science & Technology

    2008-01-01

    [Garbled DTIC extract; only fragments of the slides' reference list are recoverable:] Burns, "... other EC algorithms to FPGA Core", P1026/MAPLD 2005; S. Scott, A. Samal, and S. Seth, "HGA: A Hardware Based Genetic Algorithm", Proceedings of the 1995 ACM Third ...; Scott, S. D., Samal, A., and ..., ... on Parallel and Distributed Processing (IPPS/SPDP '98), pp. 316-320, IEEE Computer Society, 1998.

  8. Gender approaches to evolutionary multi-objective optimization using pre-selection of criteria

    NASA Astrophysics Data System (ADS)

    Kowalczuk, Zdzisław; Białaszewski, Tomasz

    2018-01-01

    A novel idea for performing evolutionary computations (ECs) to solve high-dimensional multi-objective optimization (MOO) problems is proposed. Following the general idea of evolution, information about gender is used to distinguish between various groups of objectives and to identify the (aggregate) nature of optimality of individuals (solutions). This identification is drawn from the fitness of individuals and applied during parental crossover in the processes of evolutionary multi-objective optimization (EMOO). The article introduces the principles of the genetic-gender approach (GGA) and the virtual gender approach (VGA), which are not just evolutionary techniques but constitute a new rule (philosophy) for solving MOO tasks. The proposed approaches are validated against principal state-of-the-art EMOO algorithms on benchmark problems in the light of recognized EC performance criteria. The research shows the superiority of the gender approach in terms of effectiveness, reliability, transparency, intelligibility and MOO problem simplification, demonstrating the usefulness and practicability of GGA and VGA. Moreover, an important feature of GGA and VGA is that they alleviate the 'curse of dimensionality' typical of many engineering designs.
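
    The gender idea can be pictured as partitioning the objectives into groups, assigning each individual the 'gender' of the group it serves best, and crossing parents of different genders. The sketch below is a loose, hypothetical illustration of that pairing principle only; the actual GGA/VGA mechanics (fitness aggregation, gender inheritance, pre-selection of criteria) are more involved than shown.

    ```python
    import random

    def pair_parents_by_gender(population, objective_groups):
        """Assign each individual the 'gender' of the objective group it
        minimises best, then draw the two crossover parents from different
        genders. Grouping and aggregation rules here are assumptions."""
        by_gender = {}
        for x in population:
            g = min(range(len(objective_groups)),
                    key=lambda i: sum(f(x) for f in objective_groups[i]))
            by_gender.setdefault(g, []).append(x)
        genders = [g for g in by_gender if by_gender[g]]
        if len(genders) < 2:              # degenerate case: one gender only
            return tuple(random.sample(population, 2))
        g1, g2 = random.sample(genders, 2)
        return random.choice(by_gender[g1]), random.choice(by_gender[g2])

    # Two objective groups over 2-D points: minimise x versus minimise y.
    pop = [(random.random(), random.random()) for _ in range(10)]
    groups = [[lambda p: p[0]], [lambda p: p[1]]]
    print(pair_parents_by_gender(pop, groups))
    ```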

  9. Kernel Method Based Human Model for Enhancing Interactive Evolutionary Optimization

    PubMed Central

    Zhao, Qiangfu; Liu, Yong

    2015-01-01

    A fitness landscape represents the relationship between an individual and its reproductive success in evolutionary computation (EC). However, a discrete and approximate landscape in the original search space may not supply sufficient or accurate information for EC search, especially in interactive EC (IEC). The fitness landscape of human subjective evaluation in IEC is very difficult, if not impossible, to model, even under a hypothesis of what its definition might be. In this paper, we propose a method to establish a human model in a projected high-dimensional search space by kernel classification for enhancing IEC search. Because bivalent logic is the simplest perceptual paradigm, the human model is established according to this principle. In feature space, we design a linear classifier as a human model to capture user preference knowledge that cannot be represented linearly in the original discrete search space. The human model established in this way predicts the user's potential perceptual knowledge. With the human model, we design an evolution control method to enhance IEC search. Experimental evaluation results with a pseudo-IEC user show that our proposed model and method can enhance IEC search significantly. PMID:25879050
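
    In outline, the proposal amounts to fitting a kernel classifier to the user's past binary (accept/reject) judgements and using it as a surrogate to pre-screen offspring before the user sees them. The sketch below illustrates that idea with an off-the-shelf RBF-kernel SVM standing in for the paper's classifier; the synthetic 'user' and the 0.7 screening threshold are assumptions.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)

    # Past IEC interactions: candidate feature vectors plus the user's
    # binary accept/reject judgements -- the bivalent paradigm above.
    X_seen = rng.normal(size=(40, 6))
    y_seen = (X_seen[:, 0] + X_seen[:, 1] > 0).astype(int)  # stand-in user

    # Kernel classifier as a surrogate 'human model' in feature space.
    human_model = SVC(kernel="rbf", probability=True).fit(X_seen, y_seen)

    # Evolution control: show the user only offspring the surrogate
    # predicts they are likely to accept, reducing evaluation fatigue.
    offspring = rng.normal(size=(100, 6))
    likely = offspring[human_model.predict_proba(offspring)[:, 1] > 0.7]
    print(f"{len(likely)} of 100 offspring pass the surrogate pre-screen")
    ```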

  10. Exploiting Genomic Knowledge in Optimising Molecular Breeding Programmes: Algorithms from Evolutionary Computing

    PubMed Central

    O'Hagan, Steve; Knowles, Joshua; Kell, Douglas B.

    2012-01-01

    Comparatively few studies have addressed directly the question of quantifying the benefits to be had from using molecular genetic markers in experimental breeding programmes (e.g. for improved crops and livestock), nor the question of which organisms should be mated with each other to best effect. We argue that this requires in silico modelling, an approach for which there is a large literature in the field of evolutionary computation (EC), but which has not really been applied in this way to experimental breeding programmes. EC seeks to optimise measurable outcomes (phenotypic fitnesses) by optimising in silico the mutation, recombination and selection regimes that are used. We review some of the approaches from EC, and compare experimentally, using a biologically relevant in silico landscape, some algorithms that have knowledge of where they are in the (genotypic) search space (G-algorithms) with some (albeit well-tuned ones) that do not (F-algorithms). For the present kinds of landscapes, F- and G-algorithms were broadly comparable in quality and effectiveness, although we recognise that the G-algorithms were not equipped with any ‘prior knowledge’ of epistatic pathway interactions. This use of algorithms based on machine learning has important implications for the optimisation of experimental breeding programmes in the post-genomic era when we shall potentially have access to the full genome sequence of every organism in a breeding population. The non-proprietary code that we have used is made freely available (via Supplementary information). PMID:23185279
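
    A minimal version of the in silico breeding loop compared in the paper: selection acts only on a measured phenotype (an F-algorithm in the authors' terms), and selected parents are mated by free recombination of marker loci. The additive toy trait below is an assumption, far simpler than the biologically relevant landscape the authors used.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def breed(phenotype, pop, n_rounds=10, keep=0.2):
        """F-algorithm-style breeding: truncation-select on the measured
        phenotype only, then mate parents by free recombination of loci."""
        for _ in range(n_rounds):
            ranked = pop[np.argsort(-phenotype(pop))]
            parents = ranked[: max(2, int(keep * len(pop)))]
            dams = parents[rng.integers(0, len(parents), len(pop))]
            sires = parents[rng.integers(0, len(parents), len(pop))]
            mask = rng.integers(0, 2, pop.shape).astype(bool)
            pop = np.where(mask, dams, sires)    # free recombination
        return pop

    pop0 = rng.integers(0, 2, (100, 30))         # 100 lines x 30 biallelic loci
    additive = lambda pop: pop.sum(axis=1)       # toy additive trait
    print(breed(additive, pop0).sum(axis=1).mean())
    ```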
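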

  11. An Evolutionary Computation Approach to Examine Functional Brain Plasticity.

    PubMed

    Roy, Arnab; Campbell, Colin; Bernier, Rachel A; Hillary, Frank G

    2016-01-01

    One common research goal in systems neuroscience is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well suited to the study of developmental processes, learning, and even recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signals representing each region. The drawback of this approach is that much information is lost by averaging heterogeneous voxels; therefore, functional relationships within an ROI pair that evolve at a spatial scale much finer than the ROIs remain undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI pair, by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC-based procedure is able to detect functional plasticity where a traditional averaging-based approach fails. The subject-specific plasticity estimates obtained using the EC procedure are highly consistent across multiple runs. Group-level analyses using these plasticity estimates showed an increase in the strength of the functional relationship between the DMN and ECN for TBI subjects, which is consistent with prior findings in the TBI literature. The EC approach also allowed us to separate sub-regional pairs contributing to positive and negative plasticity; the detected sub-regional pairs overlap significantly across runs, highlighting the reliability of the EC approach. These sub-regional pairs may be useful in performing nuanced analyses of brain-behavior relationships during recovery from TBI.
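
    The core objective of such a search can be stated compactly: over candidate voxel subsets of the two ROIs, maximize the change in coupling strength between sessions. The sketch below shows only that objective, under assumed data shapes; the paper's EC search over masks, its spatial-connectivity constraint on voxels, and its significance testing are all omitted.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def coupling(roi_a, roi_b, mask_a, mask_b):
        """Correlation between the mean BOLD signals of two voxel subsets."""
        return np.corrcoef(roi_a[:, mask_a].mean(axis=1),
                           roi_b[:, mask_b].mean(axis=1))[0, 1]

    def plasticity(mask_a, mask_b, s1_a, s1_b, s2_a, s2_b):
        """Objective an EC search would maximise: change in coupling
        strength between session 1 and session 2 for a candidate pair."""
        return (abs(coupling(s2_a, s2_b, mask_a, mask_b))
                - abs(coupling(s1_a, s1_b, mask_a, mask_b)))

    # Toy data: 120 time points x 50 voxels per ROI, two sessions.
    s1_a, s1_b, s2_a, s2_b = (rng.normal(size=(120, 50)) for _ in range(4))
    mask = rng.random(50) < 0.3                  # one candidate voxel subset
    print(plasticity(mask, mask, s1_a, s1_b, s2_a, s2_b))
    ```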

  12. A tale of three bio-inspired computational approaches

    NASA Astrophysics Data System (ADS)

    Schaffer, J. David

    2014-05-01

    I will provide a high-level walk-through of three computational approaches derived from Nature. First, evolutionary computation implements what we may call the "mother of all adaptive processes." Some variants on the basic algorithms will be sketched, and some lessons I have gleaned from three decades of working with EC will be covered. Next come neural networks, computational approaches that have long been studied as possible ways to make "thinking machines", an old dream of humanity, and that are based upon the only known existing example of intelligence. I will then give a brief overview of attempts to combine these two approaches, which some hope will allow us to evolve machines we could never hand-craft. Finally, I will touch on artificial immune systems, Nature's highly sophisticated defense mechanism, which has emerged in two major stages, the innate and the adaptive immune systems. This technology is finding applications in the cyber security world.

  13. Thermodynamic System Drift in Protein Evolution

    PubMed Central

    Hart, Kathryn M.; Harms, Michael J.; Schmidt, Bryan H.; Elya, Carolyn; Thornton, Joseph W.; Marqusee, Susan

    2014-01-01

    Proteins from thermophiles are generally more thermostable than their mesophilic homologs, but little is known about the evolutionary process driving these differences. Here we attempt to understand how the diverse thermostabilities of bacterial ribonuclease H1 (RNH) proteins evolved. RNH proteins from Thermus thermophilus (ttRNH) and Escherichia coli (ecRNH) share similar structures but differ in melting temperature (Tm) by 20°C. ttRNH's greater stability is caused in part by the presence of residual structure in the unfolded state, which results in a low heat capacity of unfolding (ΔCp) relative to ecRNH. We first characterized RNH proteins from a variety of extant bacteria and found that Tm correlates with the species' growth temperatures, consistent with environmental selection for stability. We then used ancestral sequence reconstruction to statistically infer evolutionary intermediates along lineages leading to ecRNH and ttRNH from their common ancestor, which existed approximately 3 billion years ago. Finally, we synthesized and experimentally characterized these intermediates. The shared ancestor has a melting temperature between those of ttRNH and ecRNH; the Tms of intermediate ancestors along the ttRNH lineage increased gradually over time, while the ecRNH lineage exhibited an abrupt drop in Tm followed by relatively little change. To determine whether the underlying mechanisms for thermostability correlate with the changes in Tm, we measured the thermodynamic basis for stabilization—ΔCp and other thermodynamic parameters—for each of the ancestors. We observed that, while the Tm changes smoothly, the mechanistic basis for stability fluctuates over evolutionary time. Thus, even while overall stability appears to be strongly driven by selection, the proteins explored a wide variety of mechanisms of stabilization, a phenomenon we call “thermodynamic system drift.” This suggests that even on lineages with strong selection to increase stability, proteins have wide latitude to explore sequence space, generating biophysical diversity and potentially opening new evolutionary pathways. PMID:25386647

  14. Comparative genomics of enterohemorrhagic Escherichia coli O145:H28 demonstrates a common evolutionary lineage with Escherichia coli O157:H7

    PubMed Central

    2014-01-01

    Background: Although serotype O157:H7 is the predominant enterohemorrhagic Escherichia coli (EHEC), outbreaks of non-O157 EHEC that cause severe foodborne illness, including hemolytic uremic syndrome, have increased worldwide. In fact, non-O157 serotypes are now estimated to cause over half of all Shiga toxin-producing Escherichia coli (STEC) cases, and outbreaks of non-O157 EHEC infections are frequently associated with serotypes O26, O45, O103, O111, O121, and O145. Currently, there are no complete genomes for O145 in public databases. Results: We determined the complete genome sequences of two O145 strains (EcO145), one linked to a US lettuce-associated outbreak (RM13514) and one to a Belgian ice-cream-associated outbreak (RM13516). Both strains contain one chromosome and two large plasmids, with genome sizes of 5,737,294 bp for RM13514 and 5,559,008 bp for RM13516. Comparative analysis of the two EcO145 genomes revealed a large core (5,173 genes) and a considerable number of strain-specific genes. Additionally, the two EcO145 genomes display distinct chromosomal architecture, virulence gene profiles, phylogenetic origin of the Stx2a prophage, and methylation profiles (methylomes). Comparative analysis of the EcO145 genomes with other completely sequenced STEC, E. coli, and Shigella genomes revealed that, unlike any other known non-O157 EHEC strain, EcO145 ascended from a common lineage with EcO157/EcO55. This evolutionary relationship was further supported by pangenome analysis of the 10 EHEC strains. Of the 4,192 EHEC core genes, EcO145 shares more genes with EcO157 than with any other non-O157 EHEC strain. Conclusions: Our data provide evidence that EcO145 and EcO157 evolved from a common lineage, but that each serotype subsequently evolved to EHEC in a lineage-independent manner by acquiring the core set of EHEC virulence factors, including the genes encoding Shiga toxin and the large virulence plasmid. The large variation between the two EcO145 genomes suggests distinct evolutionary paths for the two outbreak strains. The distinct methylomes of the two EcO145 strains are likely due to the presence of a BsuBI/PstI methyltransferase gene cassette in the Stx2a prophage of strain RM13514, suggesting a role for horizontal gene transfer-mediated epigenetic alteration in the evolution of individual EHEC strains. PMID:24410921

  15. Transcription factor COUP-TFII is indispensable for venous and lymphatic development in zebrafish and Xenopus laevis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aranguren, Xabier L., E-mail: xabier.lopezaranguren@med.kuleuven.be; Beerens, Manu, E-mail: manu.beerens@med.kuleuven.be; Vandevelde, Wouter, E-mail: woutervandevelde@gmail.com

    Highlights: → COUP-TFII deficiency in zebrafish affects arterio-venous EC specification. → COUP-TFII is indispensable for lymphatic development in zebrafish. → COUP-TFII knockdown in Xenopus disrupts lymphatic EC differentiation and migration. → COUP-TFII's role in EC fate decisions is evolutionarily conserved. -- Abstract: Transcription factors play a central role in cell fate determination. Gene targeting in mice revealed that Chicken Ovalbumin Upstream Promoter-Transcription Factor II (COUP-TFII, also known as Nuclear Receptor 2F2 or NR2F2) induces a venous phenotype in endothelial cells (ECs). More recently, NR2F2 was shown to be required for initiating the expression of Prox1, responsible for the lymphatic commitment of venous ECs. Small animal models such as zebrafish embryos and Xenopus laevis tadpoles have been very useful for elucidating mechanisms of (lymph)vascular development. Therefore, the role of NR2F2 in (lymph)vascular development was studied by eliminating its expression in these models. As in mice, absence of NR2F2 in zebrafish resulted in distinct vascular defects, including loss of venous marker expression, major trunk vessel fusion and vascular leakage. In both zebrafish and Xenopus, the development of the main lymphatic structures was severely hampered. NR2F2 knockdown significantly decreased prox1 expression in zebrafish ECs, and the same manipulation affected lymphatic (L)EC commitment, migration and function in Xenopus tadpoles. Therefore, the role of NR2F2 in EC fate determination is evolutionarily conserved.

  16. PROBING ELECTRON-CAPTURE SUPERNOVAE: X-RAY BINARIES IN STARBURSTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linden, T.; Sepinsky, J. F.; Kalogera, V.

    We develop population models of high-mass X-ray binaries (HMXBs) formed after bursts of star formation, and we investigate the effect of electron-capture supernovae (ECS) of massive ONeMg white dwarfs and the hypothesis that ECS events are associated with typically low supernova kicks imparted to the nascent neutron stars. We identify an interesting ECS bump in the time evolution of HMXB numbers; this bump is caused by significantly increased production of wind-fed HMXBs 20-60 Myr post-starburst. The amplitude and age extent of the ECS bump depend on the strength of ECS kicks and the mass range of ECS progenitors. We also find that ECS-HMXBs form through a specific evolutionary channel that is expected to lead to binaries with Be donors in wide orbits. These characteristics, along with their sensitivity to ECS properties, provide us with an intriguing opportunity to probe ECS physics and progenitors through studies of starbursts of different ages. Specifically, the case of the Small Magellanic Cloud, with a significant observed population of Be-HMXBs and starburst activity 30-60 Myr ago, arises as a promising laboratory for understanding the role of ECS in neutron star formation.

  17. Molecular anatomy of the developing limb in the coquí frog, Eleutherodactylus coqui.

    PubMed

    Gross, Joshua B; Kerney, Ryan; Hanken, James; Tabin, Clifford J

    2011-01-01

    The vertebrate limb demonstrates remarkable similarity in basic organization across phylogenetically disparate groups. To gain further insight into how this morphological similarity is maintained in different developmental contexts, we explored the molecular anatomy of size-reduced embryos of the Puerto Rican coquí frog, Eleutherodactylus coqui. This animal demonstrates direct development, a life-history strategy marked by rapid progression from egg to adult and absence of a free-living, aquatic larva. Nonetheless, coquí exhibits a basal anuran limb structure, with four toes on the forelimb and five toes on the hind limb. We investigated the extent to which coquí limb bud development conforms to the model of limb development derived from amniote studies. Toward this end, we characterized dynamic patterns of expression for 13 critical patterning genes across three principal stages of limb development. As expected, most genes demonstrate expression patterns that are essentially unchanged compared to amniote species. For example, we identified an EcFgf8-expression domain within the apical ectodermal ridge (AER). This expression pattern defines a putatively functional AER signaling domain, despite the absence of a morphological ridge in coquí embryos. However, two genes, EcMeis2 and EcAlx4, demonstrate altered domains of expression, which imply a potential shift in gene function between coquí frogs and amniote model systems. Unexpectedly, several genes thought to be critical for limb patterning in other systems, including EcFgf4, EcWnt3a, EcWnt7a, and EcGremlin, demonstrated no evident expression pattern in the limb at the three stages we analyzed. The absence of EcFgf4 and EcWnt3a expression during limb patterning is perhaps not surprising, given that neither gene is critical for proper limb development in the mouse, based on knockout and expression analyses. In contrast, absence of EcWnt7a and EcGremlin is surprising, given that expression of these molecules appears to be absolutely essential in all other model systems so far examined. Although this analysis substantiates the existence of a core set of ancient limb-patterning molecules, which likely mediate identical functions across highly diverse vertebrate forms, it also reveals remarkable evolutionary flexibility in the genetic control of a conserved morphological pattern across evolutionary time. © 2011 Wiley Periodicals, Inc.

  18. Cannabimimetic phytochemicals in the diet - an evolutionary link to food selection and metabolic stress adaptation?

    PubMed

    Gertsch, Jürg

    2017-06-01

    The endocannabinoid system (ECS) is a major lipid signalling network that plays important pro-homeostatic (allostatic) roles not only in the nervous system but also in peripheral organs. There is increasing evidence that there is a dietary component in the modulation of the ECS. Cannabinoid receptors in hominids co-evolved with diet, and the ECS constitutes a feedback loop for food selection and energy metabolism. Here, it is postulated that the mismatch of ancient lipid genes of hunter-gatherers and pastoralists with the high-carbohydrate diet introduced by agriculture could be compensated for via dietary modulation of the ECS. In addition to the fatty acid precursors of endocannabinoids, the potential role of dietary cannabimimetic phytochemicals in agriculturist nutrition is discussed. Dietary secondary metabolites from vegetables and spices able to enhance the activity of cannabinoid-type 2 (CB2) receptors may provide adaptive metabolic advantages and counteract inflammation. In contrast, chronic CB1 receptor activation in hedonic obese individuals may enhance pathophysiological processes related to hyperlipidaemia, diabetes, hepatorenal inflammation and cardiometabolic risk. Food able to modulate the CB1/CB2 receptor activation ratio may thus play a role in the nutrition transition of Western high-calorie diets. In this review, the interplay between diet and the ECS is highlighted from an evolutionary perspective. The emerging potential of cannabimimetic food as a nutraceutical strategy is critically discussed. This article is part of a themed section on Principles of Pharmacological Research of Nutraceuticals. To view the other articles in this section visit http://onlinelibrary.wiley.com/doi/10.1111/bph.v174.11/issuetoc. © 2016 The British Pharmacological Society.

  19. Cannabimimetic phytochemicals in the diet – an evolutionary link to food selection and metabolic stress adaptation?*

    PubMed Central

    2017-01-01

    The endocannabinoid system (ECS) is a major lipid signalling network that plays important pro‐homeostatic (allostatic) roles not only in the nervous system but also in peripheral organs. There is increasing evidence that there is a dietary component in the modulation of the ECS. Cannabinoid receptors in hominids co‐evolved with diet, and the ECS constitutes a feedback loop for food selection and energy metabolism. Here, it is postulated that the mismatch of ancient lipid genes of hunter‐gatherers and pastoralists with the high‐carbohydrate diet introduced by agriculture could be compensated for via dietary modulation of the ECS. In addition to the fatty acid precursors of endocannabinoids, the potential role of dietary cannabimimetic phytochemicals in agriculturist nutrition is discussed. Dietary secondary metabolites from vegetables and spices able to enhance the activity of cannabinoid‐type 2 (CB2) receptors may provide adaptive metabolic advantages and counteract inflammation. In contrast, chronic CB1 receptor activation in hedonic obese individuals may enhance pathophysiological processes related to hyperlipidaemia, diabetes, hepatorenal inflammation and cardiometabolic risk. Food able to modulate the CB1/CB2 receptor activation ratio may thus play a role in the nutrition transition of Western high‐calorie diets. In this review, the interplay between diet and the ECS is highlighted from an evolutionary perspective. The emerging potential of cannabimimetic food as a nutraceutical strategy is critically discussed. Linked Articles This article is part of a themed section on Principles of Pharmacological Research of Nutraceuticals. To view the other articles in this section visit http://onlinelibrary.wiley.com/doi/10.1111/bph.v174.11/issuetoc PMID:27891602

  20. Structure and Sequence Analyses of Clustered Protocadherins Reveal Antiparallel Interactions that Mediate Homophilic Specificity.

    PubMed

    Nicoludis, John M; Lau, Sze-Yi; Schärfe, Charlotta P I; Marks, Debora S; Weihofen, Wilhelm A; Gaudet, Rachelle

    2015-11-03

    Clustered protocadherin (Pcdh) proteins mediate dendritic self-avoidance in neurons via specific homophilic interactions in their extracellular cadherin (EC) domains. We determined crystal structures of EC1-EC3, containing the homophilic specificity-determining region, of two mouse clustered Pcdh isoforms (PcdhγA1 and PcdhγC3) to investigate the nature of the homophilic interaction. Within the crystal lattices, we observe antiparallel interfaces consistent with a role in trans cell-cell contact. Antiparallel dimerization is supported by evolutionary correlations. Two interfaces, located primarily on EC2-EC3, involve distinctive clustered Pcdh structure and sequence motifs, lack predicted glycosylation sites, and contain residues highly conserved in orthologs but not paralogs, pointing toward their biological significance as homophilic interaction interfaces. These two interfaces are similar yet distinct, reflecting a possible difference in interaction architecture between clustered Pcdh subfamilies. These structures initiate a molecular understanding of clustered Pcdh assemblies that are required to produce functional neuronal networks. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Assessing in silico the recruitment and functional spectrum of bacterial enzymes from secondary metabolism.

    PubMed

    Veprinskiy, Valery; Heizinger, Leonhard; Plach, Maximilian G; Merkl, Rainer

    2017-01-26

    Microbes, plants, and fungi synthesize an enormous number of metabolites exhibiting rich chemical diversity. For a high-level classification, metabolism is subdivided into primary (PM) and secondary (SM) metabolism. SM products are often not essential for survival of the organism, and it is generally assumed that SM enzymes stem from PM homologs. We wanted to assess the evolutionary relationships and functions of bona fide bacterial PM and SM enzymes. Thus, we analyzed the content of 1010 biosynthetic gene clusters (BGCs) from the MIBiG dataset; the encoded bacterial enzymes served as representatives of SM. The content of 15 bacterial genomes known not to harbor BGCs served as a representation of PM. Enzymes were categorized by their EC (Enzyme Commission) number, and frequencies were determined for these enzyme functions. The comparison of PM/SM frequencies indicates a certain preference for hydrolases (EC class 3) and ligases (EC class 6) in PM and for oxidoreductases (EC class 1) and lyases (EC class 4) in SM. Based on BLAST searches, we determined pairs of PM/SM homologs and their functional diversity. Oxidoreductases, transferases (EC class 2), lyases and isomerases (EC class 5) form a tightly interlinked network, indicating that many protein folds can accommodate different functions in PM and SM. In contrast, the functional diversity of hydrolases and especially ligases is significantly limited in PM and SM. For the most direct comparison of PM/SM homologs, we restricted the search for each BGC to the content of the genome it comes from. For each homologous hit, the contribution of the genomic neighborhood to metabolic pathways was summarized in BGC-specific HTML pages that are interlinked with KEGG; this dataset can be downloaded from https://www.bioinf.ur.de. Only a few reaction chemistries are overrepresented in bacterial SM, and at least 55% of the enzymatic functions present in BGCs possess PM homologs. Many SM enzymes arose in PM, and Nature utilized the evolvability of enzymes similarly to establish novel functions in both PM and SM. Future work aimed at elucidating the evolutionary routes that have interconverted a PM enzyme into an SM homolog can profit from our BGC-specific annotations.

  2. Cooperative combinatorial optimization: evolutionary computation case study.

    PubMed

    Burgin, Mark; Eberbach, Eugene

    2008-01-01

    This paper presents a formalization of the notion of cooperation and competition of multiple systems that work toward a common optimization goal of the population using evolutionary computation techniques. It is proved that evolutionary algorithms are more expressive than conventional recursive algorithms, such as Turing machines. Three classes of evolutionary computations are introduced and studied: bounded finite, unbounded finite, and infinite computations. Universal evolutionary algorithms are constructed. Such properties of evolutionary algorithms as completeness, optimality, and search decidability are examined. A natural extension of evolutionary Turing machine (ETM) model is proposed to properly reflect phenomena of cooperation and competition in the whole population.

  3. Algorithmic Mechanism Design of Evolutionary Computation.

    PubMed

    Pei, Yan

    2015-01-01

    We consider algorithmic design, enhancement, and improvement of evolutionary computation as a mechanism design problem. All individuals or several groups of individuals can be considered as self-interested agents. The individuals in evolutionary computation can manipulate parameter settings and operations by satisfying their own preferences, which are defined by an evolutionary computation algorithm designer, rather than by following a fixed algorithm rule. Evolutionary computation algorithm designers or self-adaptive methods should construct proper rules and mechanisms for all agents (individuals) to conduct their evolutionary behaviour correctly, in order to reliably achieve the desired, preset objective(s). As a case study, we propose a formal framework for parameter setting, strategy selection, and algorithmic design of evolutionary computation by considering the Nash strategy equilibrium of a mechanism design in the search process. The evaluation results demonstrate the efficiency of the framework. This primary principle can be implemented in any evolutionary computation algorithm that needs to consider strategy selection issues in its optimization process. The final objective of our work is to treat evolutionary computation design as an algorithmic mechanism design problem and to establish its fundamental aspects by taking this perspective. This paper is the first step towards achieving this objective, implementing a strategy equilibrium solution (such as a Nash equilibrium) in an evolutionary computation algorithm.

  4. Algorithmic Mechanism Design of Evolutionary Computation

    PubMed Central

    2015-01-01

    We consider algorithmic design, enhancement, and improvement of evolutionary computation as a mechanism design problem. All individuals or several groups of individuals can be considered as self-interested agents. The individuals in evolutionary computation can manipulate parameter settings and operations by satisfying their own preferences, which are defined by an evolutionary computation algorithm designer, rather than by following a fixed algorithm rule. Evolutionary computation algorithm designers or self-adaptive methods should construct proper rules and mechanisms for all agents (individuals) to conduct their evolutionary behaviour correctly, in order to reliably achieve the desired, preset objective(s). As a case study, we propose a formal framework for parameter setting, strategy selection, and algorithmic design of evolutionary computation by considering the Nash strategy equilibrium of a mechanism design in the search process. The evaluation results demonstrate the efficiency of the framework. This primary principle can be implemented in any evolutionary computation algorithm that needs to consider strategy selection issues in its optimization process. The final objective of our work is to treat evolutionary computation design as an algorithmic mechanism design problem and to establish its fundamental aspects by taking this perspective. This paper is the first step towards achieving this objective, implementing a strategy equilibrium solution (such as a Nash equilibrium) in an evolutionary computation algorithm. PMID:26257777

  5. The National Education Association's Educational Computer Service. An Assessment.

    ERIC Educational Resources Information Center

    Software Publishers Association, Washington, DC.

    The Educational Computer Service (ECS) of the National Education Association (NEA) evaluates and distributes educational software. An investigation of ECS was conducted by the Computer Education Committee of the Software Publishers Association (SPA) at the request of SPA members. The SPA found that the service, as it is presently structured, is…

  6. Advanced Communications Technology Satellite high burst rate link evaluation terminal experiment control and monitor software maintenance manual, version 1.0

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    1992-01-01

    The Experiment Control and Monitor (EC&M) software was developed at NASA Lewis Research Center to support the Advanced Communications Technology Satellite (ACTS) High Burst Rate Link Evaluation Terminal (HBR-LET). The HBR-LET is an experimenter's terminal to communicate with the ACTS for various investigations by government agencies, universities, and industry. The EC&M software is one segment of the Control and Performance Monitoring (C&PM) software system of the HBR-LET. The EC&M software allows users to initialize, control, and monitor the instrumentation within the HBR-LET using a predefined sequence of commands. Besides instrument control, the C&PM software system is also responsible for computer communication between the HBR-LET and the ACTS NASA Ground Station and for uplink power control of the HBR-LET to demonstrate power augmentation during rain fade events. The EC&M Software User's Guide, Version 1.0 (NASA-CR-189160) outlines the commands required to install and operate the EC&M software. Input and output file descriptions, operator commands, and error recovery procedures are discussed in the document. The EC&M Software Maintenance Manual, Version 1.0 (NASA-CR-189161) is a programmer's guide that describes current implementation of the EC&M software from a technical perspective. An overview of the EC&M software, computer algorithms, format representation, and computer hardware configuration are included in the manual.

  7. Wearable biosensor systems and resilience: a perfect storm in health care?

    PubMed

    Drury, Robert L

    2014-01-01

    We begin by placing our discussion in the context of the chronic crisis in medical care, noting key features, including economic, safety and conceptual challenges. Then we review the most promising elements of a broadened conceptual approach to health and wellbeing, which include an expanded role for psychological, social, cultural, spiritual and environmental variables. The contributions of positive and evolutionary psychology, complex adaptive systems theory, genomics and neuroscience are described and the rapidly developing synthetic field of resilience as a catalytic unifying development is traced in some detail, including analysis of the rapidly growing empirical literature on resilience and its constituents, particularly heart rate variability. Finally, a review of the use of miniaturized ambulatory data collection, analysis and self-management and health management systems points out an exemplar, the Extensive Care System (ECS), which takes advantage of the continuing advances in biosensor technology, computing power, networking dynamics and social media to facilitate not only personalized health and wellbeing, but higher quality evidence-based preventive, treatment and epidemiological outcomes. This development will challenge the acute care episode model typified by the ER or ICU stay and replace it with an ECS capable of facilitating not only healthy autonomic functioning, but both ipsative/individual and normative/population health.

  8. Wearable biosensor systems and resilience: a perfect storm in health care?

    PubMed Central

    Drury, Robert L.

    2014-01-01

    We begin by placing our discussion in the context of the chronic crisis in medical care, noting key features, including economic, safety and conceptual challenges. Then we review the most promising elements of a broadened conceptual approach to health and wellbeing, which include an expanded role for psychological, social, cultural, spiritual and environmental variables. The contributions of positive and evolutionary psychology, complex adaptive systems theory, genomics and neuroscience are described and the rapidly developing synthetic field of resilience as a catalytic unifying development is traced in some detail, including analysis of the rapidly growing empirical literature on resilience and its constituents, particularly heart rate variability. Finally, a review of the use of miniaturized ambulatory data collection, analysis and self-management and health management systems points out an exemplar, the Extensive Care System (ECS), which takes advantage of the continuing advances in biosensor technology, computing power, networking dynamics and social media to facilitate not only personalized health and wellbeing, but higher quality evidence-based preventive, treatment and epidemiological outcomes. This development will challenge the acute care episode model typified by the ER or ICU stay and replace it with an ECS capable of facilitating not only healthy autonomic functioning, but both ipsative/individual and normative/population health. PMID:25147531

  9. High Performance Computing (HPC) Innovation Service Portal Pilots Cloud Computing (HPC-ISP Pilot Cloud Computing)

    DTIC Science & Technology

    2011-08-01

    [Figure residue from the DTIC extract; recoverable captions only:] Figure 4: Architectural diagram of running Blender on Amazon EC2 through Nimbis. [A second figure concerned classification of streaming data: example input images (top left) and all digit prototypes (cluster centers) found, with size proportional to frequency.]

  10. Computed tomography detection of extracapsular spread of squamous cell carcinoma of the head and neck in metastatic cervical lymph nodes.

    PubMed

    Carlton, Joshua A; Maxwell, Adam W; Bauer, Lyndsey B; McElroy, Sara M; Layfield, Lester J; Ahsan, Humera; Agarwal, Ajay

    2017-06-01

    Background and purpose: In patients with squamous cell carcinoma of the head and neck (HNSCC), extracapsular spread (ECS) of metastases in cervical lymph nodes affects prognosis and therapy. We assessed the accuracy of intravenous contrast-enhanced computed tomography (CT) and the utility of imaging criteria for preoperative detection of ECS in metastatic cervical lymph nodes in patients with HNSCC. Materials and methods: Preoperative intravenous contrast-enhanced neck CT images of 93 patients with histopathological HNSCC metastatic nodes were retrospectively assessed by two neuroradiologists for ECS status and ECS imaging criteria. Radiological assessments were compared with histopathological assessments of neck dissection specimens, and interobserver agreement of ECS status and ECS imaging criteria were measured. Results: Sensitivity, specificity, positive predictive value, and accuracy for overall ECS assessment were 57%, 81%, 82% and 67% for observer 1, and 66%, 76%, 80% and 70% for observer 2, respectively. Correlating three or more ECS imaging criteria with histopathological ECS increased specificity and positive predictive value, but decreased sensitivity and accuracy. Interobserver agreement for overall ECS assessment demonstrated a kappa of 0.59. Central necrosis had the highest kappa of 0.74. Conclusion: CT has moderate specificity for ECS assessment in HNSCC metastatic cervical nodes. Identifying three or more ECS imaging criteria raises specificity and positive predictive value, therefore preoperative identification of multiple criteria may be clinically useful. Interobserver agreement is moderate for overall ECS assessment, substantial for central necrosis. Other ECS CT criteria had moderate agreement at best and therefore should not be used individually as criteria for detecting ECS by CT.

  11. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    NASA Astrophysics Data System (ADS)

    Evans, D.; Fisk, I.; Holzman, B.; Melo, A.; Metson, S.; Pordes, R.; Sheldon, P.; Tiradani, A.

    2011-12-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly "on-demand," as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a university, and we conclude that it is most cost effective to purchase dedicated resources for the "base-line" needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.
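
    For orientation, the 'bursting' step reduces to programmatically requesting worker nodes from EC2 when local queues back up. A minimal sketch with the modern boto3 client is shown below (the 2011 study predates boto3 and used different tooling); the AMI ID and instance type are placeholders, not values from the paper.

    ```python
    import boto3

    def burst_workers(n, ami="ami-0123456789abcdef0", instance_type="c5.large"):
        """Request n extra worker nodes from EC2 during demand spikes.
        The AMI ID and instance type are hypothetical placeholders."""
        ec2 = boto3.client("ec2")
        resp = ec2.run_instances(ImageId=ami, InstanceType=instance_type,
                                 MinCount=1, MaxCount=n)
        return [i["InstanceId"] for i in resp["Instances"]]

    # Example (requires configured AWS credentials):
    # worker_ids = burst_workers(10)
    ```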

  12. Inductive reasoning and forecasting of population dynamics of Cylindrospermopsis raciborskii in three sub-tropical reservoirs by evolutionary computation.

    PubMed

    Recknagel, Friedrich; Orr, Philip T; Cao, Hongqing

    2014-01-01

    Seven-day-ahead forecasting models of Cylindrospermopsis raciborskii in three warm-monomictic and mesotrophic reservoirs in south-east Queensland have been developed by means of water quality data from 1999 to 2010 and the hybrid evolutionary algorithm HEA. The resulting models, using either all measured variables or only electronically measurable variables as inputs, accurately forecasted the timing of C. raciborskii overgrowth and matched the high and low magnitudes of observed bloom events well, with 0.45 ≤ r² ≤ 0.61 and 0.4 ≤ r² ≤ 0.57, respectively. The models also revealed relationships and thresholds triggering bloom events that provide valuable information on the synergism between water quality conditions and the population dynamics of C. raciborskii. The best-performing models using all measured variables as inputs indicated electrical conductivity (EC) within the range of 206-280 mS m⁻¹ as the threshold above which fast growth and high abundances of C. raciborskii were observed for the three lakes. The best models based on electronically measurable variables for Lakes Wivenhoe and Somerset indicated a water temperature (WT) range of 25.5-32.7°C within which fast growth and high abundances of C. raciborskii can be expected. By contrast, the model for Lake Samsonvale highlighted a turbidity (TURB) level of 4.8 NTU as an indicator of mass developments of C. raciborskii. Experiments with online-measured water quality data from Lake Wivenhoe from 2007 to 2010 resulted in predictive models with 0.61 ≤ r² ≤ 0.65, whereby similar levels of EC and WT were again discovered as thresholds for outgrowth of C. raciborskii. The highest validity, r² = 0.75 for an in situ data-based model, was achieved after considering time lags of 7 days for EC and 1 day for dissolved oxygen. These time lags were discovered by a systematic screening of all possible combinations of time lags between 0 and 10 days for all electronically measurable variables. The model so developed performs seven-day-ahead forecasts and is currently implemented and tested for early warning of C. raciborskii blooms in the Wivenhoe reservoir. Copyright © 2013 Elsevier B.V. All rights reserved.
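
    The lag handling described above translates directly into feature construction: each predictor enters the design matrix shifted by its own lag, and the target is shifted forward by the forecast horizon. A minimal pandas sketch follows; the column names and the default lag set are assumptions based on the abstract, not the authors' code.

    ```python
    import pandas as pd

    def lagged_design(df, target="c_raciborskii", horizon=7, lags=None):
        """Build a seven-day-ahead design matrix with per-variable lags
        (EC lagged 7 days, dissolved oxygen 1 day, per the abstract).
        Column names and the lag set are illustrative assumptions."""
        lags = lags or {"EC": 7, "DO": 1, "WT": 0, "TURB": 0}
        X = pd.DataFrame({f"{c}_lag{k}": df[c].shift(k)
                          for c, k in lags.items()})
        y = df[target].shift(-horizon).rename("y")   # abundance 7 days ahead
        xy = pd.concat([X, y], axis=1).dropna()      # drop edge rows
        return xy.drop(columns="y"), xy["y"]

    # Example: df = daily DataFrame with columns EC, DO, WT, TURB, c_raciborskii
    # X, y = lagged_design(df)
    ```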

  13. Application of evolutionary computation in ECAD problems

    NASA Astrophysics Data System (ADS)

    Lee, Dae-Hyun; Hwang, Seung H.

    1998-10-01

    Design of modern electronic systems is a complicated task that demands the use of computer-aided design (CAD) tools. Since many problems in ECAD are combinatorial optimization problems, evolutionary computations such as genetic algorithms and evolutionary programming have been widely employed to solve them. We have applied evolutionary computation techniques to ECAD problems such as technology mapping, microcode-bit optimization, data path ordering and peak power estimation, where their benefits are clearly observed. This paper presents our experiences and discusses issues in those applications.

  14. Towards global patterns in the diversity and community structure of ectomycorrhizal fungi.

    PubMed

    Tedersoo, Leho; Bahram, Mohammad; Toots, Märt; Diédhiou, Abdala G; Henkel, Terry W; Kjøller, Rasmus; Morris, Melissa H; Nara, Kazuhide; Nouhra, Eduardo; Peay, Kabir G; Põlme, Sergei; Ryberg, Martin; Smith, Matthew E; Kõljalg, Urmas

    2012-09-01

    Global species richness patterns of soil micro-organisms remain poorly understood compared to those of macro-organisms. We use a global analysis to disentangle the global determinants of diversity and community composition for ectomycorrhizal (EcM) fungi, microbial symbionts that play key roles in plant nutrition in most temperate and many tropical forest ecosystems. Host plant family has the strongest effect on the phylogenetic community composition of fungi, whereas temperature and precipitation mostly affect EcM fungal richness, which peaks in the temperate and boreal forest biomes, contrasting with the latitudinal patterns of macro-organisms. Tropical ecosystems experience rapid turnover of organic material and have weak soil stratification, suggesting that poor habitat conditions may contribute to the relatively low richness of EcM fungi, and perhaps other soil biota, in most tropical ecosystems. For EcM fungi, greater evolutionary age and a larger total area of EcM host vegetation may also contribute to the higher diversity in temperate ecosystems. Our results provide useful biogeographic and ecological hypotheses for explaining the distribution of fungi that remain to be tested using next-generation sequencing techniques and relevant soil metadata. © 2012 Blackwell Publishing Ltd.

  15. Computational Electromagnetic Modeling of SansEC(Trade Mark) Sensors

    NASA Technical Reports Server (NTRS)

    Smith, Laura J.; Dudley, Kenneth L.; Szatkowski, George N.

    2011-01-01

    This paper describes the preliminary effort to apply computational design tools to aid in the development of an electromagnetic SansEC resonant sensor composite materials damage detection system. The computational methods and models employed on this research problem will evolve in complexity over time and will lead to the development of new computational methods and experimental sensor systems that demonstrate the capability to detect, diagnose, and monitor the damage of composite materials and structures on aerospace vehicles.

  16. From evolutionary computation to the evolution of things.

    PubMed

    Eiben, Agoston E; Smith, Jim

    2015-05-28

    Evolution has provided a source of inspiration for algorithm designers since the birth of computers. The resulting field, evolutionary computation, has been successful in solving engineering tasks ranging in outlook from the molecular to the astronomical. Today, the field is entering a new phase as evolutionary algorithms that take place in hardware are developed, opening up new avenues towards autonomous machines that can adapt to their environment. We discuss how evolutionary computation compares with natural evolution and what its benefits are relative to other computing approaches, and we introduce the emerging area of artificial evolution in physical systems.

  17. BLT-EC (Breach, Leach and Transport-Equilibrium Chemistry) data input guide. A computer model for simulating release and coupled geochemical transport of contaminants from a subsurface disposal facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacKinnon, R.J.; Sullivan, T.M.; Kinsey, R.R.

    1997-05-01

    The BLT-EC computer code has been developed, implemented, and tested. BLT-EC is a two-dimensional finite element computer code capable of simulating the time-dependent release and reactive transport of aqueous-phase species in a subsurface soil system. BLT-EC contains models to simulate the processes (container degradation, waste-form performance, transport, chemical reactions, and radioactive production and decay) most relevant to estimating the release and transport of contaminants from a subsurface disposal system. Water flow is provided through tabular input or auxiliary files. Container degradation considers localized failure due to pitting corrosion and general failure due to uniform surface degradation processes. Waste-form performance considers release to be limited by one of four mechanisms: rinse with partitioning, diffusion, uniform surface degradation, and solubility. Transport considers the processes of advection, dispersion, diffusion, chemical reaction, radioactive production and decay, and sources (waste-form releases). Chemical reactions accounted for include complexation, sorption, dissolution-precipitation, oxidation-reduction, and ion exchange. Radioactive production and decay in the waste form is simulated. To improve the usefulness of BLT-EC, a pre-processor, ECIN, which assists in the creation of chemistry input files, and a post-processor, BLTPLOT, which provides a visual display of the data, have been developed. BLT-EC also includes an extensive database of thermodynamic data that is accessible to ECIN. This document reviews the models implemented in BLT-EC and serves as a guide to creating input files and applying BLT-EC.
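
    As an illustration of the radioactive production and decay bookkeeping such a code must perform, here is a minimal sketch that evaluates a simple parent-daughter chain analytically. This is a generic, hypothetical example (the half-lives are invented), not BLT-EC's actual numerics:

```python
import numpy as np

def bateman_two_member(n1_0, lam1, lam2, t):
    """Analytic inventories of a two-member decay chain (parent -> daughter).

    n1_0       : initial atoms of the parent (daughter starts at zero)
    lam1, lam2 : decay constants (1/s) of parent and daughter
    t          : time in seconds (scalar or array)
    """
    n1 = n1_0 * np.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (np.exp(-lam1 * t) - np.exp(-lam2 * t))
    return n1, n2

# Hypothetical parent with a 30 y half-life feeding a 5 y daughter.
yr = 3.15576e7  # seconds per year
lam1, lam2 = np.log(2) / (30 * yr), np.log(2) / (5 * yr)
t = np.linspace(0, 100 * yr, 5)
n1, n2 = bateman_two_member(1e20, lam1, lam2, t)
print(n2 * lam2)  # daughter activity (decays/s) at the sampled times
```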

  18. Practical advantages of evolutionary computation

    NASA Astrophysics Data System (ADS)

    Fogel, David B.

    1997-10-01

    Evolutionary computation is becoming a common technique for solving difficult, real-world problems in industry, medicine, and defense. This paper reviews some of the practical advantages of using evolutionary algorithms as compared with classic methods of optimization or artificial intelligence. Specific advantages include the flexibility of the procedures, as well as their ability to self-adapt the search for optimum solutions on the fly. As desktop computers increase in speed, the application of evolutionary algorithms will become routine.
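
    As a concrete illustration of on-the-fly self-adaptation, the sketch below implements a (1+1) evolution strategy with the classic 1/5-success rule, in which the mutation step size adapts during the search. This is a generic textbook scheme offered for orientation, not the specific procedures reviewed in the paper:

```python
import random

def sphere(x):
    """Toy objective: minimize the sum of squares."""
    return sum(v * v for v in x)

def one_plus_one_es(dim=5, sigma=1.0, iters=2000, seed=1):
    """(1+1)-ES with the 1/5-success step-size rule."""
    rng = random.Random(seed)
    parent = [rng.uniform(-5, 5) for _ in range(dim)]
    f_parent = sphere(parent)
    successes = 0
    for i in range(1, iters + 1):
        child = [v + sigma * rng.gauss(0, 1) for v in parent]
        f_child = sphere(child)
        if f_child < f_parent:          # keep the better of parent and child
            parent, f_parent = child, f_child
            successes += 1
        if i % 20 == 0:                 # adapt sigma every 20 trials
            rate = successes / 20.0
            sigma *= 1.22 if rate > 0.2 else 0.82  # widen or shrink step size
            successes = 0
    return parent, f_parent, sigma

best, f_best, sigma = one_plus_one_es()
print(f_best, sigma)
```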

  19. Mechanistic and Evolutionary Insights from Comparative Enzymology of Phosphomonoesterases and Phosphodiesterases across the Alkaline Phosphatase Superfamily

    PubMed Central

    2016-01-01

    Naively, one might have expected an early division between the phosphate monoesterases and diesterases of the alkaline phosphatase (AP) superfamily. On the contrary, prior results and our structural and biochemical analyses of the phosphate monoesterase PafA, from Chryseobacterium meningosepticum, indicate similarities to a superfamily phosphate diesterase [Xanthomonas citri nucleotide pyrophosphatase/phosphodiesterase (NPP)] and distinct differences from the three-metal-ion AP superfamily monoesterase from Escherichia coli (EcAP). We carried out a series of experiments to map out and learn from the differences and similarities between these enzymes. First, we asked why there would be independent instances of monoesterases in the AP superfamily. PafA has much weaker product inhibition and slightly higher activity relative to EcAP, suggesting that different metabolic evolutionary pressures favored distinct active-site architectures. Next, we addressed the preferential phosphate monoester and diester catalysis of PafA and NPP, respectively. We asked whether the >80% sequence differences throughout these scaffolds provide functional specialization for each enzyme's cognate reaction. In contrast to expectations from this model, PafA and NPP mutants with the common subset of active-site groups embedded in each native scaffold had the same monoesterase:diesterase specificities; thus, the >10^7-fold difference in native specificities appears to arise from distinct interactions at a single phosphoryl substituent. We also uncovered striking mechanistic similarities between the PafA and EcAP monoesterases, including evidence for ground-state destabilization and functional active-site networks that involve different active-site groups but may play analogous catalytic roles. Discovering common network functions may reveal active-site architectural connections that are critical for function, and identifying regions of functional modularity may facilitate the design of new enzymes from existing promiscuous templates. More generally, comparative enzymology and analysis of catalytic promiscuity can provide mechanistic and evolutionary insights. PMID:27670607

  20. Hydride Transfer in DHFR by Transition Path Sampling, Kinetic Isotope Effects, and Heavy Enzyme Studies

    PubMed Central

    Wang, Zhen; Antoniou, Dimitri; Schwartz, Steven D.; Schramm, Vern L.

    2016-01-01

    Escherichia coli dihydrofolate reductase (ecDHFR) is used to study fundamental principles of enzyme catalysis. It remains controversial whether fast protein motions are coupled to the hydride transfer catalyzed by ecDHFR. Previous studies with heavy ecDHFR proteins labeled with (13)C, (15)N, and nonexchangeable (2)H reported enzyme-mass-dependent hydride transfer kinetics for ecDHFR. Here, we report refined experimental and computational studies establishing that hydride transfer is independent of protein mass. Instead, we found the rate constant for substrate dissociation to be faster for heavy DHFR. Previously reported kinetic differences between light and heavy DHFRs likely arise from kinetic steps other than the chemical step. This study confirms that fast (femtosecond to picosecond) protein motions in ecDHFR are not coupled to hydride transfer and provides an integrative computational and experimental approach to resolve fast dynamics coupled to chemical steps in enzyme catalysis. PMID:26652185

  1. Electronic cigarette, effective or harmful for quitting smoking and respiratory health: A quantitative review papers.

    PubMed

    Heydari, Gholamreza; Ahmady, Arezoo Ebn; Chamyani, Fahimeh; Masjedi, Mohammadreza; Fadaizadeh, Lida

    2017-01-01

    In recent years, electronic cigarettes (ECs) have been heavily advertised as an alternative smoking device as well as a possible cessation method. We aimed to review all published scientific literature pertaining to ECs and to present a simple conclusion about their effects on quitting smoking and respiratory health. This was a cross-sectional study based on a search of PubMed, limited to English publications up to September 2014. The total number of papers with ECs in the title and with positive or negative conclusions regarding EC effects was computed. The number of negative papers was subtracted from the number of positive ones to produce a score. Of the 149 articles, 137 (91.9%) were accessible, of which 68 did not meet the inclusion criteria. Of the 69 remaining articles, 24 studies supported ECs and 45 considered them harmful. Based on this evidence, the score for ECs (positive minus negative) was -21. Evidence to suggest that ECs may be effective and advisable for quitting smoking or a safe alternative to smoking is lacking, and they may instead harm the respiratory system. However, further studies are needed.

  2. Running climate model on a commercial cloud computing environment: A case study using Community Earth System Model (CESM) on Amazon AWS

    NASA Astrophysics Data System (ADS)

    Chen, Xiuhong; Huang, Xianglei; Jiao, Chaoyi; Flanner, Mark G.; Raeker, Todd; Palen, Brock

    2017-01-01

    The suites of numerical models used for simulating the climate of our planet are usually run on dedicated high-performance computing (HPC) resources. This study investigates an alternative to the usual approach, i.e., carrying out climate model simulations in a commercially available cloud computing environment. We test the performance and reliability of running the CESM (Community Earth System Model), a flagship climate model in the United States developed by the National Center for Atmospheric Research (NCAR), on Amazon Web Services (AWS) EC2, the cloud computing environment of Amazon.com, Inc. StarCluster is used to create a virtual computing cluster on AWS EC2 for the CESM simulations. The wall-clock time for one year of CESM simulation on the AWS EC2 virtual cluster is comparable to the time spent for the same simulation on a local dedicated high-performance computing cluster with InfiniBand connections. The CESM simulation can be efficiently scaled with the number of CPU cores on the AWS EC2 virtual cluster environment up to 64 cores. For the standard configuration of the CESM at a spatial resolution of 1.9° latitude by 2.5° longitude, increasing the number of cores from 16 to 64 reduces the wall-clock running time by more than 50% and the scaling is nearly linear. Beyond 64 cores, the communication latency starts to outweigh the benefit of distributed computing and the parallel speedup plateaus.
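
    The scaling behavior described above, near-linear speedup that saturates as communication costs grow, is often summarized with Amdahl's law. The sketch below is a generic illustration (the 2% serial fraction is a hypothetical value, not the authors' measurement) of how a modest serial/communication fraction caps speedup at higher core counts:

```python
def amdahl_speedup(n_cores, serial_fraction):
    """Amdahl's law: speedup on n cores when a fixed fraction stays serial."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

# Hypothetical 2% serial fraction: near-linear at small counts, flattening later.
for n in (16, 32, 64, 128, 256):
    print(n, round(amdahl_speedup(n, 0.02), 1))
```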

  3. Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    NASA Technical Reports Server (NTRS)

    Groves, Curtis; Ilie, Marcel; Schallhorn, Paul

    2014-01-01

    Spacecraft components may be damaged due to airflow produced by Environmental Control Systems (ECS). There are uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field around a spacecraft from the ECS system. This paper describes an approach to estimate the uncertainty in using CFD to predict airflow speeds around an encapsulated spacecraft.

  4. Extension of the TDCR model to compute counting efficiencies for radionuclides with complex decay schemes.

    PubMed

    Kossert, K; Cassette, Ph; Carles, A Grau; Jörg, G; Gostomski, Christoph Lierse V; Nähle, O; Wolf, Ch

    2014-05-01

    The triple-to-double coincidence ratio (TDCR) method is frequently used to measure the activity of radionuclides decaying by pure β emission or electron capture (EC). Some radionuclides with more complex decays have also been studied, but accurate calculations for decay branches that are accompanied by many coincident γ transitions had not yet been investigated. This paper describes recent extensions of the model that make efficiency computations possible for more complex decay schemes. In particular, the MICELLE2 program, which applies a stochastic approach to the free-parameter model, was extended. With the improved code, efficiencies for β(-), β(+) and EC branches with up to seven coincident γ transitions can be calculated. Moreover, a new parametrization for the computation of electron stopping powers has been implemented to compute the ionization quenching function of 10 commercial scintillation cocktails. To demonstrate the capabilities of the TDCR method, the following radionuclides are discussed: (166m)Ho (complex β(-)/γ), (59)Fe (complex β(-)/γ), (64)Cu (β(-), β(+), EC and EC/γ) and (229)Th in equilibrium with its progeny (a decay chain with many α, β and complex β(-)/γ transitions). © 2013 Published by Elsevier Ltd.
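
    For orientation, the sketch below implements the simplest textbook form of the free-parameter model, for a monoenergetic emitter in a symmetric three-photomultiplier counter, ignoring the quenching and spectral integrations that MICELLE2 actually performs. It solves for the free parameter (the mean number of photoelectrons per decay) that reproduces a measured TDCR value and returns the double-coincidence efficiency; the measured TDCR used here is a made-up number:

```python
from math import exp
from scipy.optimize import brentq

def efficiencies(nu):
    """Triple- and double-coincidence efficiencies for a symmetric
    3-PMT counter and a monoenergetic emitter (free-parameter model).

    nu : mean number of photoelectrons created per decay (free parameter)
    """
    p = 1.0 - exp(-nu / 3.0)              # detection probability in one PMT
    eps_t = p ** 3                        # all three PMTs fire
    eps_d = 3.0 * p ** 2 - 2.0 * p ** 3   # at least two PMTs fire
    return eps_t, eps_d

def solve_nu(tdcr_measured):
    """Find nu such that eps_T / eps_D matches the measured TDCR."""
    f = lambda nu: efficiencies(nu)[0] / efficiencies(nu)[1] - tdcr_measured
    return brentq(f, 1e-3, 1e3)           # ratio rises monotonically from 0 to 1

nu = solve_nu(0.977)                      # hypothetical measured TDCR
print(nu, efficiencies(nu)[1])            # free parameter and eps_D
```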

  5. Effect of Ionic Diffusion on Extracellular Potentials in Neural Tissue

    PubMed Central

    Halnes, Geir; Mäki-Marttunen, Tuomo; Keller, Daniel; Pettersen, Klas H.; Andreassen, Ole A.

    2016-01-01

    Recorded potentials in the extracellular space (ECS) of the brain are a standard measure of population activity in neural tissue. Computational models that simulate the relationship between the ECS potential and its underlying neurophysiological processes are commonly used in the interpretation of such measurements. Standard methods, such as volume-conductor theory and current-source density theory, assume that diffusion has a negligible effect on the ECS potential, at least in the range of frequencies picked up by most recording systems. This assumption remains to be verified. We here present a hybrid simulation framework that accounts for diffusive effects on the ECS potential. The framework uses (1) the NEURON simulator to compute the activity and ionic output currents from multicompartmental neuron models, and (2) the electrodiffusive Kirchhoff-Nernst-Planck framework to simulate the resulting dynamics of the potential and ion concentrations in the ECS, accounting for the effects of electrical migration as well as diffusion. Using this framework, we explore the effect that ECS diffusion has on the electrical potential surrounding a small population of 10 pyramidal neurons. The neural model was tuned so that simulations over ∼100 seconds of biological time led to shifts in ECS concentrations of a few millimolar, similar to what has been seen in experiments. By comparing simulations where ECS diffusion was absent with simulations where ECS diffusion was included, we made the following key findings: (i) ECS diffusion shifted the local potential by up to ∼0.2 mV. (ii) The power spectral density (PSD) of the diffusion-evoked potential shifts followed a 1/f^2 power law. (iii) Diffusion effects dominated the PSD of the ECS potential for frequencies up to several hertz. In scenarios with large but physiologically realistic ECS concentration gradients, diffusion was thus found to affect the ECS potential well within the frequency range picked up in experimental recordings. PMID:27820827
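
    The electrodiffusive framework referenced here rests on the standard Nernst-Planck flux density, which adds a diffusive term to the migration term that volume-conductor theory keeps. Writing it out makes clear what the simulations add:

```latex
% Nernst-Planck flux density for ion species k in the ECS:
% diffusion down the concentration gradient plus electrical migration.
% D_k: diffusion constant, c_k: concentration, z_k: valence,
% \phi: potential, F, R, T: the usual physical constants.
\mathbf{j}_k \;=\; -\,D_k \nabla c_k \;-\; \frac{D_k z_k F}{RT}\, c_k \nabla \phi
```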

  6. Effect of Ionic Diffusion on Extracellular Potentials in Neural Tissue.

    PubMed

    Halnes, Geir; Mäki-Marttunen, Tuomo; Keller, Daniel; Pettersen, Klas H; Andreassen, Ole A; Einevoll, Gaute T

    2016-11-01

    Recorded potentials in the extracellular space (ECS) of the brain are a standard measure of population activity in neural tissue. Computational models that simulate the relationship between the ECS potential and its underlying neurophysiological processes are commonly used in the interpretation of such measurements. Standard methods, such as volume-conductor theory and current-source density theory, assume that diffusion has a negligible effect on the ECS potential, at least in the range of frequencies picked up by most recording systems. This assumption remains to be verified. We here present a hybrid simulation framework that accounts for diffusive effects on the ECS potential. The framework uses (1) the NEURON simulator to compute the activity and ionic output currents from multicompartmental neuron models, and (2) the electrodiffusive Kirchhoff-Nernst-Planck framework to simulate the resulting dynamics of the potential and ion concentrations in the ECS, accounting for the effects of electrical migration as well as diffusion. Using this framework, we explore the effect that ECS diffusion has on the electrical potential surrounding a small population of 10 pyramidal neurons. The neural model was tuned so that simulations over ∼100 seconds of biological time led to shifts in ECS concentrations of a few millimolar, similar to what has been seen in experiments. By comparing simulations where ECS diffusion was absent with simulations where ECS diffusion was included, we made the following key findings: (i) ECS diffusion shifted the local potential by up to ∼0.2 mV. (ii) The power spectral density (PSD) of the diffusion-evoked potential shifts followed a 1/f^2 power law. (iii) Diffusion effects dominated the PSD of the ECS potential for frequencies up to several hertz. In scenarios with large but physiologically realistic ECS concentration gradients, diffusion was thus found to affect the ECS potential well within the frequency range picked up in experimental recordings.

  7. Near-infrared spectroscopy (NIRS)-based eyes-closed brain-computer interface (BCI) using prefrontal cortex activation due to mental arithmetic

    PubMed Central

    Shin, Jaeyoung; Müller, Klaus-R; Hwang, Han-Jeong

    2016-01-01

    We propose a near-infrared spectroscopy (NIRS)-based brain-computer interface (BCI) that can be operated in the eyes-closed (EC) state. To evaluate the feasibility of NIRS-based EC BCIs, we compared the performance of an eyes-open (EO) BCI paradigm and an EC BCI paradigm with respect to hemodynamic response and classification accuracy. To this end, subjects performed either mental arithmetic or imagined vocalization of the English alphabet as a baseline task with very low cognitive loading. The performances of two linear classifiers were compared, showing an advantage for shrinkage linear discriminant analysis (LDA). The classification accuracy of the EC paradigm (75.6 ± 7.3%) was lower than that of the EO paradigm (77.0 ± 9.2%), but the difference was statistically insignificant (p = 0.5698). Subjects reported that they found it more comfortable (p = 0.057) and easier (p < 0.05) to perform the EC BCI tasks. The difference in task difficulty may account for the slightly lower classification accuracy of the EC data. From these results, we confirm the feasibility of NIRS-based EC BCIs, which can be a BCI option that may ultimately be of use for patients who cannot keep their eyes open consistently. PMID:27824089
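
    Shrinkage LDA is attractive in BCI settings because feature dimensionality is often large relative to the number of trials, making the sample covariance ill-conditioned; regularizing it toward a scaled identity stabilizes the classifier. A minimal sketch with scikit-learn (an assumed toolchain, the paper does not state its implementation) on synthetic stand-in data:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Synthetic stand-in for NIRS features: 60 trials, 40 features, 2 classes.
X0 = rng.normal(0.0, 1.0, (30, 40))
X1 = rng.normal(0.4, 1.0, (30, 40))
X = np.vstack([X0, X1])
y = np.array([0] * 30 + [1] * 30)

# Ledoit-Wolf shrinkage of the covariance; the 'lsqr' solver supports it.
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
print(cross_val_score(clf, X, y, cv=5).mean())
```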

  8. Near-infrared spectroscopy (NIRS)-based eyes-closed brain-computer interface (BCI) using prefrontal cortex activation due to mental arithmetic.

    PubMed

    Shin, Jaeyoung; Müller, Klaus-R; Hwang, Han-Jeong

    2016-11-08

    We propose a near-infrared spectroscopy (NIRS)-based brain-computer interface (BCI) that can be operated in the eyes-closed (EC) state. To evaluate the feasibility of NIRS-based EC BCIs, we compared the performance of an eyes-open (EO) BCI paradigm and an EC BCI paradigm with respect to hemodynamic response and classification accuracy. To this end, subjects performed either mental arithmetic or imagined vocalization of the English alphabet as a baseline task with very low cognitive loading. The performances of two linear classifiers were compared, showing an advantage for shrinkage linear discriminant analysis (LDA). The classification accuracy of the EC paradigm (75.6 ± 7.3%) was lower than that of the EO paradigm (77.0 ± 9.2%), but the difference was statistically insignificant (p = 0.5698). Subjects reported that they found it more comfortable (p = 0.057) and easier (p < 0.05) to perform the EC BCI tasks. The difference in task difficulty may account for the slightly lower classification accuracy of the EC data. From these results, we confirm the feasibility of NIRS-based EC BCIs, which can be a BCI option that may ultimately be of use for patients who cannot keep their eyes open consistently.

  9. Generalized environmental control and life support system computer program (G189A) configuration control. [computer subroutine libraries for shuttle orbiter analyses

    NASA Technical Reports Server (NTRS)

    Blakely, R. L.

    1973-01-01

    A G189A simulation of the shuttle orbiter EC/LSS was prepared and used to study payload support capabilities. Two master program libraries of the G189A computer program were prepared for the NASA/JSC computer system. Several new component subroutines were added to the G189A program library, and many existing subroutines were revised to improve their capabilities. A number of special analyses were performed in support of a NASA/JSC shuttle orbiter EC/LSS payload support capability study.

  10. Non-homologous isofunctional enzymes: a systematic analysis of alternative solutions in enzyme evolution.

    PubMed

    Omelchenko, Marina V; Galperin, Michael Y; Wolf, Yuri I; Koonin, Eugene V

    2010-04-30

    Evolutionarily unrelated proteins that catalyze the same biochemical reactions are often referred to as analogous - as opposed to homologous - enzymes. The existence of numerous alternative, non-homologous enzyme isoforms presents an interesting evolutionary problem; it also complicates genome-based reconstruction of the metabolic pathways in a variety of organisms. In 1998, a systematic search for analogous enzymes resulted in the identification of 105 Enzyme Commission (EC) numbers that included two or more proteins without detectable sequence similarity to each other, including 34 EC nodes where proteins were known (or predicted) to have distinct structural folds, indicating independent evolutionary origins. In the past 12 years, many putative non-homologous isofunctional enzymes were identified in newly sequenced genomes. In addition, efforts in structural genomics resulted in a vastly improved structural coverage of proteomes, providing for definitive assessment of (non)homologous relationships between proteins. We report the results of a comprehensive search for non-homologous isofunctional enzymes (NISE) that yielded 185 EC nodes with two or more experimentally characterized - or predicted - structurally unrelated proteins. Of these NISE sets, only 74 were from the original 1998 list. Structural assignments of the NISE show over-representation of proteins with the TIM barrel fold and the nucleotide-binding Rossmann fold. From the functional perspective, the set of NISE is enriched in hydrolases, particularly carbohydrate hydrolases, and in enzymes involved in defense against oxidative stress. These results indicate that at least some of the non-homologous isofunctional enzymes were recruited relatively recently from enzyme families that are active against related substrates and are sufficiently flexible to accommodate changes in substrate specificity.

  11. The Probiotic Escherichia coli Strain Nissle 1917 Combats Lambdoid Bacteriophages stx and λ.

    PubMed

    Bury, Susanne; Soundararajan, Manonmani; Bharti, Richa; von Bünau, Rudolf; Förstner, Konrad U; Oelschlaeger, Tobias A

    2018-01-01

    Shiga toxin (Stx)-producing E. coli (STEC), such as enterohemorrhagic E. coli (EHEC), are a major cause of foodborne illness in humans. In vitro studies showed the probiotic Escherichia coli strain Nissle 1917 (EcN) to efficiently inhibit the production of Stx. Life-threatening EHEC strains, such as the serotype O104:H4 responsible for the large outbreak in 2011 in Germany, evolved from E. coli strains that were infected by stx2-encoding lambdoid phages, turning them into lysogenic and subsequently Stx-producing strains. Since antibiotics induce stx genes and Stx production, EHEC-infected persons are not recommended to be treated with antibiotics. Therefore, EcN might be an alternative medication. However, because even commensal E. coli strains might be converted into Stx producers after becoming host to a stx-encoding prophage, we tested EcN for stx-phage genome integration. Our experiments revealed the resistance of EcN toward not only stx-phages but also lambda-phages. This resistance was not based on the lack of phage receptors or on mutated phage receptors. Rather, it involved the expression of a phage repressor (pr) gene of a defective prophage in EcN, which was able to partially protect E. coli K-12 strain MG1655 against stx- and lambda-phage infection. Furthermore, we observed EcN to inactivate phages and thereby protect E. coli K-12 strains against infection by stx- as well as lambda-phages. Inactivation of lambda-phages was due to binding of lambda-phages to LamB of EcN, whereas inactivation of stx-phages was caused by a thermostable protein of EcN. These properties, together with its ability to inhibit Stx production, make EcN a good candidate for the prevention of illness caused by EHEC and possibly for the treatment of already infected patients.

  12. Exact coherent structures and chaotic dynamics in a model of cardiac tissue

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byrne, Greg; Marcotte, Christopher D.; Grigoriev, Roman O., E-mail: roman.grigoriev@physics.gatech.edu

    Unstable nonchaotic solutions embedded in the chaotic attractor can provide significant new insight into the chaotic dynamics of both low- and high-dimensional systems. In particular, in turbulent fluid flows, such unstable solutions are referred to as exact coherent structures (ECS) and play an important role in both initiating and sustaining turbulence. The nature of ECS and their role in organizing spatiotemporally chaotic dynamics, however, is reasonably well understood only for systems on relatively small spatial domains lacking continuous Euclidean symmetries. Construction of ECS on large domains and in the presence of continuous translational and/or rotational symmetries remains a challenge. This is especially true for models of excitable media which display spiral turbulence and for which the standard approach to computing ECS completely breaks down. This paper uses the Karma model of cardiac tissue to illustrate a potential approach that could allow computing a new class of ECS on large domains of arbitrary shape by decomposing them into a patchwork of solutions on smaller domains, or tiles, which retain Euclidean symmetries locally.

  13. Post-Inhibitory Rebound Spikes in Rat Medial Entorhinal Layer II/III Principal Cells: In Vivo, In Vitro, and Computational Modeling Characterization

    PubMed Central

    Ferrante, Michele; Shay, Christopher F.; Tsuno, Yusuke; William Chapman, G.; Hasselmo, Michael E.

    2017-01-01

    Medial entorhinal cortex Layer-II stellate cells (mEC-LII-SCs) primarily interact via inhibitory interneurons. This suggests the presence of alternative mechanisms other than excitatory synaptic inputs for triggering action potentials (APs) in stellate cells during spatial navigation. Our intracellular recordings show that the hyperpolarization-activated cation current (Ih) allows post-inhibitory rebound spikes (PIRS) in mEC-LII-SCs. In vivo, strong inhibitory post-synaptic potentials immediately preceded most APs, shortening their delay and enhancing excitability. In vitro experiments showed that inhibition initiated spikes more effectively than excitation and that more dorsal mEC-LII-SCs produced faster and more synchronous spikes. In contrast, PIRS in Layer-II/III pyramidal cells were harder to evoke, voltage-independent, and slower in dorsal mEC. In computational simulations, mEC-LII-SC morphology and Ih homeostatically regulated the dorso-ventral differences in PIRS timing, and most dendrites generated PIRS within a narrow range of stimulus amplitudes. These results suggest inhibitory inputs could mediate the emergence of grid cell firing in a neuronal network. PMID:26965902

  14. Computational Study of the Bulk Properties of a Novel Molecule: alpha-Tocopherol-Ascorbic Acid Surfactant

    NASA Astrophysics Data System (ADS)

    Stirling, Shannon; Kim, Hye-Young

    Alpha-tocopherol-ascorbic acid surfactant (EC) is a novel amphiphilic molecule with antioxidant properties, in which a hydrophobic vitamin E and a hydrophilic vitamin C are chemically linked. We have developed atomistic force fields (g54a7) for a protonated (neutral) EC molecule. Our goal is to carry out molecular dynamics (MD) simulations of protonated EC molecules using the newly developed force fields and to study the molecular properties. First we ran an energy minimization (EM) with one molecule in a vacuum to obtain the low-energy molecular configuration with emtol = 10. We then used Packmol to insert 125 EC molecules into a 3 nm cube. We then performed MD simulations of the bulk system composed of 125 EC molecules, from which we measured the bulk density and the evaporation energy of the molecular system. GROMACS 2016 was used for the EM and MD simulation studies. We will present the results of this ongoing research. This work was supported by the National Institute of General Medical Sciences of the National Institutes of Health under Award Number P20GM103424 (Kim). Computational resources were provided by the Louisiana Optical Network Initiative.

  15. Understanding the function of bacterial and eukaryotic thiolases II by integrating evolutionary and functional approaches.

    PubMed

    Fox, Ana Romina; Soto, Gabriela; Mozzicafreddo, Matteo; Garcia, Araceli Nora; Cuccioloni, Massimiliano; Angeletti, Mauro; Salerno, Juan Carlos; Ayub, Nicolás Daniel

    2014-01-01

    Acetoacetyl-CoA thiolase (EC 2.3.1.9), commonly named thiolase II, condenses two molecules of acetyl-CoA to give acetoacetyl-CoA and CoA. This enzyme acts in anabolic processes as the first step in the biosynthesis of isoprenoids and polyhydroxybutyrate in eukaryotes and bacteria, respectively. We have recently reported the evolutionary and functional equivalence of these enzymes, suggesting that thiolase II could be the rate-limiting enzyme in these pathways, and presented evidence indicating that this enzyme modulates the availability of reducing equivalents during abiotic stress adaptation in bacteria and plants. However, these results are not sufficient to clarify why thiolase II was evolutionarily selected as a critical enzyme in the production of antioxidant compounds. Regarding this intriguing topic, we propose that thiolase II could sense changes in the acetyl-CoA/CoA ratio induced by the inhibition of the tricarboxylic acid cycle under abiotic stress. Thus, the high level of evolutionary and functional constraint on thiolase II may be due to the connection of this enzyme with an ancient and conserved metabolic route. © 2013.

  16. Advances in computer simulation of genome evolution: toward more realistic evolutionary genomics analysis by approximate bayesian computation.

    PubMed

    Arenas, Miguel

    2015-04-01

    NGS technologies enable fast and cheap generation of genomic data. Nevertheless, ancestral genome inference is not straightforward due to complex evolutionary processes acting on this material, such as inversions, translocations, and other genome rearrangements that, in addition to their implicit complexity, can co-occur and confound ancestral inferences. Recently, models of genome evolution that accommodate such complex genomic events are emerging. This letter explores these novel evolutionary models and proposes their incorporation into robust statistical approaches based on computer simulations, such as approximate Bayesian computation, that may produce a more realistic evolutionary analysis of genomic data. Advantages and pitfalls of these analytical methods are discussed. Potential applications of these ancestral genomic inferences are also pointed out.
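
    At its core, approximate Bayesian computation replaces an intractable likelihood with simulation: draw parameters from the prior, simulate data under the model, and keep draws whose simulated summary statistics fall close to the observed ones. A minimal rejection-ABC sketch on a toy model (not one of the genome-rearrangement models discussed here):

```python
import random

def simulate(rate, n=50, rng=random):
    """Toy generative model: n exponential waiting times with a given rate."""
    return [rng.expovariate(rate) for _ in range(n)]

def summary(data):
    """Summary statistic: the sample mean."""
    return sum(data) / len(data)

def abc_rejection(observed, n_draws=20000, eps=0.05, seed=7):
    """Rejection ABC: keep prior draws whose simulated summary is within eps."""
    rng = random.Random(seed)
    s_obs = summary(observed)
    accepted = []
    for _ in range(n_draws):
        rate = rng.uniform(0.1, 5.0)              # draw from a uniform prior
        s_sim = summary(simulate(rate, rng=rng))  # simulate and summarize
        if abs(s_sim - s_obs) < eps:
            accepted.append(rate)
    return accepted

obs = simulate(2.0)                      # pretend this is the observed data
post = abc_rejection(obs)
print(len(post), sum(post) / len(post))  # crude posterior mean of the rate
```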

  17. 10 CFR Appendix A to Subpart C of... - Sampling Plan for Enforcement Testing of Covered Consumer Products and Certain High-Volume...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... in Step (c). (6) For an energy or water consumption standard (ECS), compute the upper control limit (UCL2) for the mean of the combined first and second samples using the DOE ECS as the desired mean and a...)(1). (7) For an energy or water consumption standard (ECS), compare the combined sample mean (x2) to...

  18. 10 CFR Appendix A to Subpart C of... - Sampling Plan for Enforcement Testing of Covered Consumer Products and Certain High-Volume...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... in Step (c). (6) For an energy or water consumption standard (ECS), compute the upper control limit (UCL2) for the mean of the combined first and second samples using the DOE ECS as the desired mean and a...)(1). (7) For an energy or water consumption standard (ECS), compare the combined sample mean (x2) to...

  19. 10 CFR Appendix A to Subpart C of... - Sampling Plan for Enforcement Testing of Covered Consumer Products and Certain High-Volume...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... in Step (c). (6) For an energy or water consumption standard (ECS), compute the upper control limit (UCL2) for the mean of the combined first and second samples using the DOE ECS as the desired mean and a...)(1). (7) For an energy or water consumption standard (ECS), compare the combined sample mean (x2) to...
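
    The clipped regulation text above computes an upper control limit for a combined sample mean against a DOE energy or water consumption standard. The exact formula is elided in these excerpts; the sketch below assumes the common one-sided t-statistic form UCL = ECS + t·s/√n, and is offered for illustration only, not as the regulation's actual formula:

```python
import math
from scipy import stats

def upper_control_limit(ecs, sample_sd, n, confidence=0.975):
    """Hypothetical UCL about a desired mean `ecs` (the DOE standard).

    Assumes the usual one-sided t-statistic form; the actual formula in
    10 CFR Appendix A is elided in the excerpts above.
    """
    t = stats.t.ppf(confidence, df=n - 1)
    return ecs + t * sample_sd / math.sqrt(n)

# Example: combined sample of 8 units, standard of 500 kWh/yr, s = 12 kWh/yr.
print(round(upper_control_limit(500.0, 12.0, 8), 1))
```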

  20. Evolutionary computation in zoology and ecology.

    PubMed

    Boone, Randall B

    2017-12-01

    Evolutionary computational methods have adopted attributes of natural selection and evolution to solve problems in computer science, engineering, and other fields. The method is growing in use in zoology and ecology. Evolutionary principles may be merged with an agent-based modeling perspective to have individual animals or other agents compete. Four main categories are discussed: genetic algorithms, evolutionary programming, genetic programming, and evolution strategies. In evolutionary computation, a population is represented in a way that allows an objective function relevant to the problem of interest to be assessed. The poorest-performing members are removed from the population, and the remaining members reproduce and may be mutated. The fitness of the members is again assessed, and the cycle continues until a stopping condition is met. Case studies include optimizing egg shape given different clutch sizes, mate selection, migration of wildebeest, birds, and elk, vulture foraging behavior, algal bloom prediction, and species richness given energy constraints. Other case studies simulate the evolution of species and a means to project shifts in species ranges in response to a changing climate that includes competition and phenotypic plasticity. This introduction concludes by citing other uses of evolutionary computation and reviewing the flexibility of the methods. For example, representing species' niche spaces subject to selective pressure allows studies on cladistics, the taxon cycle, neutral versus niche paradigms, fundamental versus realized niches, community structure and order of colonization, invasiveness, and responses to a changing climate.
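
    The generic loop described above (assess an objective function over the population, cull the poorest members, let survivors reproduce with mutation, repeat until a stopping condition) can be stated compactly in code. A minimal, generic sketch, not taken from the paper, maximizing a toy objective over real-valued "traits":

```python
import random

def fitness(ind):
    """Toy objective: prefer traits near 1.0 (maximized)."""
    return -sum((t - 1.0) ** 2 for t in ind)

def evolve(pop_size=40, n_traits=4, cull=0.5, sigma=0.1, gens=100, seed=3):
    rng = random.Random(seed)
    pop = [[rng.uniform(-2, 2) for _ in range(n_traits)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)            # assess the population
        survivors = pop[: int(pop_size * (1 - cull))]  # remove poorest members
        children = []
        while len(survivors) + len(children) < pop_size:
            parent = rng.choice(survivors)             # survivors reproduce...
            children.append([t + rng.gauss(0, sigma) for t in parent])  # ...with mutation
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print([round(t, 2) for t in best], round(fitness(best), 4))
```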

  1. Evolutionary computation in zoology and ecology

    PubMed Central

    2017-01-01

    Evolutionary computational methods have adopted attributes of natural selection and evolution to solve problems in computer science, engineering, and other fields. The method is growing in use in zoology and ecology. Evolutionary principles may be merged with an agent-based modeling perspective to have individual animals or other agents compete. Four main categories are discussed: genetic algorithms, evolutionary programming, genetic programming, and evolution strategies. In evolutionary computation, a population is represented in a way that allows an objective function relevant to the problem of interest to be assessed. The poorest-performing members are removed from the population, and the remaining members reproduce and may be mutated. The fitness of the members is again assessed, and the cycle continues until a stopping condition is met. Case studies include optimizing egg shape given different clutch sizes, mate selection, migration of wildebeest, birds, and elk, vulture foraging behavior, algal bloom prediction, and species richness given energy constraints. Other case studies simulate the evolution of species and a means to project shifts in species ranges in response to a changing climate that includes competition and phenotypic plasticity. This introduction concludes by citing other uses of evolutionary computation and reviewing the flexibility of the methods. For example, representing species’ niche spaces subject to selective pressure allows studies on cladistics, the taxon cycle, neutral versus niche paradigms, fundamental versus realized niches, community structure and order of colonization, invasiveness, and responses to a changing climate. PMID:29492029

  2. It takes a village: supporting inquiry- and equity-oriented computer science pedagogy through a professional learning community

    NASA Astrophysics Data System (ADS)

    Ryoo, Jean; Goode, Joanna; Margolis, Jane

    2015-10-01

    This article describes the importance that high school computer science teachers place on a teachers' professional learning community designed around an inquiry- and equity-oriented approach for broadening participation in computing. Using grounded theory to analyze four years of teacher surveys and interviews from the Exploring Computer Science (ECS) program in the Los Angeles Unified School District, this article describes how participating in professional development activities purposefully aimed at fostering a teachers' professional learning community helps ECS teachers make the transition to an inquiry-based classroom culture and break professional isolation. This professional learning community also provides experiences that challenge prevalent deficit notions and stereotypes about which students can or cannot excel in computer science.

  3. cloudPEST - A python module for cloud-computing deployment of PEST, a program for parameter estimation

    USGS Publications Warehouse

    Fienen, Michael N.; Kunicki, Thomas C.; Kester, Daniel E.

    2011-01-01

    This report documents cloudPEST, a Python module with functions to facilitate deployment of the model-independent parameter estimation code PEST in a cloud-computing environment. cloudPEST makes use of low-level, freely available command-line tools that interface with the Amazon Elastic Compute Cloud (EC2(TradeMark)) and that are unlikely to change dramatically. This report describes the preliminary setup for both the Python and EC2 tools and subsequently describes the functions themselves. The code and guidelines have been tested primarily on the Windows(Registered) operating system but are extensible to Linux(Registered).
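
    cloudPEST itself wraps low-level Amazon command-line tools; for orientation, the same launch-and-dispose pattern looks like the sketch below with the modern boto3 library (an assumption for illustration, not part of cloudPEST, and the AMI ID is a placeholder):

```python
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")

# Launch a small pool of worker instances for parallel PEST runs.
# "ami-00000000" is a placeholder image ID, not a real AMI.
workers = ec2.create_instances(
    ImageId="ami-00000000",
    InstanceType="t3.medium",
    MinCount=1,
    MaxCount=4,
)

for w in workers:
    w.wait_until_running()   # block until each worker is up
    w.reload()               # refresh cached attributes
    print(w.id, w.state["Name"])

# ... distribute PEST worker jobs to the instances here ...

for w in workers:
    w.terminate()            # dispose of the pool when the run is done
```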

  4. From computers to cultivation: reconceptualizing evolutionary psychology.

    PubMed

    Barrett, Louise; Pollet, Thomas V; Stulp, Gert

    2014-01-01

    Does evolutionary theorizing have a role in psychology? This is a more contentious issue than one might imagine, given that, as evolved creatures, the answer must surely be yes. The contested nature of evolutionary psychology lies not in our status as evolved beings, but in the extent to which evolutionary ideas add value to studies of human behavior, and the rigor with which these ideas are tested. This, in turn, is linked to the framework in which particular evolutionary ideas are situated. While the framing of the current research topic places the brain-as-computer metaphor in opposition to evolutionary psychology, the most prominent school of thought in this field (born out of cognitive psychology, and often known as the Santa Barbara school) is entirely wedded to the computational theory of mind as an explanatory framework. Its unique aspect is to argue that the mind consists of a large number of functionally specialized (i.e., domain-specific) computational mechanisms, or modules (the massive modularity hypothesis). Far from offering an alternative to, or an improvement on, the current perspective, we argue that evolutionary psychology is a mainstream computational theory, and that its arguments for domain-specificity often rest on shaky premises. We then go on to suggest that the various forms of e-cognition (i.e., embodied, embedded, enactive) represent a true alternative to standard computational approaches, with an emphasis on "cognitive integration" or the "extended mind hypothesis" in particular. We feel this offers the most promise for human psychology because it incorporates the social and historical processes that are crucial to human "mind-making" within an evolutionarily informed framework. In addition to linking to other research areas in psychology, this approach is more likely to form productive links to other disciplines within the social sciences, not least by encouraging a healthy pluralism in approach.

  5. WQEP - a computer spreadsheet program to evaluate water quality data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liddle, R.G.

    1996-12-31

    A flexible spreadsheet Water Quality Evaluation Program (WQEP) has been developed for mining companies, consultants, and regulators to interpret the results of water quality sampling. To evaluate hydrologic data properly, unit conversions and chemical calculations must be done, quality control checks are needed, and a complete and up-to-date listing of water quality standards is necessary. This process is time consuming and tends not to be done for every sample. This program speeds the process by allowing the input of up to 115 chemical parameters from one sample. WQEP compares concentrations with EPA primary and secondary drinking water MCLs or MCLGs, EPA warmwater and coldwater acute and chronic aquatic life criteria, irrigation criteria, livestock criteria, EPA human health criteria, and several other categories of criteria. The spreadsheet allows the input of state or local water standards of interest. Water quality checks include: anion/cation balance, TDSm/TDSc (where m = measured and c = calculated), ECm/ECc, ECm/ion sums, TDSc/EC ratio, TDSm/EC, EC vs. alkalinity, two hardness values, and EC vs. Σ cations. WQEP computes the dissolved transport index of 23 parameters, computes ratios of 26 species for trend analysis, calculates non-carbonate alkalinity to adjust the bicarbonate concentration, and calculates 35 interpretive formulas (pE, SAR, S.I., un-ionized ammonia, ionized sulfide HS-, pKx values, etc.). Fingerprinting is conducted by automatic generation of Stiff diagrams and ion histograms. Mass loading calculations, mass balance calculations, concentration conversions, ionic strength, and the activity coefficient and chemical activity of 33 parameters are calculated. This program allows a speedy and thorough evaluation of water quality data from metal mines, coal mining, and natural surface water systems and has been tested against hand calculations.
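
    Among the listed checks, the anion/cation balance is the classic first screen: major-ion concentrations are converted to milliequivalents per liter, and the relative imbalance should typically fall within a few percent. A minimal sketch of that one check (the generic textbook formula, not WQEP's actual spreadsheet logic):

```python
# Equivalent weights (mg/meq) of major ions: molar mass / |charge|.
EQ_WT = {"Ca": 20.04, "Mg": 12.15, "Na": 22.99, "K": 39.10,
         "HCO3": 61.02, "SO4": 48.03, "Cl": 35.45}
CATIONS = ("Ca", "Mg", "Na", "K")
ANIONS = ("HCO3", "SO4", "Cl")

def charge_balance_error(mg_per_l):
    """Percent charge-balance error from major-ion concentrations in mg/L."""
    cat = sum(mg_per_l.get(i, 0.0) / EQ_WT[i] for i in CATIONS)  # meq/L
    an = sum(mg_per_l.get(i, 0.0) / EQ_WT[i] for i in ANIONS)    # meq/L
    return 100.0 * (cat - an) / (cat + an)

sample = {"Ca": 80, "Mg": 24, "Na": 46, "K": 4, "HCO3": 240, "SO4": 96, "Cl": 35}
print(round(charge_balance_error(sample), 2), "% (flag if |CBE| > 5%)")
```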

  6. Research on distributed virtual reality system in electronic commerce

    NASA Astrophysics Data System (ADS)

    Xue, Qiang; Wang, Jiening; Sun, Jizhou

    2004-03-01

    In this paper, Distributed Virtual Reality (DVR) technology applied in Electronic Commerce (EC) is discussed. DVR provides a new means for people to recognize, analyze, and resolve large-scale, complex problems, which has driven its rapid development in EC fields. The technologies of CSCW (Computer Supported Cooperative Work) and middleware are introduced into the development of the EC-DVR system to meet the need for a platform that provides the necessary cooperation and communication services without repeatedly developing the basic modules. Finally, the paper gives a platform structure for the EC-DVR system.

  7. Data Sharing and Scientific Impact in Eddy Covariance Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bond-Lamberty, B.

    Do the benefits of data sharing outweigh its perceived costs? This is a critical question, and one with the potential to change culture and behavior. Dai et al. (2018) examine how data sharing is related to scientific impact in the field of eddy covariance (EC), and find that data sharers are disproportionately high-impact researchers, and vice versa; they also note strong regional differences in EC data sharing norms. The current policies and restrictions of EC journals and repositories are highly uneven. Incentivizing data sharing and enhancing computational reproducibility are critical next steps for EC, ecology, and science more broadly.

  8. Relational Programming.

    DTIC Science & Technology

    1983-09-01

    be illustrated by example. If 'z' is the name of an individual and 'C' is the name of a class (set), then 'z ∈ C' means that the individual denoted by 'z' is a member of the class denoted by 'C'. ... will abbreviate this un z. Conversely, if C is a single-element class, then un⁻¹C selects the unique member of that class: un⁻¹C = ιz(z ∈ C).

  9. Exploiting Parallel R in the Cloud with SPRINT

    PubMed Central

    Piotrowski, M.; McGilvary, G.A.; Sloan, T. M.; Mewissen, M.; Lloyd, A.D.; Forster, T.; Mitchell, L.; Ghazal, P.; Hill, J.

    2012-01-01

    Background: Advances in DNA microarray devices and next-generation massively parallel DNA sequencing platforms have led to an exponential growth in data availability, but the arising opportunities require adequate computing resources. High Performance Computing (HPC) in the cloud offers an affordable way of meeting this need. Objectives: Bioconductor, a popular tool for high-throughput genomic data analysis, is distributed as add-on modules for the R statistical programming language, but R has no native capabilities for exploiting multi-processor architectures. SPRINT is an R package that enables easy access to HPC for genomics researchers. This paper investigates: setting up and running SPRINT-enabled genomic analyses on Amazon's Elastic Compute Cloud (EC2), the advantages of submitting applications to EC2 from different parts of the world, and whether resource underutilization can improve application performance. Methods: The SPRINT parallel implementations of correlation, permutation testing, partitioning around medoids, and the multi-purpose papply have been benchmarked on data sets of various sizes on Amazon EC2. Jobs have been submitted from both the UK and Thailand to investigate monetary differences. Results: It is possible to obtain good, scalable performance, but the level of improvement is dependent upon the nature of the algorithm. Resource underutilization can further improve the time to result. End-users' location impacts costs due to factors such as local taxation. Conclusions: Although not designed to satisfy HPC requirements, Amazon EC2 and cloud computing in general provide an interesting alternative and new possibilities for smaller organisations with limited funds. PMID:23223611

  10. Exploiting parallel R in the cloud with SPRINT.

    PubMed

    Piotrowski, M; McGilvary, G A; Sloan, T M; Mewissen, M; Lloyd, A D; Forster, T; Mitchell, L; Ghazal, P; Hill, J

    2013-01-01

    Advances in DNA microarray devices and next-generation massively parallel DNA sequencing platforms have led to an exponential growth in data availability, but the arising opportunities require adequate computing resources. High Performance Computing (HPC) in the cloud offers an affordable way of meeting this need. Bioconductor, a popular tool for high-throughput genomic data analysis, is distributed as add-on modules for the R statistical programming language, but R has no native capabilities for exploiting multi-processor architectures. SPRINT is an R package that enables easy access to HPC for genomics researchers. This paper investigates: setting up and running SPRINT-enabled genomic analyses on Amazon's Elastic Compute Cloud (EC2), the advantages of submitting applications to EC2 from different parts of the world, and whether resource underutilization can improve application performance. The SPRINT parallel implementations of correlation, permutation testing, partitioning around medoids, and the multi-purpose papply have been benchmarked on data sets of various sizes on Amazon EC2. Jobs have been submitted from both the UK and Thailand to investigate monetary differences. It is possible to obtain good, scalable performance, but the level of improvement is dependent upon the nature of the algorithm. Resource underutilization can further improve the time to result. End-users' location impacts costs due to factors such as local taxation. Although not designed to satisfy HPC requirements, Amazon EC2 and cloud computing in general provide an interesting alternative and new possibilities for smaller organisations with limited funds.

  11. ICECAP: an integrated, general-purpose, automation-assisted IC50/EC50 assay platform.

    PubMed

    Li, Ming; Chou, Judy; King, Kristopher W; Jing, Jing; Wei, Dong; Yang, Liyu

    2015-02-01

    IC50 and EC50 values are commonly used to evaluate drug potency. Mass spectrometry (MS)-centric bioanalytical and biomarker labs are now conducting IC50/EC50 assays, which, if done manually, are tedious and error-prone. Existing bioanalytical sample-preparation automation systems cannot meet IC50/EC50 assay throughput demand. A general-purpose, automation-assisted IC50/EC50 assay platform was developed to automate the calculation of spiking solutions and the matrix-solution preparation scheme, the actual preparation of spiking and matrix solutions, and the flexible sample extraction procedures after incubation. In addition, the platform automates data extraction, nonlinear regression curve fitting, computation of IC50/EC50 values, graphing, and reporting. The automation-assisted IC50/EC50 assay platform can process a whole class of assays of varying conditions. In each run, the system can handle up to 32 compounds and up to 10 concentration levels per compound, and it greatly improves IC50/EC50 assay experimental productivity and data-processing efficiency. © 2014 Society for Laboratory Automation and Screening.
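
    The nonlinear regression step typically fits a four-parameter logistic (Hill) curve and reads the IC50/EC50 off the inflection point. A minimal sketch with SciPy on synthetic data (a generic illustration, not ICECAP's actual implementation):

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ec50, hill):
    """Four-parameter logistic (Hill) dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (x / ec50) ** hill)

# Synthetic dose-response data: 10 concentrations, true EC50 = 50 nM.
conc = np.logspace(0, 4, 10)  # 1 nM .. 10 uM
rng = np.random.default_rng(1)
resp = four_pl(conc, 5.0, 100.0, 50.0, 1.2) + rng.normal(0, 2, conc.size)

p0 = [resp.min(), resp.max(), np.median(conc), 1.0]  # crude initial guess
params, _ = curve_fit(four_pl, conc, resp, p0=p0)
print("EC50 estimate:", round(params[2], 1), "nM")
```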

  12. Uncover the Cloud for Geospatial Sciences and Applications to Adopt Cloud Computing

    NASA Astrophysics Data System (ADS)

    Yang, C.; Huang, Q.; Xia, J.; Liu, K.; Li, J.; Xu, C.; Sun, M.; Bambacus, M.; Xu, Y.; Fay, D.

    2012-12-01

    Cloud computing is emerging as the future infrastructure for providing computing resources to support and enable scientific research, engineering development, and application construction, as well as workforce education. On the other hand, there is considerable doubt about the readiness of cloud computing to support a variety of scientific research, development, and education activities. This research, a project funded by NASA SMD, investigates through holistic studies how ready cloud computing is to support the geosciences. Four applications with different computing characteristics, including data, computing, concurrent, and spatiotemporal intensities, are used to test this readiness. Three popular and representative cloud platforms, Amazon EC2, Microsoft Azure, and NASA Nebula, as well as a traditional cluster, are utilized in the study. The results illustrate that the cloud is ready to some degree, but more research needs to be done to fully realize the cloud benefits as advertised by many vendors and defined by NIST. Specifically: 1) most cloud platforms can stand up a new computing instance, in effect a new computer, in a few minutes, and are therefore ready to support most computing needs in an on-demand fashion; 2) load balancing and elasticity, a defining characteristic, are ready in some cloud platforms, such as Amazon EC2, to support bigger jobs needing responses within minutes, while others do not yet support elasticity and load balancing well, and all cloud platforms need further research and development to support real-time applications at the sub-minute level; 3) the user interfaces and functionality of cloud platforms vary considerably; some are professional and well supported and documented, such as Amazon EC2, while others need significant improvement before the general public can adopt cloud computing without professional training or knowledge of computing infrastructure; 4) security is a major concern; given the sharing spirit of cloud computing, it is very hard to ensure higher-level security unless a private cloud is built for a specific organization without public access, and public cloud platforms do not yet support the FISMA medium level and may never support the FISMA high level; 5) HPC jobs are not well supported in cloud computing, with only Amazon EC2 supporting them well. This research is being used by NASA and other agencies in considering cloud computing adoption. We hope its publication will also help the public adopt cloud computing.

  13. From computers to cultivation: reconceptualizing evolutionary psychology

    PubMed Central

    Barrett, Louise; Pollet, Thomas V.; Stulp, Gert

    2014-01-01

    Does evolutionary theorizing have a role in psychology? This is a more contentious issue than one might imagine, given that, as evolved creatures, the answer must surely be yes. The contested nature of evolutionary psychology lies not in our status as evolved beings, but in the extent to which evolutionary ideas add value to studies of human behavior, and the rigor with which these ideas are tested. This, in turn, is linked to the framework in which particular evolutionary ideas are situated. While the framing of the current research topic places the brain-as-computer metaphor in opposition to evolutionary psychology, the most prominent school of thought in this field (born out of cognitive psychology, and often known as the Santa Barbara school) is entirely wedded to the computational theory of mind as an explanatory framework. Its unique aspect is to argue that the mind consists of a large number of functionally specialized (i.e., domain-specific) computational mechanisms, or modules (the massive modularity hypothesis). Far from offering an alternative to, or an improvement on, the current perspective, we argue that evolutionary psychology is a mainstream computational theory, and that its arguments for domain-specificity often rest on shaky premises. We then go on to suggest that the various forms of e-cognition (i.e., embodied, embedded, enactive) represent a true alternative to standard computational approaches, with an emphasis on “cognitive integration” or the “extended mind hypothesis” in particular. We feel this offers the most promise for human psychology because it incorporates the social and historical processes that are crucial to human “mind-making” within an evolutionarily informed framework. In addition to linking to other research areas in psychology, this approach is more likely to form productive links to other disciplines within the social sciences, not least by encouraging a healthy pluralism in approach. PMID:25161633

  14. Landscape evolution and agricultural land salinization in coastal area: A conceptual model.

    PubMed

    Bless, Aplena Elen; Colin, François; Crabit, Armand; Devaux, Nicolas; Philippon, Olivier; Follain, Stéphane

    2018-06-01

    Soil salinization is a major threat to agricultural lands. Among salt-affected lands, coastal areas can be considered highly complex systems, where salinization due to anthropogenic pressure and climate-induced changes can significantly alter system functioning. For such complex systems, conceptual models can be used as evaluation tools in a preliminary step to identify the main evolutionary processes responsible for soil and water salinization. This study proposes a conceptual model for water fluxes in a coastal area affected by salinity, which can help to identify the relationships between agricultural landscape evolution and present-day salinity. First, we conducted field investigations from 2012 to 2016, based mainly on surveys of soil (EC1/5) and water (ECw) electrical conductivity. This allowed us to characterize spatial structures for EC1/5 and ECw and to identify the river as a preponderant factor in land salinization. Subsequently, we proposed and used a conceptual model for water fluxes and conducted a time analysis (1962-2012) of three of its main constitutive elements, namely the climate, river, and land systems. When integrated within the conceptual model framework, it appeared that the evolution of all constitutive elements since 1962 was responsible for the disruption of system equilibrium, favoring overall salt accumulation in the soil root zone. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Energy-Constrained Recharge, Assimilation, and Fractional Crystallization (EC-RAχFC): A Visual Basic computer code for calculating trace element and isotope variations of open-system magmatic systems

    NASA Astrophysics Data System (ADS)

    Bohrson, Wendy A.; Spera, Frank J.

    2007-11-01

    Volcanic and plutonic rocks provide abundant evidence for complex processes that occur in magma storage and transport systems. The fingerprint of these processes, which include fractional crystallization, assimilation, and magma recharge, is captured in the petrologic and geochemical characteristics of suites of cogenetic rocks. Quantitatively evaluating the relative contributions of each process requires integration of mass, species, and energy constraints, applied in a self-consistent way. The energy-constrained model Energy-Constrained Recharge, Assimilation, and Fractional Crystallization (EC-RAχFC) tracks the trace element and isotopic evolution of a magmatic system (melt + solids) undergoing simultaneous fractional crystallization, recharge, and assimilation. Mass, thermal, and compositional (trace element and isotope) output is provided for melt in the magma body, cumulates, enclaves, and anatectic (i.e., country rock) melt. Theory of the EC computational method has been presented by Spera and Bohrson (2001, 2002, 2004), and applications to natural systems have been elucidated by Bohrson and Spera (2001, 2003) and Fowler et al. (2004). The purpose of this contribution is to make the final version of the EC-RAχFC computer code available and to provide instructions for code implementation, a description of input and output parameters, and estimates of typical values for some input parameters. A brief discussion highlights measures by which the user may evaluate the quality of the output and also provides some guidelines for implementing nonlinear productivity functions. The EC-RAχFC computer code is written in Visual Basic, the programming language of Excel. The code therefore launches in Excel and is compatible with both PC and MAC platforms. The code is available on the authors' Web sites (http://magma.geol.ucsb.edu/ and http://www.geology.cwu.edu/ecrafc) as well as in the auxiliary material.
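
    For orientation, in the limiting case of fractional crystallization alone (no recharge or assimilation), the trace-element evolution of the melt reduces to the classical Rayleigh law, which the fuller EC-RAχFC formulation generalizes:

```latex
% Rayleigh fractional crystallization: C_m is the trace-element
% concentration in the melt, C_0 its initial value, F the melt
% fraction remaining, and D the bulk solid/melt partition coefficient.
\frac{C_m}{C_0} = F^{\,D-1}
```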

  16. Bio-inspired algorithms applied to molecular docking simulations.

    PubMed

    Heberlé, G; de Azevedo, W F

    2011-01-01

    Nature as a source of inspiration has been shown to have a great beneficial impact on the development of new computational methodologies. In this scenario, analyses of the interactions between a protein target and a ligand can be simulated by biologically inspired algorithms (BIAs). These algorithms mimic biological systems to create new paradigms for computation, such as neural networks, evolutionary computing, and swarm intelligence. This review provides a description of the main concepts behind BIAs applied to molecular docking simulations. Special attention is devoted to evolutionary algorithms, guided-directed evolutionary algorithms, and Lamarckian genetic algorithms. Recent applications of these methodologies to protein targets identified in the Mycobacterium tuberculosis genome are described.

  17. [The history of development of evolutionary methods in St. Petersburg school of computer simulation in biology].

    PubMed

    Menshutkin, V V; Kazanskiĭ, A B; Levchenko, V F

    2010-01-01

    The history of the rise and development of evolutionary methods in the Saint Petersburg school of biological modelling is traced and analyzed. Some pioneering works in the simulation of ecological and evolutionary processes performed in the St. Petersburg school became exemplars for many followers in Russia and abroad. The individual-based approach became the crucial point in the history of the school, as an adequate instrument for constructing models of biological evolution. This approach is natural for simulating the evolution of life-history parameters and adaptive processes in populations and communities. In some cases the simulated evolutionary process was used to solve an inverse problem, i.e., to estimate uncertain life-history parameters of a population. Evolutionary computation is one more application of this approach across a great many fields. The problems and prospects of ecological and evolutionary modelling in general are discussed.

  18. Open Circuit Resonant (SansEC) Sensor Technology for Lightning Mitigation and Damage Detection and Diagnosis for Composite Aircraft Applications

    NASA Technical Reports Server (NTRS)

    Szatkowski, George N.; Dudley, Kenneth L.; Smith, Laura J.; Wang, Chuantong; Ticatch, Larry A.

    2014-01-01

    Traditional methods to protect composite aircraft from lightning strike damage rely on a conductive layer embedded on or within the surface of the aircraft composite skin. This method is effective at preventing major direct-effect damage and minimizes indirect effects to aircraft systems from lightning strike attachment, but it provides no additional benefit for the added parasitic weight of the conductive layer. When a known lightning strike occurs, the points of attachment and detachment on the aircraft surface are visually inspected and checked for damage by maintenance personnel to ensure continued safe flight operations. A new multi-functional lightning strike protection (LSP) method has been developed to provide aircraft lightning strike protection, damage detection, and diagnosis for composite aircraft surfaces. The method incorporates a SansEC sensor array on the aircraft exterior surfaces, forming a "Smart skin" surface for aircraft lightning zones certified to withstand strikes up to 100 kiloamperes peak current. SansEC sensors are open-circuit devices comprised of conductive trace spiral patterns sans (without) electrical connections. The SansEC sensor is an electromagnetic resonator having specific resonant parameters (frequency, amplitude, bandwidth, and phase) which, when electromagnetically coupled with a composite substrate, will indicate the electrical impedance of the composite through a change in its resonant response. Any measurable shift in the resonant characteristics can be an indication of damage to the composite caused by a lightning strike or by other means. The SansEC sensor method is intended to diagnose damage for both in-situ health monitoring and ground inspections. In this paper, the theoretical mathematical framework is established for the use of open circuit sensors to perform damage detection and diagnosis on carbon fiber composites. Both computational and experimental analyses were conducted to validate this new method and system for aircraft composite damage detection and diagnosis. Experimental test results on seeded-fault damage coupons and computational modeling simulation results are presented. This paper also presents the shielding effectiveness along with the lightning direct-effect test results from several different SansEC LSP and baseline protected and unprotected carbon fiber reinforced polymer (CFRP) test panels struck at 40 and 100 kiloamperes following a universal common-practice test procedure to enable damage comparisons between SansEC LSP configurations and common-practice copper mesh LSP approaches. The SansEC test panels were mounted in an LSP test bed during the lightning tests. Electrical, mechanical, and thermal parameters were measured during lightning attachment and are presented with post-test nondestructive inspection comparisons. The paper provides correlational results between the SansEC sensor's computed electric field distribution and the location of the lightning attachment on the sensor trace, and visual observations showing the SansEC sensor's affinity for dispersing the lightning attachment.
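
    To make the resonant-shift idea concrete, here is a minimal sketch that treats the sensor-substrate pair as an ideal LC resonator whose effective capacitance changes with substrate damage. The inductance and capacitance values are invented for illustration and are not NASA sensor parameters.

```python
import math

# Illustrative only: a SansEC spiral coupled to a composite substrate is
# modeled here as a bare LC resonator whose effective capacitance shifts
# when the substrate's permittivity changes (e.g., after lightning damage).
# All component values below are hypothetical.

def resonant_frequency_hz(inductance_h, capacitance_f):
    """f0 = 1 / (2*pi*sqrt(L*C)) for an ideal LC resonator."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

L_coil = 2.0e-6          # 2 uH spiral inductance (assumed)
C_healthy = 5.0e-12      # 5 pF effective capacitance, undamaged substrate (assumed)
C_damaged = 6.5e-12      # higher effective permittivity after damage (assumed)

f_healthy = resonant_frequency_hz(L_coil, C_healthy)
f_damaged = resonant_frequency_hz(L_coil, C_damaged)
print(f"Resonance shift: {f_healthy / 1e6:.1f} MHz -> {f_damaged / 1e6:.1f} MHz")
```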

  19. Noise Effects on Entangled Coherent State Generated via Atom-Field Interaction and Beam Splitter

    NASA Astrophysics Data System (ADS)

    Najarbashi, G.; Mirzaei, S.

    2016-05-01

    In this paper, we introduce a controllable method for producing two- and three-mode entangled coherent states (ECSs) using atom-field interaction in cavity QED and a beam splitter. The generated states play central roles in linear optics, quantum computation, and teleportation. We focus especially on qubit-, qutrit-, and qudit-like ECSs and investigate their entanglement using the concurrence measure. Moreover, we illustrate the decoherence properties of ECSs in noisy channels using the negativity measure. Finally, the effect of noise on the monogamy inequality is discussed.

  20. A Parallel-Plate Flow Chamber for Mechanical Characterization of Endothelial Cells Exposed to Laminar Shear Stress

    PubMed Central

    Wong, Andrew K.; LLanos, Pierre; Boroda, Nickolas; Rosenberg, Seth R.; Rabbany, Sina Y.

    2017-01-01

    Shear stresses induced by laminar fluid flow are essential to properly recapitulate the physiological microenvironment experienced by endothelial cells (ECs). ECs respond to these stresses via mechanotransduction by modulating their phenotype and biomechanical characteristics, which can be characterized by Atomic Force Microscopy (AFM). Parallel Plate Flow Chambers (PPFCs) apply unidirectional laminar fluid flow to EC monolayers in vitro. Since ECs in sealed PPFCs are inaccessible to AFM probes, cone-and-plate viscometers (CPs) are commonly used to apply shear stress. This paper presents a comparison of the efficacies of both methods. Computational Fluid Dynamics simulation and validation testing using EC responses as a metric have indicated limitations in the use of CPs to apply laminar shear stress. Monolayers subjected to laminar fluid flow in a PPFC respond by increasing cortical stiffness, elongating, and aligning filamentous actin in the direction of fluid flow to a greater extent than in CP devices. Limitations in using CP devices to provide laminar flow across an EC monolayer suggest they are better suited to studying EC responses under disturbed flow conditions. PPFC platforms allow for exposure of ECs to laminar fluid flow conditions, recapitulating cellular biomechanical behaviors, whereas CP platforms allow for mechanical characterization of ECs under secondary flow. PMID:28989541
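
    For reference, the wall shear stress in a PPFC under fully developed laminar flow follows the standard relation tau = 6*mu*Q/(w*h^2). The sketch below evaluates it for plausible but assumed experimental values.

```python
# Standard parallel-plate flow chamber relation: wall shear stress
# tau = 6 * mu * Q / (w * h**2) for fully developed laminar flow.
# Example values are typical of EC experiments but are assumptions here.

def ppfc_wall_shear_stress(mu_pa_s, q_m3_s, width_m, height_m):
    """Wall shear stress (Pa) in a parallel-plate flow chamber."""
    return 6.0 * mu_pa_s * q_m3_s / (width_m * height_m ** 2)

mu = 7.2e-4            # viscosity of culture medium at 37 C, Pa*s (assumed)
q = 5.0e-8             # flow rate, m^3/s (i.e., 3 mL/min)
w, h = 0.01, 2.5e-4    # chamber width 10 mm, gap 250 um (assumed)

tau = ppfc_wall_shear_stress(mu, q, w, h)
print(f"Wall shear stress: {tau:.2f} Pa ({tau * 10:.1f} dyn/cm^2)")
```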

  1. Knowledge Guided Evolutionary Algorithms in Financial Investing

    ERIC Educational Resources Information Center

    Wimmer, Hayden

    2013-01-01

    A large body of literature exists on evolutionary computing, genetic algorithms, decision trees, codified knowledge, and knowledge management systems; however, the intersection of these computing topics has not been widely researched. Moving through the set of all possible solutions--or traversing the search space--at random exhibits no control…

  2. Automated design of spacecraft systems power subsystems

    NASA Technical Reports Server (NTRS)

    Terrile, Richard J.; Kordon, Mark; Mandutianu, Dan; Salcedo, Jose; Wood, Eric; Hashemi, Mona

    2006-01-01

    This paper discusses the application of evolutionary computing to a dynamic space vehicle power subsystem resource and performance simulation in a parallel processing environment. Our objective is to demonstrate the feasibility, application and advantage of using evolutionary computation techniques for the early design search and optimization of space systems.

  3. Integrating Marine Observatories into a System-of-Systems: Messaging in the US Ocean Observatories Initiative

    DTIC Science & Technology

    2010-06-01

    [Indexed snippet only: author affiliations (Woods Hole, MA; Raytheon Intelligence and Information Systems, Aurora, CO; Scripps Institution of Oceanography, La Jolla) and a reference to Amazon Web Services' Elastic Compute Cloud (Amazon EC2), http://aws.amazon.com/ec2/.]

  4. The molecular basis of conformational instability of the ecdysone receptor DNA binding domain studied by in silico and in vitro experiments.

    PubMed

    Szamborska-Gbur, Agnieszka; Rymarczyk, Grzegorz; Orłowski, Marek; Kuzynowski, Tomasz; Jakób, Michał; Dziedzic-Letka, Agnieszka; Górecki, Andrzej; Dobryszycki, Piotr; Ożyhar, Andrzej

    2014-01-01

    The heterodimer of the ecdysone receptor (EcR) and ultraspiracle (Usp), members of the nuclear receptor superfamily, regulates gene expression associated with molting and metamorphosis in insects. The DNA binding domains (DBDs) of Usp and EcR play an important role in their DNA-dependent heterodimerization. Analysis of the crystal structure of the UspDBD/EcRDBD heterocomplex from Drosophila melanogaster on the hsp27 gene response element suggested an appreciable similarity between the two DBDs. However, chemical denaturation experiments showed markedly lower stability for the EcRDBD than for the UspDBD. The aim of our study was to elucidate the molecular basis of this intriguing instability. Toward this end, we mapped the EcRDBD amino acid positions that affect the stability of the EcRDBD. Computational protein design and in vitro analyses of EcRDBD mutants indicate that non-conserved residues within the α-helix 2, which forms the EcRDBD hydrophobic core, represent a specific structural element that contributes to instability. In particular, L58 appears to be a key residue that differentiates the hydrophobic cores of the UspDBD and EcRDBD and is the main reason for the low stability of the EcRDBD. Our results might serve as a benchmark for further studies of the intricate nature of the EcR molecule.

  5. Using modified fruit fly optimisation algorithm to perform the function test and case studies

    NASA Astrophysics Data System (ADS)

    Pan, Wen-Tsao

    2013-06-01

    Evolutionary computation is a family of methods that simulate natural evolutionary processes based on Darwinian theory, and it is a common research approach. The main contribution of this paper is to strengthen the fruit fly optimization algorithm's (FOA's) ability to search for the optimal solution and to avoid becoming trapped in local extrema. Evolutionary computation has grown to include the concepts of animal foraging behaviour and group behaviour. This study discussed three common evolutionary computation methods and compared them with the modified fruit fly optimization algorithm (MFOA). It further investigated the algorithms' ability to compute the extreme values of three mathematical functions, as well as their execution speed and the forecast ability of forecasting models built using the optimised general regression neural network (GRNN) parameters. The findings indicated that there was no obvious difference between particle swarm optimization and the MFOA with regard to the ability to compute extreme values; however, both were better than the artificial fish swarm algorithm and the FOA. In addition, the MFOA performed better than particle swarm optimization with regard to execution speed, and the forecast ability of the forecasting model built using the MFOA's GRNN parameters was better than that of the other three forecasting models.
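
    A minimal sketch of the basic FOA loop may help fix ideas: flies search randomly around a swarm location, a smell concentration S = 1/distance is computed for each, and the swarm moves toward the best smell. The MFOA modifications studied in the paper are not reproduced here.

```python
import random

# Minimal sketch of the basic fruit fly optimization algorithm (FOA);
# the MFOA modifications studied in this paper are not reproduced.
# Minimizes the smell function f over a 2-D swarm-location search.

def foa_minimize(f, iterations=200, swarm_size=30, step=1.0, seed=1):
    random.seed(seed)
    x, y = random.uniform(-10, 10), random.uniform(-10, 10)  # swarm location
    best_s, best_val = None, float("inf")
    for _ in range(iterations):
        for _ in range(swarm_size):
            # Each fly searches randomly around the swarm location.
            xi = x + random.uniform(-step, step)
            yi = y + random.uniform(-step, step)
            dist = (xi ** 2 + yi ** 2) ** 0.5 or 1e-12
            s = 1.0 / dist                  # "smell concentration" candidate
            val = f(s)
            if val < best_val:
                best_s, best_val = s, val
                x, y = xi, yi               # swarm flies toward the best smell
    return best_s, best_val

# Example: minimize (s - 3)^2, whose optimum is s = 3.
print(foa_minimize(lambda s: (s - 3.0) ** 2))
```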

  6. Social Media: Menagerie of Metrics

    DTIC Science & Technology

    2010-01-27

    [Indexed snippet only: in artificial intelligence, an evolutionary algorithm (EA) is a subset of evolutionary computation, a generic population-based metaheuristic optimization algorithm.]

  7. eddy4R 0.2.0: a DevOps model for community-extensible processing and analysis of eddy-covariance data based on R, Git, Docker, and HDF5

    NASA Astrophysics Data System (ADS)

    Metzger, Stefan; Durden, David; Sturtevant, Cove; Luo, Hongyan; Pingintha-Durden, Natchaya; Sachs, Torsten; Serafimovich, Andrei; Hartmann, Jörg; Li, Jiahong; Xu, Ke; Desai, Ankur R.

    2017-08-01

    Large differences in instrumentation, site setup, data format, and operating system stymie the adoption of a universal computational environment for processing and analyzing eddy-covariance (EC) data. This results in limited software applicability and extensibility in addition to often substantial inconsistencies in flux estimates. Addressing these concerns, this paper presents the systematic development of portable, reproducible, and extensible EC software achieved by adopting a development and systems operation (DevOps) approach. This software development model is used for the creation of the eddy4R family of EC code packages in the open-source R language for statistical computing. These packages are community developed, iterated via the Git distributed version control system, and wrapped into a portable and reproducible Docker filesystem that is independent of the underlying host operating system. The HDF5 hierarchical data format then provides a streamlined mechanism for highly compressed and fully self-documented data ingest and output. The usefulness of the DevOps approach was evaluated for three test applications. First, the resultant EC processing software was used to analyze standard flux tower data from the first EC instruments installed at a National Ecological Observatory Network (NEON) field site. Second, through an aircraft test application, we demonstrate the modular extensibility of eddy4R to analyze EC data from other platforms. Third, an intercomparison with commercial-grade software showed excellent agreement (R2 = 1.0 for CO2 flux). In conjunction with this study, a Docker image containing the first two eddy4R packages and an executable example workflow, as well as first NEON EC data products, are released publicly. We conclude by describing the work remaining to arrive at the automated generation of science-grade EC fluxes and benefits to the science community at large. This software development model is applicable beyond EC and more generally builds the capacity to deploy complex algorithms developed by scientists in an efficient and scalable manner. In addition, modularity permits meeting project milestones while retaining extensibility with time.
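
    As a toy illustration of self-documented HDF5 ingest, the snippet below reads a flux dataset and its unit attribute with h5py. The file, group, and dataset names are hypothetical placeholders, not the actual eddy4R/NEON schema.

```python
import h5py

# Illustrative only: reading a self-documented HDF5 flux file. The file,
# group, and dataset names below are hypothetical placeholders, not the
# actual eddy4R/NEON schema.

with h5py.File("site_fluxes.h5", "r") as f5:
    co2_flux = f5["/site01/fluxCo2"][:]           # hypothetical dataset
    units = f5["/site01/fluxCo2"].attrs["units"]  # attributes document the data
    print(f"{co2_flux.size} half-hourly CO2 flux values in {units}")
```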

  8. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment.

    PubMed

    Lee, Wei-Po; Hsiao, Yu-Ting; Hwang, Wei-Che

    2014-01-16

    To improve on the tedious task of reconstructing gene networks by experimentally testing the possible interactions between genes, it has become a trend to adopt automated reverse engineering procedures instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks with an evolutionary algorithm, it is necessary to address two important issues: premature convergence and high computational cost. To tackle the former problem and enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel-model evolutionary algorithms. To overcome the latter and speed up the computation, the mechanism of cloud computing is a promising solution: the most popular is the MapReduce programming model, a fault-tolerant framework for implementing parallel algorithms to infer large gene networks. This work presents a practical framework to infer large gene networks by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results analyzed. They show that our parallel approach can successfully infer networks with the desired behaviors and that the computation time can be greatly reduced. Parallel population-based algorithms can effectively determine network parameters, and they perform better than the widely used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation. By coupling the parallel-model population-based optimization method and the parallel computational framework, high-quality solutions can be obtained within a relatively short time. This integrated approach is a promising way to infer large networks.
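
    The following is a generic serial sketch of a hybrid GA-PSO step in the spirit of the paper: a PSO velocity update pulls candidates toward personal and global bests, while GA-style mutation maintains diversity against premature convergence. The paper's actual contribution, parallelizing such a loop with Hadoop MapReduce, is omitted.

```python
import random

# Generic serial sketch of a hybrid GA-PSO iteration: PSO velocity updates
# for exploitation, GA-style mutation for diversity. The MapReduce
# parallelization described in the paper is not reproduced here.

def hybrid_ga_pso(fitness, dim=8, pop=20, iters=100, seed=0):
    rnd = random.Random(seed)
    xs = [[rnd.uniform(-1, 1) for _ in range(dim)] for _ in range(pop)]
    vs = [[0.0] * dim for _ in range(pop)]
    pbest = [list(x) for x in xs]
    gbest = min(xs, key=fitness)
    for _ in range(iters):
        for i, x in enumerate(xs):
            # PSO velocity update toward personal and global bests.
            for d in range(dim):
                vs[i][d] = (0.7 * vs[i][d]
                            + 1.4 * rnd.random() * (pbest[i][d] - x[d])
                            + 1.4 * rnd.random() * (gbest[d] - x[d]))
                x[d] += vs[i][d]
            # GA-style mutation keeps diversity and fights premature convergence.
            if rnd.random() < 0.1:
                x[rnd.randrange(dim)] += rnd.gauss(0, 0.1)
            if fitness(x) < fitness(pbest[i]):
                pbest[i] = list(x)
        gbest = min(pbest, key=fitness)
    return gbest

# Example: recover parameters minimizing squared error to a target vector.
target = [0.3] * 8
print(hybrid_ga_pso(lambda x: sum((a - b) ** 2 for a, b in zip(x, target)))[:3])
```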

  9. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment

    PubMed Central

    2014-01-01

    Background To improve on the tedious task of reconstructing gene networks by experimentally testing the possible interactions between genes, it has become a trend to adopt automated reverse engineering procedures instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks with an evolutionary algorithm, it is necessary to address two important issues: premature convergence and high computational cost. To tackle the former problem and enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel-model evolutionary algorithms. To overcome the latter and speed up the computation, the mechanism of cloud computing is a promising solution: the most popular is the MapReduce programming model, a fault-tolerant framework for implementing parallel algorithms to infer large gene networks. Results This work presents a practical framework to infer large gene networks by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results analyzed. They show that our parallel approach can successfully infer networks with the desired behaviors and that the computation time can be greatly reduced. Conclusions Parallel population-based algorithms can effectively determine network parameters, and they perform better than the widely used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation. By coupling the parallel-model population-based optimization method and the parallel computational framework, high-quality solutions can be obtained within a relatively short time. This integrated approach is a promising way for inferring large networks. PMID:24428926

  10. The design of a petabyte archive and distribution system for the NASA ECS project

    NASA Technical Reports Server (NTRS)

    Caulk, Parris M.

    1994-01-01

    The NASA EOS Data and Information System (EOSDIS) Core System (ECS) will contain one of the largest data management systems ever built - the ECS Science and Data Processing System (SDPS). SDPS is designed to support long term Global Change Research by acquiring, producing, and storing earth science data, and by providing efficient means for accessing and manipulating that data. The first two releases of SDPS, Release A and Release B, will be operational in 1997 and 1998, respectively. Release B will be deployed at eight Distributed Active Archiving Centers (DAAC's). Individual DAAC's will archive different collections of earth science data, and will vary in archive capacity. The storage and management of these data collections is the responsibility of the SDPS Data Server subsystem. It is anticipated that by the year 2001, the Data Server subsystem at the Goddard DAAC must support a near-line data storage capacity of one petabyte. The development of SDPS is a system integration effort in which COTS products will be used in favor of custom components in every possible way. Some software and hardware capabilities required to meet ECS data volume and storage management requirements beyond 1999 are not yet supported by available COTS products. The ECS project will not undertake major custom development efforts to provide these capabilities. Instead, SDPS and its Data Server subsystem are designed to support initial implementations with current products, and provide an evolutionary framework that facilitates the introduction of advanced COTS products as they become available. This paper provides a high-level description of the Data Server subsystem design from a COTS integration standpoint, and discusses some of the major issues driving the design. The paper focuses on features of the design that will make the system scalable and adaptable to changing technologies.

  11. Evolutionary computing for the design search and optimization of space vehicle power subsystems

    NASA Technical Reports Server (NTRS)

    Kordon, M.; Klimeck, G.; Hanks, D.

    2004-01-01

    Evolutionary computing has proven to be a straightforward and robust approach for optimizing a wide range of difficult analysis and design problems. This paper discusses the application of these techniques to an existing space vehicle power subsystem resource and performance analysis simulation in a parallel processing environment.

  12. Reconstructing evolutionary trees in parallel for massive sequences.

    PubMed

    Zou, Quan; Wan, Shixiang; Zeng, Xiangxiang; Ma, Zhanshan Sam

    2017-12-14

    Building evolutionary trees for massive unaligned DNA sequences is challenging and crucial, and reconstructing evolutionary trees for ultra-large sequence sets is hard. Massive multiple sequence alignment is likewise challenging and time/space consuming. Hadoop and Spark, developed recently, shed new light on these classical computational biology problems. In this paper, we address multiple sequence alignment and evolutionary reconstruction in parallel. HPTree, developed in this paper, can process big DNA sequence files quickly. It works well on files larger than 1 GB and achieves better performance than other evolutionary reconstruction tools. Users can apply HPTree to reconstruct evolutionary trees on computer clusters or cloud platforms (e.g., Amazon Cloud). HPTree can support population evolution research and metagenomics analysis. In this paper, we employ the Hadoop and Spark platforms and design an evolutionary tree reconstruction software tool for unaligned massive DNA sequences. Clustering and multiple sequence alignment are done in parallel, and the neighbour-joining model is employed for building the evolutionary tree. Our software and source code are available at http://lab.malab.cn/soft/HPtree/ .
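
    For context, the neighbour-joining criterion at the heart of such tree building joins the pair (i, j) minimizing Q(i, j) = (n - 2)*d(i, j) - R_i - R_j, where R_i is the distance row sum for taxon i. Below is a minimal single-step sketch; HPTree's parallel clustering and alignment stages are not reproduced.

```python
# Minimal sketch of one neighbour-joining step: pick the pair of taxa
# minimizing Q(i, j) = (n - 2) * d(i, j) - R_i - R_j. HPTree's parallel
# clustering and alignment stages are not reproduced here.

def nj_pick_pair(d):
    """Return the pair of taxa to join first under the NJ criterion.

    d -- symmetric distance matrix as a list of lists, zero diagonal.
    """
    n = len(d)
    row_sums = [sum(row) for row in d]
    best, best_q = None, float("inf")
    for i in range(n):
        for j in range(i + 1, n):
            q = (n - 2) * d[i][j] - row_sums[i] - row_sums[j]
            if q < best_q:
                best, best_q = (i, j), q
    return best

# Classic 4-taxon example: pairs (0, 1) and (2, 3) tie; the scan returns (0, 1).
dist = [[0, 3, 14, 12],
        [3, 0, 13, 11],
        [14, 13, 0, 4],
        [12, 11, 4, 0]]
print(nj_pick_pair(dist))
```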

  13. Specificity and Evolutionary Conservation of the Escherichia coli RNA Pyrophosphohydrolase RppH*

    PubMed Central

    Foley, Patricia L.; Hsieh, Ping-kun; Luciano, Daniel J.; Belasco, Joel G.

    2015-01-01

    Bacterial RNA degradation often begins with conversion of the 5′-terminal triphosphate to a monophosphate by the RNA pyrophosphohydrolase RppH, an event that triggers rapid ribonucleolytic attack. Besides its role as the master regulator of 5′-end-dependent mRNA decay, RppH is important for the ability of pathogenic bacteria to invade host cells, yet little is known about how it chooses its targets. Here, we show that Escherichia coli RppH (EcRppH) requires at least two unpaired nucleotides at the RNA 5′ end and prefers three or more such nucleotides. It can tolerate any nucleotide at the first three positions but has a modest preference for A at the 5′ terminus and either a G or A at the second position. Mutational analysis has identified EcRppH residues crucial for substrate recognition or catalysis. The promiscuity of EcRppH differentiates it from its Bacillus subtilis counterpart, which has a strict RNA sequence requirement. EcRppH orthologs likely to share its relaxed sequence specificity are widespread in all classes of Proteobacteria, except Deltaproteobacteria, and in flowering plants. By contrast, the phylogenetic range of recognizable B. subtilis RppH orthologs appears to be restricted to the order Bacillales. These findings help to explain the selective influence of RppH on bacterial mRNA decay and show that RppH-dependent degradation has diversified significantly during the course of evolution. PMID:25657006

  14. Specificity and evolutionary conservation of the Escherichia coli RNA pyrophosphohydrolase RppH.

    PubMed

    Foley, Patricia L; Hsieh, Ping-kun; Luciano, Daniel J; Belasco, Joel G

    2015-04-10

    Bacterial RNA degradation often begins with conversion of the 5'-terminal triphosphate to a monophosphate by the RNA pyrophosphohydrolase RppH, an event that triggers rapid ribonucleolytic attack. Besides its role as the master regulator of 5'-end-dependent mRNA decay, RppH is important for the ability of pathogenic bacteria to invade host cells, yet little is known about how it chooses its targets. Here, we show that Escherichia coli RppH (EcRppH) requires at least two unpaired nucleotides at the RNA 5' end and prefers three or more such nucleotides. It can tolerate any nucleotide at the first three positions but has a modest preference for A at the 5' terminus and either a G or A at the second position. Mutational analysis has identified EcRppH residues crucial for substrate recognition or catalysis. The promiscuity of EcRppH differentiates it from its Bacillus subtilis counterpart, which has a strict RNA sequence requirement. EcRppH orthologs likely to share its relaxed sequence specificity are widespread in all classes of Proteobacteria, except Deltaproteobacteria, and in flowering plants. By contrast, the phylogenetic range of recognizable B. subtilis RppH orthologs appears to be restricted to the order Bacillales. These findings help to explain the selective influence of RppH on bacterial mRNA decay and show that RppH-dependent degradation has diversified significantly during the course of evolution. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.

  15. MEGA X: Molecular Evolutionary Genetics Analysis across Computing Platforms.

    PubMed

    Kumar, Sudhir; Stecher, Glen; Li, Michael; Knyaz, Christina; Tamura, Koichiro

    2018-06-01

    The Molecular Evolutionary Genetics Analysis (Mega) software implements many analytical methods and tools for phylogenomics and phylomedicine. Here, we report a transformation of Mega to enable cross-platform use on Microsoft Windows and Linux operating systems. Mega X does not require virtualization or emulation software and provides a uniform user experience across platforms. Mega X has additionally been upgraded to use multiple computing cores for many molecular evolutionary analyses. Mega X is available in two interfaces (graphical and command line) and can be downloaded from www.megasoftware.net free of charge.

  16. Eco-Evo PVAs: Incorporating Eco-Evolutionary Processes into Population Viability Models

    EPA Science Inventory

    We synthesize how advances in computational methods and population genomics can be combined within an Ecological-Evolutionary (Eco-Evo) PVA model. Eco-Evo PVA models are powerful new tools for understanding the influence of evolutionary processes on plant and animal population pe...

  17. Probing the Interaction between Cyclic ADTC1 (Ac-CADTPPVC-NH2) Peptide with EC1-EC2 domain of E-cadherin using Molecular Docking Approach

    NASA Astrophysics Data System (ADS)

    Siahaan, P.; Wuning, S.; Manna, A.; Prasasty, V. D.; Hudiyanti, D.

    2018-04-01

    A deep understanding of the intermolecular interactions between molecules on the paracellular pathway gives insight into its microscopic and macroscopic properties. In the paracellular pathway, the synthetic cyclic ADTC1 (Ac-CADTPPVC-NH2) peptide has been studied computationally, using the molecular docking method, for its ability to modulate the EC1-EC2 domain. The aim of this research is to probe the effect of the alanine (A) amino acid of ADTC1 on its interaction properties. The study was carried out in two steps: (1) optimization using the GROMACS v4.6.5 program and (2) determination of the interaction properties using the AutoDock 4.2 program. Docking was performed for boxes A-J, and the best binding-site position and binding energy of the OC and CC ADTC1 peptides against the EC1-EC2 domain of E-cadherin were selected. The results showed that the CC conformation of ADTC1 in box F has the best interaction, with a binding energy of -26.36 kJ/mol, lower than that of ADTC5, which lacks the alanine amino acid. ADTC1 interacted with EC1 of EC1-EC2 at the Asp1, Trp2, Val3, Ile4, Ile24, Lys25, Ser26, Asn27, and Met92 residues.

  18. Learning, epigenetics, and computation: An extension on Fitch's proposal. Comment on “Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition” by W. Tecumseh Fitch

    NASA Astrophysics Data System (ADS)

    Okanoya, Kazuo

    2014-09-01

    The comparative computational approach of Fitch [1] attempts to renew the classical David Marr paradigm of computation, algorithm, and implementation by introducing an evolutionary view of the relationship between neural architecture and cognition. This comparative evolutionary view provides constraints useful for narrowing down the problem space for both cognition and neural mechanisms. I provide two examples from our own studies that reinforce and extend Fitch's proposal.

  19. Resources and costs for microbial sequence analysis evaluated using virtual machines and cloud computing.

    PubMed

    Angiuoli, Samuel V; White, James R; Matalka, Malcolm; White, Owen; Fricke, W Florian

    2011-01-01

    The widespread popularity of genomic applications is threatened by the "bioinformatics bottleneck" resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to midsize facilities equipped with 454 and Illumina platforms, except for WGS metagenomics where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencer) invested in 16S rRNA amplicon sequencing, microbial single-genome and metagenomics WGS projects can achieve cost-efficient bioinformatics support using CloVR in combination with Amazon EC2 as an alternative to local computing centers.

  20. Resources and Costs for Microbial Sequence Analysis Evaluated Using Virtual Machines and Cloud Computing

    PubMed Central

    Angiuoli, Samuel V.; White, James R.; Matalka, Malcolm; White, Owen; Fricke, W. Florian

    2011-01-01

    Background The widespread popularity of genomic applications is threatened by the "bioinformatics bottleneck" resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. Results We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to midsize facilities equipped with 454 and Illumina platforms, except for WGS metagenomics where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. Conclusions Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencer) invested in 16S rRNA amplicon sequencing, microbial single-genome and metagenomics WGS projects can achieve cost-efficient bioinformatics support using CloVR in combination with Amazon EC2 as an alternative to local computing centers. PMID:22028928

  1. Learning Evolution and the Nature of Science Using Evolutionary Computing and Artificial Life

    ERIC Educational Resources Information Center

    Pennock, Robert T.

    2007-01-01

    Because evolution in natural systems happens so slowly, it is difficult to design inquiry-based labs where students can experiment and observe evolution in the way they can when studying other phenomena. New research in evolutionary computation and artificial life provides a solution to this problem. This paper describes a new A-Life software…

  2. A Novel Handwritten Letter Recognizer Using Enhanced Evolutionary Neural Network

    NASA Astrophysics Data System (ADS)

    Mahmoudi, Fariborz; Mirzashaeri, Mohsen; Shahamatnia, Ehsan; Faridnia, Saed

    This paper introduces a novel design for handwritten letter recognition that employs a hybrid back-propagation neural network with an enhanced evolutionary algorithm. The network input uses a new encoding that is invariant to translation, rotation, and scaling of the input letters. The evolutionary algorithm performs the global search of the search space, and the back-propagation algorithm performs the local search. The results were computed by applying this approach to recognizing the 26 English capital letters in the handwriting of different people. The computational results show that the neural network achieves very satisfying results with relatively scarce input data, and a promising improvement in the convergence of the hybrid evolutionary back-propagation algorithm is exhibited.
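
    A compact sketch of the hybrid strategy described above, with a single logistic neuron on toy data standing in for the letter-recognition network: an evolutionary population search finds good weights globally, then gradient descent (the back-propagation step for one neuron) refines the best individual. All data and hyperparameters are invented for illustration.

```python
import numpy as np

# Sketch of the hybrid strategy: evolutionary global search over weights,
# then gradient refinement of the best individual. A single logistic neuron
# on toy data stands in for the letter-recognition network.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)   # toy labels

def loss(w):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return float(np.mean((p - y) ** 2))

# Global search: mutate a population of weight vectors, keep the fittest.
pop = rng.normal(size=(30, 4))
for _ in range(50):
    children = pop + rng.normal(scale=0.3, size=pop.shape)
    both = np.vstack([pop, children])
    pop = both[np.argsort([loss(w) for w in both])[:30]]

# Local search: gradient descent (the "back-propagation" step for one neuron).
w = pop[0].copy()
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    grad = X.T @ ((p - y) * p * (1 - p)) * (2 / len(y))
    w -= 0.5 * grad
print(f"loss after EA: {loss(pop[0]):.4f}, after refinement: {loss(w):.4f}")
```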

  3. Using concepts from biology to improve problem-solving methods

    NASA Astrophysics Data System (ADS)

    Goodman, Erik D.; Rothwell, Edward J.; Averill, Ronald C.

    2011-06-01

    Observing nature has been a cornerstone of engineering design. Today, engineers look not only at finished products, but imitate the evolutionary process by which highly optimized artifacts have appeared in nature. Evolutionary computation began by capturing only the simplest ideas of evolution, but today, researchers study natural evolution and incorporate an increasing number of concepts in order to evolve solutions to complex engineering problems. At the new BEACON Center for the Study of Evolution in Action, studies in the lab and field and in silico are laying the groundwork for new tools for evolutionary engineering design. This paper, which accompanies a keynote address, describes various steps in development and application of evolutionary computation, particularly as regards sensor design, and sets the stage for future advances.

  4. Query engine optimization for the EHR4CR protocol feasibility scenario.

    PubMed

    Soto-Rey, Iñaki; Bache, Richard; Dugas, Martin; Fritz, Fleur

    2013-01-01

    An essential step when recruiting patients for a Clinical Trial (CT) is to determine the number of patients that satisfy the Eligibility Criteria (ECs) for that trial. An innovative feature of the Electronic Health Records for Clinical Research (EHR4CR) platform is that, when automatically determining patient counts, it also allows the user to view counts for subsets of the ECs. This is helpful because some combinations of ECs may be so restrictive that they yield very few or zero patients. If we wanted to show all possible combinations of ECs, the number of queries we would have to execute would be 2^n, where n is the total number of ECs. Assuming that an average study has between 20 and 30 ECs, the program would have to execute between 2^20 (1,048,576) and 2^30 (1,073,741,824) queries. This is not only computationally expensive but also impractical to visualise. The purpose of our research is to reduce the possible combinations to a manageable number.
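
    The blow-up, and one simple way to tame it, can be seen in a few lines. The leave-one-out strategy below (n queries instead of 2^n) is an illustrative reduction, not the EHR4CR platform's actual optimization.

```python
from itertools import combinations

# Illustration of the combinatorial blow-up described above, plus one simple
# reduction strategy (leave-one-out subsets). This is an illustrative example,
# not the EHR4CR platform's actual query optimization.

n_criteria = 25
print(f"All subsets of {n_criteria} ECs: {2 ** n_criteria:,} queries")

criteria = [f"EC{i}" for i in range(1, n_criteria + 1)]
# Leave-one-out: reveals which single criterion is most restrictive
# with only n queries instead of 2^n.
leave_one_out = [tuple(c for c in criteria if c != dropped) for dropped in criteria]
print(f"Leave-one-out subsets: {len(leave_one_out)} queries")

# Pairwise subsets grow only quadratically, still tractable:
print(f"Pairs of ECs: {sum(1 for _ in combinations(criteria, 2))} queries")
```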

  5. Evolutionary Computation for the Identification of Emergent Behavior in Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Terrile, Richard J.; Guillaume, Alexandre

    2009-01-01

    Over the past several years the Center for Evolutionary Computation and Automated Design at the Jet Propulsion Laboratory has developed a technique based on Evolutionary Computational Methods (ECM) that allows for the automated optimization of complex computationally modeled systems. An important application of this technique is for the identification of emergent behaviors in autonomous systems. Mobility platforms such as rovers or airborne vehicles are now being designed with autonomous mission controllers that can find trajectories over a solution space that is larger than can reasonably be tested. It is critical to identify control behaviors that are not predicted and can have surprising results (both good and bad). These emergent behaviors need to be identified, characterized and either incorporated into or isolated from the acceptable range of control characteristics. We use cluster analysis of automatically retrieved solutions to identify isolated populations of solutions with divergent behaviors.
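
    One way to realize the cluster-analysis step is sketched below: behavior descriptors of evolved solutions are clustered, and small isolated clusters are flagged for inspection as candidate emergent behaviors. The two-feature behavior space, cluster count, and size threshold are invented for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

# Sketch of the idea described above: cluster behavior descriptors of evolved
# solutions and flag small, isolated clusters as candidate emergent behaviors.
# The behavior space, cluster count, and threshold are invented for illustration.

rng = np.random.default_rng(42)
nominal = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(95, 2))
outliers = rng.normal(loc=[3.0, -2.5], scale=0.2, size=(5, 2))  # divergent behavior
behaviors = np.vstack([nominal, outliers])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(behaviors)
for k in range(3):
    size = int(np.sum(labels == k))
    tag = "  <- inspect: possible emergent behavior" if size < 10 else ""
    print(f"cluster {k}: {size} solutions{tag}")
```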

  6. Exploring Evolutionary Patterns in Genetic Sequence: A Computer Exercise

    ERIC Educational Resources Information Center

    Shumate, Alice M.; Windsor, Aaron J.

    2010-01-01

    The increase in publications presenting molecular evolutionary analyses and the availability of comparative sequence data through resources such as NCBI's GenBank underscore the necessity of providing undergraduates with hands-on sequence analysis skills in an evolutionary context. This need is particularly acute given that students have been…

  7. Photoinduced Single- and Multiple-Electron Dynamics Processes Enhanced by Quantum Confinement in Lead Halide Perovskite Quantum Dots

    DOE PAGES

    Vogel, Dayton J.; Kryjevski, Andrei; Inerbaev, Talgat; ...

    2017-03-21

    Methylammonium lead iodide perovskite (MAPbI3) is a promising material for photovoltaic devices. A modification of MAPbI3 into confined nanostructures is expected to further increase the efficiency of solar energy conversion. Photoexcited dynamic processes in a MAPbI3 quantum dot (QD) have been modeled by many-body perturbation theory and nonadiabatic dynamics. A photoexcitation is followed by either exciton cooling (EC), its radiative (RR) or nonradiative recombination (NRR), or multiexciton generation (MEG) processes. Computed times of these processes fall in the order MEG < EC < RR < NRR, where MEG is on the order of a few femtoseconds, EC is in the picosecond range, while RR and NRR are on the order of nanoseconds. Computed time scales indicate which electronic transition pathways can contribute to an increase in charge collection efficiency. Simulated mechanisms of relaxation and their rates show that quantum confinement promotes MEG in MAPbI3 QDs.

  8. Inference of Evolutionary Jumps in Large Phylogenies using Lévy Processes

    PubMed Central

    Duchen, Pablo; Leuenberger, Christoph; Szilágyi, Sándor M.; Harmon, Luke; Eastman, Jonathan; Schweizer, Manuel

    2017-01-01

    Abstract Although it is now widely accepted that the rate of phenotypic evolution may not necessarily be constant across large phylogenies, the frequency and phylogenetic position of periods of rapid evolution remain unclear. In his highly influential view of evolution, G. G. Simpson supposed that such evolutionary jumps occur when organisms transition into so-called new adaptive zones, for instance after dispersal into a new geographic area, after rapid climatic changes, or following the appearance of an evolutionary novelty. Only recently have large, accurate, and well-calibrated phylogenies become available that allow this hypothesis to be tested directly, yet inferring evolutionary jumps remains computationally very challenging. Here, we develop a computationally highly efficient algorithm to accurately infer the rate and strength of evolutionary jumps as well as their phylogenetic location. Following previous work, we model evolutionary jumps as a compound process, but introduce a novel approach to sample jump configurations that does not require matrix inversions and thus naturally scales to large trees. We then make use of this development to infer evolutionary jumps in Anolis lizards and Loriinii parrots, where we find strong signal for such jumps at the base of clades that transitioned into new adaptive zones, just as postulated by Simpson's hypothesis. [evolutionary jump; Lévy process; phenotypic evolution; punctuated equilibrium; quantitative traits.] PMID:28204787
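
    A minimal simulation of such a compound process, Brownian motion plus Poisson-distributed jumps along a branch, is sketched below with illustrative parameter values; the authors' inference algorithm itself is not reproduced.

```python
import numpy as np

# Sketch of a compound trait-evolution process of the kind used to model
# evolutionary jumps: Brownian motion plus a Poisson number of normally
# distributed jumps along a branch. Parameter values are illustrative only.

def simulate_branch(t, sigma_bm=0.5, jump_rate=0.1, jump_sd=2.0, rng=None):
    """Trait change along a branch of length t under BM + compound Poisson jumps."""
    rng = rng or np.random.default_rng()
    bm = rng.normal(0.0, sigma_bm * np.sqrt(t))   # gradual Brownian component
    n_jumps = rng.poisson(jump_rate * t)          # Simpsonian jumps on the branch
    jumps = rng.normal(0.0, jump_sd, size=n_jumps).sum()
    return bm + jumps

rng = np.random.default_rng(7)
changes = [simulate_branch(10.0, rng=rng) for _ in range(5)]
print([round(c, 2) for c in changes])
```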

  9. Children’s Sleep and Academic Achievement: The Moderating Role of Effortful Control

    PubMed Central

    Diaz, Anjolii; Berger, Rebecca; Valiente, Carlos; Eisenberg, Nancy; VanSchyndel, Sarah; Tao, Chun; Spinrad, Tracy L.; Doane, Leah D.; Thompson, Marilyn S.; Silva, Kassondra M.; Southworth, Jody

    2016-01-01

    Poor sleep is thought to interfere with children's learning and academic achievement (AA). However, existing research and theory indicate there are factors that may mitigate the academic risk associated with poor sleep. The purpose of this study was to examine the moderating role of children's effortful control (EC) on the relation between sleep and AA in young children. One hundred and three 4.5- to 7-year-olds (M = 5.98 years, SD = 0.61) wore a wrist-based actigraph for five continuous weekday nights. Teachers and coders reported on children's EC. EC was also assessed with a computer-based task at school. Additionally, we obtained a standardized measure of children's AA. There was a positive main effect of sleep efficiency on AA. Several relations between sleep and AA were moderated by EC, and examination of the simple slopes indicated that the negative relation between sleep and AA was only significant at low levels of EC. PMID:28255190

  10. Open Reading Frame Phylogenetic Analysis on the Cloud

    PubMed Central

    2013-01-01

    Phylogenetic analysis has become essential in researching the evolutionary relationships between viruses. These relationships are depicted on phylogenetic trees, in which viruses are grouped based on sequence similarity. Viral evolutionary relationships are identified from open reading frames rather than from complete sequences. Recently, cloud computing has become popular for developing internet-based bioinformatics tools. Biocloud is an efficient, scalable, and robust bioinformatics computing service. In this paper, we propose a cloud-based open reading frame phylogenetic analysis service. The proposed service integrates the Hadoop framework, virtualization technology, and phylogenetic analysis methods to provide a high-availability, large-scale bioservice. In a case study, we analyze the phylogenetic relationships among Norovirus. Evolutionary relationships are elucidated by aligning different open reading frame sequences. The proposed platform correctly identifies the evolutionary relationships between members of Norovirus. PMID:23671843

  11. A System for Managing Replenishment of a Nutrient Solution Using an Electrical Conductivity Controller

    NASA Technical Reports Server (NTRS)

    Davis, D.; Dogan, N.; Aglan, H.; Mortley, D.; Loretan, P.

    1998-01-01

    Control of nutrient solution parameters is very important for the growth and development of plants grown hydroponically. Protocols involving different nutrient solution replenishment times (e.g., one-week, two-week, or two-day replenishment) provide manual periodic control of the nutrient solution's electrical conductivity (EC). Since plants take up nutrients as they grow, manual control has a drawback in that EC is not held constant between replenishments. In an effort to correct this problem, the Center for Food and Environmental Systems for Human Exploration of Space at Tuskegee University has developed a system for managing and controlling EC over a plant's entire growing cycle. A prototype system is being tested on sweetpotato production using the nutrient film technique (NFT), and it is being compared to a system in which sweetpotatoes are grown using NFT with manual control. NASA has played an important role in the development of environmental control systems and has become a forerunner in growing plants hydroponically under control systems through networked data acquisition and control in environmental growth chambers. Data acquisition systems that provide real-time operation, calibration, set points, a user panel, and graphical programming offer a good method of controlling nutrient solution parameters such as EC and pH [Bledsoe, 1993]. In NASA's Biomass Production Chamber (BPC) at Kennedy Space Center, control is provided by a programmable logic controller (PLC), an industrial controller that combines ladder logic with the ability to handle various levels of electrical power. The controller regulates temperature, light, and other parameters that affect the plant's environment. In the BPC, the Nutrient Delivery System (NDS), a sub-system of the PLC, controls nutrient solution parameters such as EC, pH, and solution levels. When the nutrient EC measurement goes outside a preset range (120-130 mS/m), a set amount of a stock solution of nutrients is automatically added by a metering pump to bring the EC back into the operating range [Fortson, 1992]. This paper describes a system developed at Tuskegee University for controlling the EC of a nutrient solution used for growing sweetpotatoes with an EC controller and a computer running LabView data acquisition and instrumentation software. It also describes the preliminary data obtained from the growth of sweetpotatoes using this prototype control system.
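
    The BPC dosing rule described above amounts to simple on/off control, sketched below. The operating band comes from the text; the uptake drift and dose size are invented for illustration.

```python
# Minimal sketch of on/off EC control like the BPC rule described above:
# when measured EC falls below the operating band (nutrient uptake by the
# plants dilutes the solution), a fixed dose of stock solution is metered in.
# The uptake model and dose size are invented for illustration.

EC_LOW, EC_HIGH = 120.0, 130.0      # operating band, mS/m (from the text)
DOSE_EC_GAIN = 4.0                  # EC rise per metered dose, mS/m (assumed)

ec = 128.0
for hour in range(12):
    ec -= 1.5                       # plants take up nutrients, EC drifts down (assumed)
    if ec < EC_LOW:
        ec += DOSE_EC_GAIN          # metering pump adds stock solution
        print(f"hour {hour:2d}: dosed, EC back to {ec:.1f} mS/m")
    else:
        print(f"hour {hour:2d}: EC = {ec:.1f} mS/m")
```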

  12. Computationally mapping sequence space to understand evolutionary protein engineering.

    PubMed

    Armstrong, Kathryn A; Tidor, Bruce

    2008-01-01

    Evolutionary protein engineering has been dramatically successful, producing a wide variety of new proteins with altered stability, binding affinity, and enzymatic activity. However, the success of such procedures is often unreliable, and the impact of the choice of protein, engineering goal, and evolutionary procedure is not well understood. We have created a framework for understanding aspects of the protein engineering process by computationally mapping regions of feasible sequence space for three small proteins using structure-based design protocols. We then tested the ability of different evolutionary search strategies to explore these sequence spaces. The results point to a non-intuitive relationship between the error-prone PCR mutation rate and the number of rounds of replication. The evolutionary relationships among feasible sequences reveal hub-like sequences that serve as particularly fruitful starting sequences for evolutionary search. Moreover, genetic recombination procedures were examined, and tradeoffs relating sequence diversity and search efficiency were identified. This framework allows us to consider the impact of protein structure on the allowed sequence space and therefore on the challenges that each protein presents to error-prone PCR and genetic recombination procedures.

  13. Li+ solvation and kinetics of Li+–BF4−/PF6− ion pairs in ethylene carbonate. A molecular dynamics study with classical rate theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Tsun-Mei; Dang, Liem X.

    Using our polarizable force-field models and employing classical rate theories of chemical reactions, we examine the ethylene carbonate (EC) exchange process between the first and second solvation shells around Li+ and the dissociation kinetics of the Li+-[BF4] and Li+-[PF6] ion pairs in this solvent. We calculate the exchange rates using transition state theory and correct them with transmission coefficients computed by the reactive flux; Impey, Madden, and McDonald approaches; and Grote-Hynes theory. We found that the residence times of EC around Li+ ions varied from 70 to 450 ps, depending on the correction method used. We found that the relaxation times changed significantly from Li+-[BF4] to Li+-[PF6] ion pairs in EC. Our results also show that, in addition to affecting the free energy of dissociation in EC, the anion type also significantly influences the dissociation kinetics of ion pairing. This work was supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. The calculations were carried out using computer resources provided by the Office of Basic Energy Sciences.

  14. Challenges to "Classic" Esophageal Candidiasis: Looks Are Usually Deceiving.

    PubMed

    Alsomali, Mohammed I; Arnold, Michael A; Frankel, Wendy L; Graham, Rondell P; Hart, Phil A; Lam-Himlin, Dora M; Naini, Bita V; Voltaggio, Lysandra; Arnold, Christina A

    2017-01-01

    We undertook the first case control study of histologically confirmed esophageal candidiasis (EC). A computer search from July 2012 through February 2015 identified 1,011 esophageal specimens, including 40 cases of EC and 20 controls. The EC incidence was 5.2%; it was associated with immunosuppression and endoscopic white plaques and breaks. Smoking was a predisposing factor, and alcohol was protective. EC had no unique symptoms, and 54% of endoscopic reports did not suspect EC. Important histologic clues included superficial and detached fragments of desquamated and hyper-pink parakeratosis, acute inflammation, intraepithelial lymphocytosis, dead keratinocytes, and bacterial overgrowth. Thirty percent had no neutrophilic infiltrate. Pseudohyphae were seen on H&E in 92.5% (n = 37/40). "Upfront" periodic acid-Schiff with diastase (PAS/D) on all esophageal specimens would have generated $68,333.49 in patient charges. Our targeted PAS/D strategy resulted in $13,044.87 in patient charges (cost saving = 80.9%, $55,288.62). We describe the typical morphology of EC and recommend limiting PAS/D to cases where the organisms are not readily identifiable on H&E and with at least one of the following: (1) ulcer, (2) suspicious morphology, and/or (3) clinical impression of EC. © American Society for Clinical Pathology, 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  15. Microcomputers, Evaluation, Literacy: Will the Teacher Survive?

    ERIC Educational Resources Information Center

    Hofmann, Richard J., Ed.

    1982-01-01

    The development of computer technology is considered, the concept of computer literacy is defined, and the role of teachers in educational microcomputer programs is discussed. The field of commercially produced software for microcomputers is reviewed. (For related articles, see EC 142 959-962.) (Author)

  16. Computer Science in High School Graduation Requirements. ECS Education Trends (Updated)

    ERIC Educational Resources Information Center

    Zinth, Jennifer

    2016-01-01

    Allowing high school students to fulfill a math or science high school graduation requirement via a computer science credit may encourage more students to pursue computer science coursework. This Education Trends report is an update to the original report released in April 2015 and explores state policies that allow or require districts to apply…

  17. Enabling Customization through Web Development: An Iterative Study of the Dell Computer Corporation Website

    ERIC Educational Resources Information Center

    Liu, Chang; Mackie, Brian G.

    2008-01-01

    Throughout the last decade, companies have increased their investment in electronic commerce (EC) by developing and implementing Web-based applications on the Internet. This paper describes a class project to develop a customized computer website which is similar to Dell Computer Corporation's (Dell) website. The objective of this project is to…

  18. Stochastic Evolutionary Algorithms for Planning Robot Paths

    NASA Technical Reports Server (NTRS)

    Fink, Wolfgang; Aghazarian, Hrand; Huntsberger, Terrance; Terrile, Richard

    2006-01-01

    A computer program implements stochastic evolutionary algorithms for planning and optimizing collision-free paths for robots and their jointed limbs. Stochastic evolutionary algorithms can be made to produce acceptably close approximations to exact, optimal solutions for path-planning problems while often demanding much less computation than do the exhaustive-search and deterministic inverse-kinematics algorithms that have been used previously for this purpose. Hence, the present software is better suited for application aboard robots having limited computing capabilities. The stochastic aspect lies in the use of simulated annealing to (1) prevent trapping of an optimization algorithm in local minima of an energy-like error measure by which the fitness of a trial solution is evaluated, while (2) ensuring that the entire multidimensional configuration and parameter space of the path-planning problem is sampled efficiently with respect to both robot joint angles and computation time. Simulated annealing is an established technique for avoiding local minima in multidimensional optimization problems but has not, until now, been applied to planning collision-free robot paths by use of low-power computers.
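
    A generic simulated-annealing skeleton of the kind described is sketched below with a toy one-dimensional energy; in the actual software the energy-like error measure scores joint-angle configurations for collision risk and path cost.

```python
import math
import random

# Generic simulated-annealing skeleton. The toy energy stands in for the
# path-planning software's energy-like error measure over joint angles,
# which is not reproduced here.

def anneal(energy, x0, step=0.5, t0=1.0, cooling=0.995, iters=5000, seed=3):
    rnd = random.Random(seed)
    x, e, t = x0, energy(x0), t0
    best_x, best_e = x, e
    for _ in range(iters):
        x_new = x + rnd.uniform(-step, step)
        e_new = energy(x_new)
        # Accept worse moves with probability exp(-dE/T) to escape local minima.
        if e_new < e or rnd.random() < math.exp((e - e_new) / t):
            x, e = x_new, e_new
            if e < best_e:
                best_x, best_e = x, e
        t *= cooling                  # geometric cooling schedule
    return best_x, best_e

# Toy multimodal energy with many local minima; SA should find x near 0.
print(anneal(lambda x: x * x + 3.0 * math.sin(5.0 * x) ** 2, x0=8.0))
```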

  19. Biomimetic design processes in architecture: morphogenetic and evolutionary computational design.

    PubMed

    Menges, Achim

    2012-03-01

    Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies.

  20. Evaluative conditioning: A brief computer-delivered intervention to reduce college student drinking.

    PubMed

    Tello, Nina; Bocage-Barthélémy, Yvana; Dandaba, Meira; Jaafari, Nematollah; Chatard, Armand

    2018-07-01

    Recent research suggests that a brief computer-delivered intervention based on evaluative conditioning (EC) can change the implicit evaluation of alcohol and reduce drinking behaviors among college students. We tested whether we could obtain similar findings in a high-powered preregistered study and whether hazardous drinking moderates these effects. Before the intervention, 122 French college students were screened for hazardous drinking using the Alcohol Use Disorder Identification Test (AUDIT). Implicit evaluation of alcohol was assessed before and immediately after the intervention using an Implicit Association Test (IAT). Drinking behavior was assessed before the intervention and approximately two weeks after using the Timeline Followback (TLFB) method. The EC consisted of 120 trials of words (related to alcoholic beverages, soft drinks or neutral) paired with pictures (neutral, positive or negative). In the EC condition, alcohol-related words were systematically paired with negative pictures. In the control condition, alcohol-related words were systematically paired with neutral pictures. The EC did not change the implicit evaluation of alcohol, Cohen's d = 0.01, 95% CI [-0.35, 0.35]. However, the EC reduced drinking behavior, Cohen's d = 0.37, 95% CI [0.01, 0.72]. This effect was independent of hazardous drinking behavior, but it was especially pronounced among participants with the most positive implicit evaluation of alcohol before the intervention. This preregistered study suggests that evaluative conditioning can successfully reduce drinking behavior among college students by 31% (compared to 4% in the control condition) without causing an immediate change in the implicit evaluation of alcohol. Copyright © 2018 Elsevier Ltd. All rights reserved.
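
    For readers less familiar with the effect sizes reported here, Cohen's d for two independent groups is the mean difference scaled by the pooled standard deviation. The sketch below shows that standard computation; the drink counts are invented for illustration and are not the study's data.

    ```python
    import statistics as st

    def cohens_d(x, y):
        """Cohen's d for two independent samples, using the pooled standard deviation."""
        nx, ny = len(x), len(y)
        pooled_var = ((nx - 1) * st.variance(x) + (ny - 1) * st.variance(y)) / (nx + ny - 2)
        return (st.mean(x) - st.mean(y)) / pooled_var ** 0.5

    control = [12, 9, 14, 10, 11, 13]   # invented two-week drink counts
    ec_group = [8, 7, 10, 6, 9, 8]
    print(round(cohens_d(control, ec_group), 2))
    ```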

  1. Accurate motor mapping in awake common marmosets using micro-electrocorticographical stimulation and stochastic threshold estimation

    NASA Astrophysics Data System (ADS)

    Kosugi, Akito; Takemi, Mitsuaki; Tia, Banty; Castagnola, Elisa; Ansaldo, Alberto; Sato, Kenta; Awiszus, Friedemann; Seki, Kazuhiko; Ricci, Davide; Fadiga, Luciano; Iriki, Atsushi; Ushiba, Junichi

    2018-06-01

    Objective. The motor map has been widely used as an indicator of motor skills and learning, cortical injury, plasticity, and functional recovery. Cortical stimulation mapping using epidural electrodes has recently been adopted for animal studies. However, several technical limitations remain. The test-retest reliability of epidural cortical stimulation (ECS) mapping has not been examined in detail. Many previous studies defined evoked movements and motor thresholds by visual inspection and thus lacked quantitative measurements. A reliable and quantitative motor map is important to elucidate the mechanisms of motor cortical reorganization. The objective of the current study was to perform reliable ECS mapping of motor representations based on the motor thresholds, which were stochastically estimated by motor evoked potentials and chronically implanted micro-electrocorticographical (µECoG) electrode arrays, in common marmosets. Approach. ECS was applied using the implanted µECoG electrode arrays in three adult common marmosets under awake conditions. Motor evoked potentials were recorded through electromyographical electrodes implanted in upper limb muscles. The motor threshold was calculated through a modified maximum likelihood threshold-hunting algorithm fitted with the recorded data from marmosets. Further, a computer simulation confirmed the reliability of the algorithm. Main results. Computer simulation suggested that the modified maximum likelihood threshold-hunting algorithm estimated the motor threshold with acceptable precision. In vivo ECS mapping showed high test-retest reliability with respect to the excitability and location of the cortical forelimb motor representations. Significance. Using implanted µECoG electrode arrays and a modified motor threshold-hunting algorithm, we were able to achieve reliable motor mapping in common marmosets with the ECS system.
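
    The stochastic threshold estimation at the heart of this approach can be illustrated with a much simpler stand-in: a grid-search maximum-likelihood fit of a logistic response curve to binary stimulation outcomes. The authors' modified threshold-hunting algorithm is adaptive and more sophisticated; the trial data, fixed slope, and grid below are invented for the sketch.

    ```python
    import math

    def log_likelihood(threshold, slope, trials):
        """Log-likelihood of binary MEP outcomes under a logistic response curve."""
        ll = 0.0
        for intensity, evoked in trials:
            p = 1.0 / (1.0 + math.exp(-slope * (intensity - threshold)))
            p = min(max(p, 1e-9), 1.0 - 1e-9)           # guard against log(0)
            ll += math.log(p) if evoked else math.log(1.0 - p)
        return ll

    # (stimulation intensity, MEP evoked?) -- invented observations
    trials = [(0.4, 0), (0.6, 0), (0.7, 0), (0.75, 1), (0.8, 1), (0.9, 1), (1.0, 1)]
    grid = [v / 100.0 for v in range(30, 121)]          # candidate thresholds
    best = max(grid, key=lambda th: log_likelihood(th, slope=15.0, trials=trials))
    print(f"estimated motor threshold ~ {best:.2f}")
    ```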

  2. Accurate motor mapping in awake common marmosets using micro-electrocorticographical stimulation and stochastic threshold estimation.

    PubMed

    Kosugi, Akito; Takemi, Mitsuaki; Tia, Banty; Castagnola, Elisa; Ansaldo, Alberto; Sato, Kenta; Awiszus, Friedemann; Seki, Kazuhiko; Ricci, Davide; Fadiga, Luciano; Iriki, Atsushi; Ushiba, Junichi

    2018-06-01

    The motor map has been widely used as an indicator of motor skills and learning, cortical injury, plasticity, and functional recovery. Cortical stimulation mapping using epidural electrodes has recently been adopted for animal studies. However, several technical limitations remain. The test-retest reliability of epidural cortical stimulation (ECS) mapping has not been examined in detail. Many previous studies defined evoked movements and motor thresholds by visual inspection and thus lacked quantitative measurements. A reliable and quantitative motor map is important to elucidate the mechanisms of motor cortical reorganization. The objective of the current study was to perform reliable ECS mapping of motor representations based on the motor thresholds, which were stochastically estimated by motor evoked potentials and chronically implanted micro-electrocorticographical (µECoG) electrode arrays, in common marmosets. ECS was applied using the implanted µECoG electrode arrays in three adult common marmosets under awake conditions. Motor evoked potentials were recorded through electromyographical electrodes implanted in upper limb muscles. The motor threshold was calculated through a modified maximum likelihood threshold-hunting algorithm fitted with the recorded data from marmosets. Further, a computer simulation confirmed the reliability of the algorithm. Computer simulation suggested that the modified maximum likelihood threshold-hunting algorithm estimated the motor threshold with acceptable precision. In vivo ECS mapping showed high test-retest reliability with respect to the excitability and location of the cortical forelimb motor representations. Using implanted µECoG electrode arrays and a modified motor threshold-hunting algorithm, we were able to achieve reliable motor mapping in common marmosets with the ECS system.

  3. Inference of Evolutionary Jumps in Large Phylogenies using Lévy Processes.

    PubMed

    Duchen, Pablo; Leuenberger, Christoph; Szilágyi, Sándor M; Harmon, Luke; Eastman, Jonathan; Schweizer, Manuel; Wegmann, Daniel

    2017-11-01

    Although it is now widely accepted that the rate of phenotypic evolution may not necessarily be constant across large phylogenies, the frequency and phylogenetic position of periods of rapid evolution remain unclear. In his highly influential view of evolution, G. G. Simpson supposed that such evolutionary jumps occur when organisms transition into so-called new adaptive zones, for instance after dispersal into a new geographic area, after rapid climatic changes, or following the appearance of an evolutionary novelty. Only recently, large, accurate and well calibrated phylogenies have become available that allow testing this hypothesis directly, yet inferring evolutionary jumps remains computationally very challenging. Here, we develop a computationally highly efficient algorithm to accurately infer the rate and strength of evolutionary jumps as well as their phylogenetic location. Following previous work, we model evolutionary jumps as a compound process, but introduce a novel approach to sample jump configurations that does not require matrix inversions and thus naturally scales to large trees. We then make use of this development to infer evolutionary jumps in Anolis lizards and Loriini parrots, where we find strong signal for such jumps at the base of clades that transitioned into new adaptive zones, just as postulated by Simpson's hypothesis. [evolutionary jump; Lévy process; phenotypic evolution; punctuated equilibrium; quantitative traits.] The Author(s) 2017. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
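
    The jump model itself is easy to build intuition for: trait change along a branch is gradual Brownian motion punctuated by rare, large displacements from a compound Poisson process. The sketch below simulates that process forward along a single branch; the rates and magnitudes are invented, and this is a forward simulation, not the authors' inference algorithm.

    ```python
    import random

    def simulate_branch(t, sigma=0.1, jump_rate=0.05, jump_sd=2.0, dt=0.01):
        """Trait value after time t: Brownian motion plus Poisson-driven jumps."""
        x = 0.0
        for _ in range(int(t / dt)):
            x += random.gauss(0.0, sigma * dt ** 0.5)   # gradual change
            if random.random() < jump_rate * dt:        # rare evolutionary jump
                x += random.gauss(0.0, jump_sd)
        return x

    random.seed(1)
    print([round(simulate_branch(50.0), 2) for _ in range(5)])
    ```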

  4. Big cat phylogenies, consensus trees, and computational thinking.

    PubMed

    Sul, Seung-Jin; Williams, Tiffani L

    2011-07-01

    Phylogenetics seeks to deduce the pattern of relatedness between organisms by using a phylogeny or evolutionary tree. For a given set of organisms or taxa, there may be many evolutionary trees depicting how these organisms evolved from a common ancestor. As a result, consensus trees are a popular approach for summarizing the shared evolutionary relationships in a group of trees. We examine these consensus techniques by studying how the pantherine lineage of cats (clouded leopard, jaguar, leopard, lion, snow leopard, and tiger) evolved, which is hotly debated. While there are many phylogenetic resources that describe consensus trees, there is very little information, written for biologists, regarding the underlying computational techniques for building them. The pantherine cats provide us with a small, relevant example to explore the computational techniques (such as sorting numbers, hashing functions, and traversing trees) for constructing consensus trees. Our hope is that life scientists enjoy peeking under the computational hood of consensus tree construction and share their positive experiences with others in their community.
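
    The computational core the authors peek under is simple to state: reduce each input tree to its bipartitions (the taxon set on one side of each internal edge), hash and count them, and keep those appearing in a majority of trees. The sketch below uses Python frozensets as a stand-in for the hashing functions the article discusses; the bipartition data are invented toy values.

    ```python
    from collections import Counter

    # Each input tree reduced to its non-trivial bipartitions (toy pantherine data)
    trees = [
        {frozenset({"lion", "leopard"}), frozenset({"lion", "leopard", "jaguar"})},
        {frozenset({"lion", "leopard"}), frozenset({"snow_leopard", "tiger"})},
        {frozenset({"lion", "leopard"}), frozenset({"lion", "leopard", "jaguar"})},
    ]

    counts = Counter(bp for tree in trees for bp in tree)
    majority_rule = [bp for bp, n in counts.items() if n > len(trees) / 2]
    for bp in majority_rule:
        print(sorted(bp))   # bipartitions in >50% of trees form the consensus tree
    ```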

  5. MEGA5: Molecular Evolutionary Genetics Analysis Using Maximum Likelihood, Evolutionary Distance, and Maximum Parsimony Methods

    PubMed Central

    Tamura, Koichiro; Peterson, Daniel; Peterson, Nicholas; Stecher, Glen; Nei, Masatoshi; Kumar, Sudhir

    2011-01-01

    Comparative analysis of molecular sequence data is essential for reconstructing the evolutionary histories of species and inferring the nature and extent of selective forces shaping the evolution of genes and species. Here, we announce the release of Molecular Evolutionary Genetics Analysis version 5 (MEGA5), which is user-friendly software for mining online databases, building sequence alignments and phylogenetic trees, and using methods of evolutionary bioinformatics in basic biology, biomedicine, and evolution. The newest addition in MEGA5 is a collection of maximum likelihood (ML) analyses for inferring evolutionary trees, selecting best-fit substitution models (nucleotide or amino acid), inferring ancestral states and sequences (along with probabilities), and estimating evolutionary rates site-by-site. In computer simulation analyses, ML tree inference algorithms in MEGA5 compared favorably with other software packages in terms of computational efficiency and the accuracy of the estimates of phylogenetic trees, substitution parameters, and rate variation among sites. The MEGA user interface has now been enhanced to be activity driven, making it easier to use for both beginners and experienced scientists. This version of MEGA is intended for the Windows platform, and it has been configured for effective use on Mac OS X and Linux desktops. It is available free of charge from http://www.megasoftware.net. PMID:21546353

  6. Informatics in Radiology: Dual-Energy Electronic Cleansing for Fecal-Tagging CT Colonography

    PubMed Central

    Kim, Se Hyung; Lee, June-Goo; Yoshida, Hiroyuki

    2013-01-01

    Electronic cleansing (EC) is an emerging technique for the removal of tagged fecal materials at fecal-tagging computed tomographic (CT) colonography. However, existing EC methods may generate various types of artifacts that severely impair the quality of the cleansed CT colonographic images. Dual-energy fecal-tagging CT colonography is regarded as a next-generation imaging modality. EC that makes use of dual-energy fecal-tagging CT colonographic images promises to be effective in reducing cleansing artifacts by means of applying the material decomposition capability of dual-energy CT. The dual-energy index (DEI), which is calculated from the relative change in the attenuation values of a material at two different photon energies, is a reliable and effective indicator for differentiating tagged fecal materials from various types of tissues on fecal-tagging CT colonographic images. A DEI-based dual-energy EC scheme uses the DEI to help differentiate the colonic lumen—including the luminal air, tagged fecal materials, and air-tagging mixture—from the colonic soft-tissue structures, and then segments the entire colonic lumen for cleansing of the tagged fecal materials. As a result, dual-energy EC can help identify partial-volume effects in the air-tagging mixture and inhomogeneous tagging in residual fecal materials, the major causes of EC artifacts. This technique has the potential to significantly improve the quality of EC and promises to provide images of a cleansed colon that are free of the artifacts commonly observed with conventional single-energy EC methods. © RSNA, 2013 PMID:23479680
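
    As a worked example of the index: one common definition of the DEI for an 80/140 kVp pair normalizes the attenuation difference so that water yields zero. The exact definition and decision thresholds vary between implementations, and the voxel values below are invented.

    ```python
    def dual_energy_index(hu_low, hu_high):
        """DEI = (HU_low - HU_high) / (HU_low + HU_high + 2000); water -> 0."""
        return (hu_low - hu_high) / (hu_low + hu_high + 2000.0)

    print(dual_energy_index(400.0, 250.0))   # iodine-tagged material: clearly positive
    print(dual_energy_index(60.0, 55.0))     # soft tissue: DEI near zero
    ```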

  7. A Cloud-Based Simulation Architecture for Pandemic Influenza Simulation

    PubMed Central

    Eriksson, Henrik; Raciti, Massimiliano; Basile, Maurizio; Cunsolo, Alessandro; Fröberg, Anders; Leifler, Ola; Ekberg, Joakim; Timpka, Toomas

    2011-01-01

    High-fidelity simulations of pandemic outbreaks are resource-consuming. Cluster-based solutions have been suggested for executing such complex computations. We present a cloud-based simulation architecture that utilizes computing resources both locally available and dynamically rented online. The approach uses the Condor framework for job distribution and management of the Amazon Elastic Computing Cloud (EC2) as well as local resources. The architecture has a web-based user interface that allows users to monitor and control simulation execution. In a benchmark test, the best cost-adjusted performance was recorded for the EC2 High-CPU Medium instance, while a field trial showed that the job configuration had significant influence on the execution time and that the network capacity of the master node could become a bottleneck. We conclude that it is possible to develop a scalable simulation environment that uses cloud-based solutions, while providing an easy-to-use graphical user interface. PMID:22195089

  8. Molecular Evolution of Ultraspiracle Protein (USP/RXR) in Insects

    PubMed Central

    Hult, Ekaterina F.; Tobe, Stephen S.; Chang, Belinda S. W.

    2011-01-01

    Ultraspiracle protein/retinoid X receptor (USP/RXR) is a nuclear receptor and transcription factor which is an essential component of a heterodimeric receptor complex with the ecdysone receptor (EcR). In insects this complex binds ecdysteroids and plays an important role in the regulation of growth, development, metamorphosis and reproduction. In some holometabolous insects, including Lepidoptera and Diptera, USP/RXR is thought to have experienced several important shifts in function. These include the acquisition of novel ligand-binding properties and an expanded dimerization interface with EcR. In light of these recent hypotheses, we implemented codon-based likelihood methods to investigate if the proposed shifts in function are reflected in changes in site-specific evolutionary rates across functional and structural motifs in insect USP/RXR sequences, and if there is any evidence for positive selection at functionally important sites. Our results reveal evidence of positive selection acting on sites within the loop connecting helices H1 and H3, the ligand-binding pocket, and the dimer interface in the holometabolous lineage leading to the Lepidoptera/Diptera/Trichoptera. Similar analyses conducted using EcR sequences did not indicate positive selection. However, analyses allowing for variation across sites demonstrated elevated non-synonymous/synonymous rate ratios (dN/dS), suggesting relaxed constraint, within the dimerization interface of both USP/RXR and EcR as well as within the coactivator binding groove and helix H12 of USP/RXR. Since the above methods are based on the assumption that dS is constant among sites, we also used more recent models which relax this assumption and obtained results consistent with traditional random-sites models. Overall, our findings support the evolution of novel function in USP/RXR of more derived holometabolous insects, and are consistent with shifts in structure and function which may have increased USP/RXR reliance on EcR for cofactor recruitment. Moreover, these findings raise important questions regarding hypotheses which suggest the independent activation of USP/RXR by its own ligand. PMID:21901121
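
    The rate ratio reported above has a simple counting interpretation, sketched below with invented counts. Real analyses (including the codon-based likelihood methods the authors use) model substitution probabilities site by site rather than counting this crudely.

    ```python
    def dn_ds(nonsyn_subs, syn_subs, nonsyn_sites, syn_sites):
        """Crude counting estimate of omega = dN/dS (no multiple-hit correction)."""
        return (nonsyn_subs / nonsyn_sites) / (syn_subs / syn_sites)

    omega = dn_ds(nonsyn_subs=12, syn_subs=30, nonsyn_sites=600, syn_sites=200)
    print(round(omega, 3))   # omega > 1 suggests positive selection; < 1, purifying
    ```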

  9. Computer Science in High School Graduation Requirements. ECS Education Trends

    ERIC Educational Resources Information Center

    Zinth, Jennifer Dounay

    2015-01-01

    Computer science and coding skills are widely recognized as a valuable asset in the current and projected job market. The Bureau of Labor Statistics projects 37.5 percent growth from 2012 to 2022 in the "computer systems design and related services" industry--from 1,620,300 jobs in 2012 to an estimated 2,229,000 jobs in 2022. Yet some…

  10. Advanced communications technology satellite high burst rate link evaluation terminal experiment control and monitor software user's guide, version 1.0

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    1992-01-01

    The Experiment Control and Monitor (EC&M) software was developed at NASA Lewis Research Center to support the Advanced Communications Technology Satellite (ACTS) High Burst Rate Link Evaluation Terminal (HBR-LET). The HBR-LET is an experimenter's terminal to communicate with the ACTS for various investigations by government agencies, universities, and industry. The EC&M software is one segment of the Control and Performance Monitoring (C&PM) software system of the HBR-LET. The EC&M software allows users to initialize, control, and monitor the instrumentation within the HBR-LET using a predefined sequence of commands. Besides instrument control, the C&PM software system is also responsible for computer communication between the HBR-LET and the ACTS NASA Ground Station and for uplink power control of the HBR-LET to demonstrate power augmentation during rain fade events. The EC&M Software User's Guide, Version 1.0 (NASA-CR-189160) outlines the commands required to install and operate the EC&M software. Input and output file descriptions, operator commands, and error recovery procedures are discussed in the document.

  11. Cloud computing for comparative genomics

    PubMed Central

    2010-01-01

    Background: Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. Results: We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. Conclusions: The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems. PMID:20482786
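
    The reported figures imply some useful back-of-envelope rates; the arithmetic below simply restates the abstract's numbers (treating "more than 300,000" as 300,000).

    ```python
    processes = 300_000        # RSD-cloud processes
    nodes, hours = 100, 70     # compute nodes and total wall-clock time
    total_cost_usd = 6302

    print(round(total_cost_usd / processes, 4))        # ~$0.021 per ortholog process
    print(round(total_cost_usd / (nodes * hours), 2))  # ~$0.90 per node-hour
    ```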

  12. Cloud computing for comparative genomics.

    PubMed

    Wall, Dennis P; Kudtarkar, Parul; Fusaro, Vincent A; Pivovarov, Rimma; Patil, Prasad; Tonellato, Peter J

    2010-05-18

    Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.

  13. Evaluating the Efficacy of the Cloud for Cluster Computation

    NASA Technical Reports Server (NTRS)

    Knight, David; Shams, Khawaja; Chang, George; Soderstrom, Tom

    2012-01-01

    Computing requirements vary by industry, and it follows that NASA and other research organizations have computing demands that fall outside the mainstream. While cloud computing made rapid inroads for tasks such as powering web applications, performance issues on highly distributed tasks hindered early adoption for scientific computation. One venture to address this problem is Nebula, NASA's homegrown cloud project tasked with delivering science-quality cloud computing resources. However, another industry development is Amazon's high-performance computing (HPC) instances on Elastic Cloud Compute (EC2) that promises improved performance for cluster computation. This paper presents results from a series of benchmarks run on Amazon EC2 and discusses the efficacy of current commercial cloud technology for running scientific applications across a cluster. In particular, a 240-core cluster of cloud instances achieved 2 TFLOPS on High-Performance Linpack (HPL) at 70% of theoretical computational performance. The cluster's local network also demonstrated sub-100 μs inter-process latency with sustained inter-node throughput in excess of 8 Gbps. Beyond HPL, a real-world Hadoop image processing task from NASA's Lunar Mapping and Modeling Project (LMMP) was run on a 29 instance cluster to process lunar and Martian surface images with sizes on the order of tens of gigapixels. These results demonstrate that while not a rival of dedicated supercomputing clusters, commercial cloud technology is now a feasible option for moderately demanding scientific workloads.
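
    The 70%-of-peak figure pins down the cluster's theoretical performance, as the short calculation below shows using only numbers quoted in the abstract.

    ```python
    achieved_tflops = 2.0   # measured High-Performance Linpack result
    efficiency = 0.70       # fraction of theoretical peak achieved
    cores = 240

    peak_tflops = achieved_tflops / efficiency
    print(round(peak_tflops, 2))                    # ~2.86 TFLOPS theoretical peak
    print(round(peak_tflops * 1000 / cores, 1))     # ~11.9 GFLOPS peak per core
    ```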

  14. Regulatory RNA design through evolutionary computation and strand displacement.

    PubMed

    Rostain, William; Landrain, Thomas E; Rodrigo, Guillermo; Jaramillo, Alfonso

    2015-01-01

    The discovery and study of a vast number of regulatory RNAs in all kingdoms of life over the past decades has allowed the design of new synthetic RNAs that can regulate gene expression in vivo. Riboregulators, in particular, have been used to activate or repress gene expression. However, to accelerate and scale up the design process, synthetic biologists require computer-assisted design tools, without which riboregulator engineering will remain a case-by-case design process requiring expert attention. Recently, the design of RNA circuits by evolutionary computation and adapting strand displacement techniques from nanotechnology has proven to be suited to the automated generation of DNA sequences implementing regulatory RNA systems in bacteria. Herein, we present our method to carry out such evolutionary design and how to use it to create various types of riboregulators, allowing the systematic de novo design of genetic control systems in synthetic biology.
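
    The evolutionary design loop can be caricatured in a few lines. In the sketch below the fitness function is a stand-in: matches to a fixed target sequence, where a real riboregulator designer would score predicted secondary structure and strand-displacement energetics. Population size, mutation rate, and the target are invented.

    ```python
    import random

    BASES = "ACGU"

    def fitness(seq, target):
        """Stand-in objective: positions matching a target motif."""
        return sum(a == b for a, b in zip(seq, target))

    def evolve(target, pop_size=50, generations=200, mut_rate=0.05):
        pop = ["".join(random.choice(BASES) for _ in target) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda s: fitness(s, target), reverse=True)
            parents = pop[: pop_size // 2]             # truncation selection
            children = ["".join(random.choice(BASES) if random.random() < mut_rate else c
                                for c in p) for p in parents]
            pop = parents + children                   # elitism: parents survive
        return max(pop, key=lambda s: fitness(s, target))

    random.seed(0)
    print(evolve("ACGGAUACGUCCAUGG"))
    ```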

  15. Clinical Utility of Preoperative Computed Tomography in Patients With Endometrial Cancer.

    PubMed

    Bogani, Giorgio; Gostout, Bobbie S; Dowdy, Sean C; Multinu, Francesco; Casarin, Jvan; Cliby, William A; Frigerio, Luigi; Kim, Bohyun; Weaver, Amy L; Glaser, Gretchen E; Mariani, Andrea

    2017-10-01

    The aim of this study was to determine the clinical utility of routine preoperative pelvic and abdominal computed tomography (CT) examinations in patients with endometrial cancer (EC). We retrospectively reviewed records from patients with EC who underwent a preoperative endometrial biopsy and had surgery at our institution from January 1999 through December 2008. In the subset with an abdominal CT scan obtained within 3 months before surgery, we evaluated the clinical utility of the CT scan. Overall, 224 patients (18%) had a preoperative endometrial biopsy and an available CT scan. Gross intra-abdominal disease was observed in 10% and 20% of patients with preoperative diagnosis of endometrioid G3 and type II EC, respectively, whereas it was observed in less than 5% of patients with a preoperative diagnosis of hyperplasia or low-grade EC. When examining retroperitoneal findings, we observed that a negative CT scan of the pelvis did not exclude the presence of pelvic node metastasis. Alternatively, a negative CT scan in the para-aortic area generally reduced the probability of finding para-aortic dissemination but with an overall low sensitivity (42%). However, the sensitivity for para-aortic dissemination was as high as 67% in patients with G3 endometrioid cancer. In the case of negative para-aortic nodes in the CT scan, the risk of para-aortic node metastases decreased from 18.8% to 7.5% in patients with endometrioid G3 EC. Up to 15% of patients with endometrioid G3 cancer had clinically relevant incidental findings that necessitated medical or surgical intervention. In patients with endometrioid G3 and type II EC diagnosed by the preoperative biopsy, CT scans may help guide the operative plan by facilitating preoperative identification of gross intra-abdominal disease and enlarged positive para-aortic nodes that are not detectable during physical examinations. In addition, CT may reveal other clinically relevant incidental findings.

  16. Genomicus 2018: karyotype evolutionary trees and on-the-fly synteny computing

    PubMed Central

    Nguyen, Nga Thi Thuy; Vincens, Pierre

    2018-01-01

    Since 2010, the Genomicus web server is available online at http://genomicus.biologie.ens.fr/genomicus. This graphical browser provides access to comparative genomic analyses in four different phyla (Vertebrate, Plants, Fungi, and non vertebrate Metazoans). Users can analyse genomic information from extant species, as well as ancestral gene content and gene order for vertebrates and flowering plants, in an integrated evolutionary context. New analyses and visualization tools have recently been implemented in Genomicus Vertebrate. Karyotype structures from several genomes can now be compared along an evolutionary pathway (Multi-KaryotypeView), and synteny blocks can be computed and visualized between any two genomes (PhylDiagView). PMID:29087490

  17. HPC: Rent or Buy

    ERIC Educational Resources Information Center

    Fredette, Michelle

    2012-01-01

    "Rent or buy?" is a question people ask about everything from housing to textbooks. It is also a question universities must consider when it comes to high-performance computing (HPC). With the advent of Amazon's Elastic Compute Cloud (EC2), Microsoft Windows HPC Server, Rackspace's OpenStack, and other cloud-based services, researchers now have…

  18. Non-Evolutionary Algorithms for Scheduling Dependent Tasks in Distributed Heterogeneous Computing Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wayne F. Boyer; Gurdeep S. Hura

    2005-09-01

    The problem of obtaining an optimal matching and scheduling of interdependent tasks in distributed heterogeneous computing (DHC) environments is well known to be NP-hard. In a DHC system, task execution time depends on the machine to which a task is assigned, and task precedence constraints are represented by a directed acyclic graph. Recent research in evolutionary techniques has shown that genetic algorithms usually obtain more efficient schedules than other known algorithms. We propose a non-evolutionary random scheduling (RS) algorithm for efficient matching and scheduling of inter-dependent tasks in a DHC system. RS is a succession of randomized task orderings and a heuristic mapping from task order to schedule. Randomized task ordering is effectively a topological sort where the outcome may be any possible task order for which the task precedence constraints are maintained. A detailed comparison to existing evolutionary techniques (GA and PSGA) shows the proposed algorithm is less complex than evolutionary techniques, computes schedules in less time, requires less memory and fewer tuning parameters. Simulation results show that the average schedules produced by RS are approximately as efficient as PSGA schedules for all cases studied and clearly more efficient than PSGA for certain cases. The standard formulation for the scheduling problem addressed in this paper is Rm|prec|Cmax.
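
    The randomized task ordering at the core of RS is a topological sort that picks uniformly among the currently ready tasks, so any precedence-respecting order can be generated. Below is a minimal sketch (Kahn's algorithm with a random choice of the next ready task); the DAG is invented for illustration.

    ```python
    import random

    def random_topological_order(succ):
        """Random topological sort of a task DAG given successor sets."""
        indeg = {t: 0 for t in succ}
        for outs in succ.values():
            for t in outs:
                indeg[t] += 1
        ready = [t for t, d in indeg.items() if d == 0]
        order = []
        while ready:
            task = ready.pop(random.randrange(len(ready)))   # any ready task
            order.append(task)
            for t in succ[task]:
                indeg[t] -= 1
                if indeg[t] == 0:
                    ready.append(t)
        return order

    dag = {"A": {"B", "C"}, "B": {"D"}, "C": {"D"}, "D": set()}  # task -> dependents
    print(random_topological_order(dag))   # e.g. ['A', 'C', 'B', 'D']
    ```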

  19. Coculturing with endothelial cells promotes in vitro maturation and electrical coupling of human embryonic stem cell-derived cardiomyocytes.

    PubMed

    Pasquier, Jennifer; Gupta, Renuka; Rioult, Damien; Hoarau-Véchot, Jessica; Courjaret, Raphael; Machaca, Khaled; Al Suwaidi, Jassim; Stanley, Edouard G; Rafii, Shahin; Elliott, David A; Abi Khalil, Charbel; Rafii, Arash

    2017-06-01

    Pluripotent human embryonic stem cells (hESC) are a promising source of repopulating cardiomyocytes. We hypothesized that we could improve maturation of cardiomyocytes and facilitate electrical interconnections by creating a model that more closely resembles heart tissue; that is, containing both endothelial cells (ECs) and cardiomyocytes. We induced cardiomyocyte differentiation in the coculture of an hESC line expressing the cardiac reporter NKX2.5-green fluorescent protein (GFP), and an Akt-activated EC line (E4 + ECs). We quantified spontaneous beating rates, synchrony, and coordination between different cardiomyocyte clusters using confocal imaging of Fura Red-detected calcium transients and computer-assisted image analysis. After 8 days in culture, 94% ± 6% of the NKX2-5GFP + cells were beating when hESCs embryonic bodies were plated on E4 + ECs compared with 34% ± 12.9% for controls consisting of hESCs cultured on BD Matrigel (BD Biosciences) without ECs at Day 11 in culture. The spatial organization of beating areas in cocultures was different. The GFP + cardiomyocytes were close to the E4 + ECs. The average beats/min of the cardiomyocytes in coculture was faster and closer to physiologic heart rates compared with controls (50 ± 14 [n = 13] vs 25 ± 9 [n = 8]; p < 0.05). The coculture with ECs led to synchronized beating relying on the endothelial network, as illustrated by the loss of synchronization upon the disruption of endothelial bridges. The coculturing of differentiating cardiomyocytes with Akt-activated ECs but not EC-conditioned media results in (1) improved efficiency of the cardiomyocyte differentiation protocol and (2) increased maturity leading to better intercellular coupling with improved chronotropy and synchrony. Copyright © 2017. Published by Elsevier Inc.

  20. Synchronous second primary cancers in patients with squamous esophageal cancer: clinical features and survival outcome.

    PubMed

    Lee, Jin Seo; Ahn, Ji Yong; Choi, Kee Don; Song, Ho June; Kim, Yong Hee; Lee, Gin Hyug; Jung, Hwoon-Yong; Ryu, Jin-Sook; Kim, Sung-Bae; Kim, Jong Hoon; Park, Seung-Il; Cho, Kyung-Ja; Kim, Jin-Ho

    2016-03-01

    Unexpected diagnosis of synchronous second primary cancers (SPC) complicates physicians' decision-making because clinical details of squamous esophageal cancer (EC) patients with SPC have been limited. We evaluated clinical features and treatment outcomes of patients with synchronous SPC detected during the initial staging of squamous EC. We identified a total of 317 consecutive patients diagnosed with squamous EC. Relevant clinical and cancer-specific information were reviewed retrospectively. EC patients with synchronous SPC were identified in 21 patients (6.6%). There were significant differences in median age (70 years vs. 63 years, p = 0.01), serum albumin level (3.3 g/dL vs. 3.9 g/dL, p < 0.01) and body mass index (20.4 kg/m(2) vs. 22.8 kg/m(2), p = 0.01) between EC patients with and without SPC. Head and neck, lung and gastric cancers accounted for 18.2%, 22.7%, and 18.2% of SPC, respectively. Positron emission tomography-computed tomography (PET-CT) detected four cases (18.2%) of SPC that were missed on CT. Management plans were altered in 13 of 21 patients (61.9%) with detected SPC. Curative esophagectomy was attempted in 28.6% of EC patients with SPC (vs. 59.1% of patients without SPC; p = 0.006). EC patients with SPC had significantly lower 5-year survival than patients without SPC (10.6% vs. 36.7%, p = 0.008). Synchronous SPC were found in 6.6% of squamous EC patients, and PET-CT contributed substantially to the detection of synchronous SPC. EC patients with SPC had poor survival due to challenges of providing stage-appropriate treatment.

  1. Performance Management of High Performance Computing for Medical Image Processing in Amazon Web Services.

    PubMed

    Bao, Shunxing; Damon, Stephen M; Landman, Bennett A; Gokhale, Aniruddha

    2016-02-27

    Adopting high performance cloud computing for medical image processing is a popular trend given the pressing needs of large studies. Amazon Web Services (AWS) provide reliable, on-demand, and inexpensive cloud computing services. Our research objective is to implement an affordable, scalable and easy-to-use AWS framework for the Java Image Science Toolkit (JIST). JIST is a plugin for Medical-Image Processing, Analysis, and Visualization (MIPAV) that provides a graphical pipeline implementation allowing users to quickly test and develop pipelines. JIST is DRMAA-compliant allowing it to run on portable batch system grids. However, as new processing methods are implemented and developed, memory may often be a bottleneck for not only lab computers, but also possibly some local grids. Integrating JIST with the AWS cloud alleviates these possible restrictions and does not require users to have deep knowledge of programming in Java. Workflow definition/management and cloud configurations are two key challenges in this research. Using a simple unified control panel, users have the ability to set the numbers of nodes and select from a variety of pre-configured AWS EC2 nodes with different numbers of processors and memory storage. Intuitively, we configured Amazon S3 storage to be mounted by pay-for-use Amazon EC2 instances. Hence, S3 storage is recognized as a shared cloud resource. The Amazon EC2 instances provide pre-installs of all necessary packages to run JIST. This work presents an implementation that facilitates the integration of JIST with AWS. We describe the theoretical cost/benefit formulae to decide between local serial execution versus cloud computing and apply this analysis to an empirical diffusion tensor imaging pipeline.
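
    The cost/benefit comparison mentioned above reduces to weighing serial local wall-clock time against parallel cloud time and its dollar cost. The sketch below is a generic model of that trade-off, not the authors' formulae: every rate and job count is an invented placeholder.

    ```python
    def compare_costs(jobs, local_hours_per_job, cloud_hours_per_job,
                      nodes, node_rate_usd):
        """Serial local execution vs. parallel pay-per-hour cloud execution."""
        local_time = jobs * local_hours_per_job            # serial wall-clock hours
        cloud_time = jobs * cloud_hours_per_job / nodes    # parallel wall-clock hours
        cloud_cost = cloud_time * nodes * node_rate_usd
        return {"local_time_h": local_time,
                "cloud_time_h": round(cloud_time, 1),
                "cloud_cost_usd": round(cloud_cost, 2)}

    # Invented pipeline: 500 jobs, 30 min each locally, 36 min each on a cloud node
    print(compare_costs(jobs=500, local_hours_per_job=0.5,
                        cloud_hours_per_job=0.6, nodes=20, node_rate_usd=0.30))
    ```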

  2. CE-ACCE: The Cloud Enabled Advanced sCience Compute Environment

    NASA Astrophysics Data System (ADS)

    Cinquini, L.; Freeborn, D. J.; Hardman, S. H.; Wong, C.

    2017-12-01

    Traditionally, Earth Science data from NASA remote sensing instruments has been processed by building custom data processing pipelines (often based on a common workflow engine or framework) which are typically deployed and run on an internal cluster of computing resources. This approach has some intrinsic limitations: it requires each mission to develop and deploy a custom software package on top of the adopted framework; it makes use of dedicated hardware, network and storage resources, which must be specifically purchased, maintained and re-purposed at mission completion; and computing services cannot be scaled on demand beyond the capability of the available servers. More recently, the rise of Cloud computing, coupled with other advances in containerization technology (most prominently, Docker) and micro-services architecture, has enabled a new paradigm, whereby space mission data can be processed through standard system architectures, which can be seamlessly deployed and scaled on demand on either on-premise clusters, or commercial Cloud providers. In this talk, we will present one such architecture named CE-ACCE ("Cloud Enabled Advanced sCience Compute Environment"), which we have been developing at the NASA Jet Propulsion Laboratory over the past year. CE-ACCE is based on the Apache OODT ("Object Oriented Data Technology") suite of services for full data lifecycle management, which are turned into a composable array of Docker images, and complemented by a plug-in model for mission-specific customization. We have applied this infrastructure to both flying and upcoming NASA missions, such as ECOSTRESS and SMAP, and demonstrated deployment on the Amazon Cloud, either using simple EC2 instances, or advanced AWS services such as Amazon Lambda and ECS (EC2 Container Services).

  3. Performance management of high performance computing for medical image processing in Amazon Web Services

    NASA Astrophysics Data System (ADS)

    Bao, Shunxing; Damon, Stephen M.; Landman, Bennett A.; Gokhale, Aniruddha

    2016-03-01

    Adopting high performance cloud computing for medical image processing is a popular trend given the pressing needs of large studies. Amazon Web Services (AWS) provide reliable, on-demand, and inexpensive cloud computing services. Our research objective is to implement an affordable, scalable and easy-to-use AWS framework for the Java Image Science Toolkit (JIST). JIST is a plugin for Medical-Image Processing, Analysis, and Visualization (MIPAV) that provides a graphical pipeline implementation allowing users to quickly test and develop pipelines. JIST is DRMAA-compliant allowing it to run on portable batch system grids. However, as new processing methods are implemented and developed, memory may often be a bottleneck for not only lab computers, but also possibly some local grids. Integrating JIST with the AWS cloud alleviates these possible restrictions and does not require users to have deep knowledge of programming in Java. Workflow definition/management and cloud configurations are two key challenges in this research. Using a simple unified control panel, users have the ability to set the numbers of nodes and select from a variety of pre-configured AWS EC2 nodes with different numbers of processors and memory storage. Intuitively, we configured Amazon S3 storage to be mounted by pay-for-use Amazon EC2 instances. Hence, S3 storage is recognized as a shared cloud resource. The Amazon EC2 instances provide pre-installs of all necessary packages to run JIST. This work presents an implementation that facilitates the integration of JIST with AWS. We describe the theoretical cost/benefit formulae to decide between local serial execution versus cloud computing and apply this analysis to an empirical diffusion tensor imaging pipeline.

  4. Performance Management of High Performance Computing for Medical Image Processing in Amazon Web Services

    PubMed Central

    Bao, Shunxing; Damon, Stephen M.; Landman, Bennett A.; Gokhale, Aniruddha

    2016-01-01

    Adopting high performance cloud computing for medical image processing is a popular trend given the pressing needs of large studies. Amazon Web Services (AWS) provide reliable, on-demand, and inexpensive cloud computing services. Our research objective is to implement an affordable, scalable and easy-to-use AWS framework for the Java Image Science Toolkit (JIST). JIST is a plugin for Medical-Image Processing, Analysis, and Visualization (MIPAV) that provides a graphical pipeline implementation allowing users to quickly test and develop pipelines. JIST is DRMAA-compliant allowing it to run on portable batch system grids. However, as new processing methods are implemented and developed, memory may often be a bottleneck for not only lab computers, but also possibly some local grids. Integrating JIST with the AWS cloud alleviates these possible restrictions and does not require users to have deep knowledge of programming in Java. Workflow definition/management and cloud configurations are two key challenges in this research. Using a simple unified control panel, users have the ability to set the numbers of nodes and select from a variety of pre-configured AWS EC2 nodes with different numbers of processors and memory storage. Intuitively, we configured Amazon S3 storage to be mounted by pay-for-use Amazon EC2 instances. Hence, S3 storage is recognized as a shared cloud resource. The Amazon EC2 instances provide pre-installs of all necessary packages to run JIST. This work presents an implementation that facilitates the integration of JIST with AWS. We describe the theoretical cost/benefit formulae to decide between local serial execution versus cloud computing and apply this analysis to an empirical diffusion tensor imaging pipeline. PMID:27127335

  5. Avoiding Local Optima with Interactive Evolutionary Robotics

    DTIC Science & Technology

    2012-07-09

    The main bottleneck in evolutionary robotics has traditionally been the time required to evolve robot controllers. However, with the continued acceleration in computational resources, the ... Placing the target object at the top of a flight of stairs selects for climbing; suspending the robot and the target object above the ground and creating rungs between the two will ...

  6. Unfolding of a ClC chloride transporter retains memory of its evolutionary history.

    PubMed

    Min, Duyoung; Jefferson, Robert E; Qi, Yifei; Wang, Jing Yang; Arbing, Mark A; Im, Wonpil; Bowie, James U

    2018-05-01

    ClC chloride channels and transporters are important for chloride homeostasis in species from bacteria to human. Mutations in ClC proteins cause genetically inherited diseases, some of which are likely to involve folding defects. The ClC proteins present a challenging and unusual biological folding problem because they are large membrane proteins possessing a complex architecture, with many reentrant helices that go only partway through the membrane and loop back out. Here we were able to examine the unfolding of the Escherichia coli ClC transporter, ClC-ec1, using single-molecule forced unfolding methods. We found that the protein could be separated into two stable halves that unfolded independently. The independence of the two domains is consistent with an evolutionary model in which the two halves arose from independently folding subunits that later fused together. Maintaining smaller folding domains of lesser complexity within large membrane proteins may be an advantageous strategy to avoid misfolding traps.

  7. Optimality and stability of symmetric evolutionary games with applications in genetic selection.

    PubMed

    Huang, Yuanyuan; Hao, Yiping; Wang, Min; Zhou, Wen; Wu, Zhijun

    2015-06-01

    Symmetric evolutionary games, i.e., evolutionary games with symmetric fitness matrices, have important applications in population genetics, where they can be used to model for example the selection and evolution of the genotypes of a given population. In this paper, we review the theory for obtaining optimal and stable strategies for symmetric evolutionary games, and provide some new proofs and computational methods. In particular, we review the relationship between the symmetric evolutionary game and the generalized knapsack problem, and discuss the first and second order necessary and sufficient conditions that can be derived from this relationship for testing the optimality and stability of the strategies. Some of the conditions are given in different forms from those in previous work and can be verified more efficiently. We also derive more efficient computational methods for the evaluation of the conditions than conventional approaches. We demonstrate how these conditions can be applied to justifying the strategies and their stabilities for a special class of genetic selection games including some in the study of genetic disorders.
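
    A compact way to see the optimization connection reviewed above: for a symmetric fitness matrix, a classical result says discrete-time replicator dynamics do not decrease mean fitness, so iterating them climbs toward a locally optimal, stable strategy. The sketch below illustrates this; the 3x3 fitness matrix is invented.

    ```python
    def replicator_step(x, A):
        """x_i <- x_i * (A x)_i / (x' A x), the discrete replicator update."""
        fit = [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(x))]
        mean_fit = sum(xi * fi for xi, fi in zip(x, fit))
        return [xi * fi / mean_fit for xi, fi in zip(x, fit)]

    A = [[1.0, 0.6, 0.2],      # symmetric fitness matrix (e.g., genotype viabilities)
         [0.6, 1.2, 0.5],
         [0.2, 0.5, 0.9]]
    x = [1 / 3] * 3            # start from the uniform mixed strategy
    for _ in range(200):
        x = replicator_step(x, A)
    print([round(v, 3) for v in x])   # an equilibrium strategy of the game
    ```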

  8. Operating Dedicated Data Centers - Is It Cost-Effective?

    NASA Astrophysics Data System (ADS)

    Ernst, M.; Hogue, R.; Hollowell, C.; Strecker-Kellog, W.; Wong, A.; Zaytsev, A.

    2014-06-01

    The advent of cloud computing centres such as Amazon's EC2 and Google's Computing Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.

  9. Computed tomography scan in supine and prone positions: an alternative method to detect intramural gas in emphysematous cystitis and to evaluate efficacy after adjuvant continuous intravesical irrigation treatment.

    PubMed

    Cortés-González, Jeff R; Ortiz-Lara, Gerardo E; Salinas, Matías; Hernández-Galván, Fernando; Gómez-Guerra, Lauro S

    2013-04-01

    To evaluate the efficacy of continuous intravesical irrigation with saline plus amikacin as adjuvant therapy and to evaluate the computed tomography (CT) scan in supine and prone positions (CystoCT scan) as an alternative diagnostic and evaluation method of intramural gas in emphysematous cystitis (EC) before and after treatment. Consecutive patients with a diagnosis of EC who were hospitalized between March 2006 and January 2011 were investigated. The diagnosis was made by CystoCT scan. Treatment consisted of intravenous antibiotics, control of concomitant diseases, and placement of a 3-way urinary catheter for continuous irrigation of 500 mg of amikacin diluted in 1 l of saline given on days 0, 3, and 7. Treatment was considered successful when there was an absence of gas in the bladder wall, the urine culture was negative, there was clinical improvement, and there was an absence of toxicity. Eleven patients were hospitalized with a diagnosis of EC during the study period. Four were excluded from the study, 2 due to the lack of confirmation of the diagnosis with the CystoCT scan. Treatment was successful in all patients; for 6 (86%) this was achieved in 3 days and for 1 (14%) in 7 days. No toxicity was reported. Continuous intravesical irrigation with saline plus amikacin as adjuvant treatment of EC is an inexpensive, effective, and safe tool that might help conventional treatment and provide a rapid recovery. The CystoCT scan is an alternative method to diagnose and evaluate intramural gas in EC patients. These findings should be challenged in a randomized, multi-centre, placebo-controlled clinical trial.

  10. Evolutionary inference via the Poisson Indel Process.

    PubMed

    Bouchard-Côté, Alexandre; Jordan, Michael I

    2013-01-22

    We address the problem of the joint statistical inference of phylogenetic trees and multiple sequence alignments from unaligned molecular sequences. This problem is generally formulated in terms of string-valued evolutionary processes along the branches of a phylogenetic tree. The classic evolutionary process, the TKF91 model [Thorne JL, Kishino H, Felsenstein J (1991) J Mol Evol 33(2):114-124] is a continuous-time Markov chain model composed of insertion, deletion, and substitution events. Unfortunately, this model gives rise to an intractable computational problem: The computation of the marginal likelihood under the TKF91 model is exponential in the number of taxa. In this work, we present a stochastic process, the Poisson Indel Process (PIP), in which the complexity of this computation is reduced to linear. The Poisson Indel Process is closely related to the TKF91 model, differing only in its treatment of insertions, but it has a global characterization as a Poisson process on the phylogeny. Standard results for Poisson processes allow key computations to be decoupled, which yields the favorable computational profile of inference under the PIP model. We present illustrative experiments in which Bayesian inference under the PIP model is compared with separate inference of phylogenies and alignments.
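
    The model's branch-level behavior can be caricatured with a toy forward simulation: existing characters survive a branch of length t with probability exp(-mu*t), while new characters arrive as a Poisson process with rate lam. This is intuition only, with invented rates; it omits substitutions and the PIP's tree-wide Poisson characterization.

    ```python
    import math
    import random

    def sample_poisson(rate):
        """Knuth's method for a Poisson random variate."""
        limit, k, p = math.exp(-rate), 0, 1.0
        while True:
            p *= random.random()
            if p <= limit:
                return k
            k += 1

    def evolve_sequence(seq, t, lam=0.5, mu=0.02):
        """Toy PIP-style branch: independent deletions, Poisson insertions."""
        survivors = [c for c in seq if random.random() < math.exp(-mu * t)]
        for _ in range(sample_poisson(lam * t)):
            survivors.insert(random.randrange(len(survivors) + 1),
                             random.choice("ACGT"))
        return "".join(survivors)

    random.seed(3)
    print(evolve_sequence("ACGTACGTACGT", t=10.0))
    ```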

  11. Evolutionary inference via the Poisson Indel Process

    PubMed Central

    Bouchard-Côté, Alexandre; Jordan, Michael I.

    2013-01-01

    We address the problem of the joint statistical inference of phylogenetic trees and multiple sequence alignments from unaligned molecular sequences. This problem is generally formulated in terms of string-valued evolutionary processes along the branches of a phylogenetic tree. The classic evolutionary process, the TKF91 model [Thorne JL, Kishino H, Felsenstein J (1991) J Mol Evol 33(2):114–124] is a continuous-time Markov chain model composed of insertion, deletion, and substitution events. Unfortunately, this model gives rise to an intractable computational problem: The computation of the marginal likelihood under the TKF91 model is exponential in the number of taxa. In this work, we present a stochastic process, the Poisson Indel Process (PIP), in which the complexity of this computation is reduced to linear. The Poisson Indel Process is closely related to the TKF91 model, differing only in its treatment of insertions, but it has a global characterization as a Poisson process on the phylogeny. Standard results for Poisson processes allow key computations to be decoupled, which yields the favorable computational profile of inference under the PIP model. We present illustrative experiments in which Bayesian inference under the PIP model is compared with separate inference of phylogenies and alignments. PMID:23275296

  12. VizieR Online Data Catalog: Comparison of evolutionary tracks (Martins+, 2013)

    NASA Astrophysics Data System (ADS)

    Martins, F.; Palacios, A.

    2013-11-01

    Tables of evolutionary models for massive stars. The files m*_stol.dat correspond to models computed with the code STAREVOL. The files m*_mesa.dat correspond to models computed with the code MESA. For each code, models with initial masses equal to 7, 9, 15, 20, 25, 40 and 60M⊙ are provided. No rotation is included. The overshooting parameter f is equal to 0.01. The metallicity is solar. (14 data files).

  13. Differential diagnosis of CT focal liver lesions using texture features, feature selection and ensemble driven classifiers.

    PubMed

    Mougiakakou, Stavroula G; Valavanis, Ioannis K; Nikita, Alexandra; Nikita, Konstantina S

    2007-09-01

    The aim of the present study is to define an optimally performing computer-aided diagnosis (CAD) architecture for the classification of liver tissue from non-enhanced computed tomography (CT) images into normal liver (C1), hepatic cyst (C2), hemangioma (C3), and hepatocellular carcinoma (C4). To this end, various CAD architectures, based on texture features and ensembles of classifiers (ECs), are comparatively assessed. A number of regions of interest (ROIs) corresponding to C1-C4 were defined by experienced radiologists in non-enhanced liver CT images. For each ROI, five distinct sets of texture features were extracted using first order statistics, spatial gray level dependence matrix, gray level difference method, Laws' texture energy measures, and fractal dimension measurements. Two different ECs were constructed and compared. The first one consists of five multilayer perceptron neural networks (NNs), each using as input one of the computed texture feature sets or its reduced version after genetic algorithm-based feature selection. The second EC comprised five different primary classifiers, namely one multilayer perceptron NN, one probabilistic NN, and three k-nearest neighbor classifiers, each fed with the combination of the five texture feature sets or their reduced versions. The final decision of each EC was extracted by using appropriate voting schemes, while bootstrap re-sampling was utilized in order to estimate the generalization ability of the CAD architectures based on the available relatively small-sized data set. The best mean classification accuracy (84.96%) was achieved by the second EC using a fused feature set, and the weighted voting scheme. The fused feature set was obtained after appropriate feature selection applied to specific subsets of the original feature set. The comparative assessment of the various CAD architectures shows that combining three types of classifiers with a voting scheme, fed with identical feature sets obtained after appropriate feature selection and fusion, may result in an accurate system able to assist differential diagnosis of focal liver lesions from non-enhanced CT images.
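
    The fusion step that produced the best architecture is a weighted vote: each primary classifier's label counts with a weight, for instance its validation accuracy. A minimal sketch with invented predictions and weights:

    ```python
    from collections import defaultdict

    def weighted_vote(predictions, weights):
        """Return the class whose supporters carry the most total weight."""
        tally = defaultdict(float)
        for label, w in zip(predictions, weights):
            tally[label] += w
        return max(tally, key=tally.get)

    preds = ["C3", "C3", "C2", "C3", "C2"]       # five classifiers, one ROI
    weights = [0.81, 0.78, 0.84, 0.75, 0.80]     # e.g., validation accuracies
    print(weighted_vote(preds, weights))         # -> 'C3'
    ```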

  14. NexGen PVAs: Incorporating Eco-Evolutionary Processes into Population Viability Models

    EPA Science Inventory

    We examine how the integration of evolutionary and ecological processes in population dynamics – an emerging framework in ecology – could be incorporated into population viability analysis (PVA). Driven by parallel, complementary advances in population genomics and computational ...

  15. Merging molecular mechanism and evolution: theory and computation at the interface of biophysics and evolutionary population genetics

    PubMed Central

    Serohijos, Adrian W.R.; Shakhnovich, Eugene I.

    2014-01-01

    The variation among sequences and structures in nature is both determined by physical laws and by evolutionary history. However, these two factors are traditionally investigated by disciplines with different emphasis and philosophy—molecular biophysics on one hand and evolutionary population genetics in another. Here, we review recent theoretical and computational approaches that address the critical need to integrate these two disciplines. We first articulate the elements of these integrated approaches. Then, we survey their contribution to our mechanistic understanding of molecular evolution, the polymorphisms in coding region, the distribution of fitness effects (DFE) of mutations, the observed folding stability of proteins in nature, and the distribution of protein folds in genomes. PMID:24952216

  16. Merging molecular mechanism and evolution: theory and computation at the interface of biophysics and evolutionary population genetics.

    PubMed

    Serohijos, Adrian W R; Shakhnovich, Eugene I

    2014-06-01

    The variation among sequences and structures in nature is determined both by physical laws and by evolutionary history. However, these two factors are traditionally investigated by disciplines with different emphases and philosophies: molecular biophysics on the one hand and evolutionary population genetics on the other. Here, we review recent theoretical and computational approaches that address the crucial need to integrate these two disciplines. We first articulate the elements of these approaches. Then, we survey their contribution to our mechanistic understanding of molecular evolution, polymorphisms in coding regions, the distribution of fitness effects (DFE) of mutations, the observed folding stability of proteins in nature, and the distribution of protein folds in genomes. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Evolutionary trends in directional hearing

    PubMed Central

    Carr, Catherine E.; Christensen-Dalsgaard, Jakob

    2016-01-01

    Tympanic hearing is a true evolutionary novelty that arose in parallel within early tetrapods. We propose that in these tetrapods, selection for sound localization in air acted upon pre-existing directionally sensitive brainstem circuits, similar to those in fishes. Auditory circuits in birds and lizards resemble this ancestral, directionally sensitive framework. Despite this anatomical similarity, coding of sound source location differs between birds and lizards. In birds, brainstem circuits compute sound location from interaural cues. Lizards, however, have coupled ears and do not need to compute source location in the brain. Thus, their neural processing of sound direction differs, although all show mechanisms for enhancing sound source directionality. Comparisons with mammals reveal similarly complex interactions between coding strategies and evolutionary history. PMID:27448850

  18. Genomicus 2018: karyotype evolutionary trees and on-the-fly synteny computing.

    PubMed

    Nguyen, Nga Thi Thuy; Vincens, Pierre; Roest Crollius, Hugues; Louis, Alexandra

    2018-01-04

    Since 2010, the Genomicus web server has been available online at http://genomicus.biologie.ens.fr/genomicus. This graphical browser provides access to comparative genomic analyses in four different phyla (Vertebrates, Plants, Fungi, and non-vertebrate Metazoans). Users can analyse genomic information from extant species, as well as ancestral gene content and gene order for vertebrates and flowering plants, in an integrated evolutionary context. New analyses and visualization tools have recently been implemented in Genomicus Vertebrate. Karyotype structures from several genomes can now be compared along an evolutionary pathway (Multi-KaryotypeView), and synteny blocks can be computed and visualized between any two genomes (PhylDiagView). © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  19. Phylogenetic tree and community structure from a Tangled Nature model.

    PubMed

    Canko, Osman; Taşkın, Ferhat; Argın, Kamil

    2015-10-07

    In evolutionary biology, the taxonomy and origination of species are widely studied subjects. The evolutionary tree can be estimated from available DNA sequence data using well-known and frequently applied methods such as maximum likelihood and neighbor-joining. In order to examine the results of these methods, an evolutionary tree is generated computationally from a mathematical model called Tangled Nature. A relatively small genome space is investigated due to the computational burden, and it is found that the actual and predicted trees are in reasonably good agreement in terms of shape. Moreover, the speciation and the resulting community structure of the food web are investigated by modularity. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Computer-Aided Construction at Designing Reinforced Concrete Columns as Per Ec

    NASA Astrophysics Data System (ADS)

    Zielińska, M.; Grębowski, K.

    2015-02-01

    The article presents the authors' computer program for designing and dimensioning columns in reinforced concrete structures, taking into account phenomena affecting their behaviour and design information as per EC. The program was developed in the C++ programming language. It guides the user through the particular dimensioning stages: from introducing basic data such as dimensions, concrete class, reinforcing steel class, and forces acting on the column; through calculating the creep coefficient, taking into account the impact of imperfections depending on the support scheme, the number of mating members at load shift, and the buckling length; to generating the interaction curve graph. The final result of the calculations provides two dependence points computed by the methods of nominal stiffness and nominal curvature. The location of those points relative to the limit curve determines whether the column load capacity is assured or has been exceeded. The study describes in detail the operation of the program, as well as the methodology and phenomena that are indispensable when designing axially and eccentrically compressed members of reinforced concrete structures as per the European standards.

  1. Use of the self-organizing feature map to diagnose abnormal engineering change

    NASA Astrophysics Data System (ADS)

    Lu, Ruei-Shan; Wu, Zhi-Ting; Peng, Kuo-Wei; Yu, Tai-Yi

    2015-07-01

    This study established identification procedures based on the self-organizing feature map (SOM) to monitor Engineering Change (EC), using historical data from a company that specializes in computers and peripherals. The product life cycle of this company is 3-6 months. The historical data were divided into three parts, each covering four months. The first part, comprising 2,343 records from January to April (the training period), constitutes the Control Group. The second and third parts constitute Experimental Groups (EG) 1 and 2, respectively. For EG 1 and 2, the success rate of recognizing abnormal ECs was approximately 96% and 95%, respectively. This paper demonstrates the importance of, and screening procedures for, abnormal engineering changes at a particular company specializing in computers and peripherals.
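
    A minimal sketch of SOM-based anomaly screening, assuming the third-party MiniSom package; the data, map size, and 95th-percentile threshold are invented for illustration and are not the paper's settings:

        # Train a SOM on "normal" EC records; records far from their
        # best-matching unit are flagged as candidate abnormal changes.
        import numpy as np
        from minisom import MiniSom

        rng = np.random.default_rng(1)
        train = rng.normal(size=(2343, 8))    # stand-in for Jan-Apr EC records
        test = rng.normal(size=(500, 8))      # stand-in for a later period

        som = MiniSom(10, 10, input_len=8, sigma=1.0, learning_rate=0.5)
        som.train_random(train, num_iteration=5000)

        weights = som.get_weights()
        def qe(x):                            # per-record quantization error
            i, j = som.winner(x)
            return np.linalg.norm(x - weights[i, j])

        threshold = np.quantile([qe(x) for x in train], 0.95)
        flags = [qe(x) > threshold for x in test]
        print(f"flagged {sum(flags)} of {len(test)} records as abnormal")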

  2. High-frequency CAD-based scattering model: SERMAT

    NASA Astrophysics Data System (ADS)

    Goupil, D.; Boutillier, M.

    1991-09-01

    Specifications for an industrial radar cross section (RCS) calculation code are given: it must be able to exchange data with many computer-aided design (CAD) systems, it must be fast, and it must have powerful graphic tools. Classical physical optics (PO) and equivalent currents (EC) techniques have long proven their efficiency on simple objects. Difficult geometric problems occur when objects with very complex shapes have to be computed; only a specific geometric code can solve them. We have established that, once these problems have been solved: (1) PO and EC give good results on complex objects that are large compared to the wavelength; and (2) the implementation of these techniques in a software package (SERMAT) allows RCS calculations fast and precise enough to meet industry requirements in the domain of stealth.

  3. Scalable computing for evolutionary genomics.

    PubMed

    Prins, Pjotr; Belhachemi, Dominique; Möller, Steffen; Smant, Geert

    2012-01-01

    Genomic data analysis in evolutionary biology is becoming so computationally intensive that analysis of multiple hypotheses and scenarios takes too long on a single desktop computer. In this chapter, we discuss techniques for scaling computations through parallelization of calculations, after giving a quick overview of advanced programming techniques. Unfortunately, parallel programming is difficult and requires special software design. The alternative, especially attractive for legacy software, is to introduce poor man's parallelization by running whole programs in parallel as separate processes, using job schedulers. Such pipelines are often deployed on bioinformatics computer clusters. Recent advances in PC virtualization have made it possible to run a full computer operating system, with all of its installed software, on top of another operating system, inside a "box," or virtual machine (VM). Such a VM can flexibly be deployed on multiple computers, in a local network, e.g., on existing desktop PCs, and even in the Cloud, to create a "virtual" computer cluster. Many bioinformatics applications in evolutionary biology can be run in parallel, running processes in one or more VMs. Here, we show how a ready-made bioinformatics VM image, named BioNode, effectively creates a computing cluster, and pipeline, in a few steps. This allows researchers to scale up computations from their desktop, using available hardware, anytime it is required. BioNode is based on Debian Linux and can run on networked PCs and in the Cloud. Over 200 bioinformatics and statistical software packages of interest to evolutionary biology are included, such as PAML, Muscle, MAFFT, MrBayes, and BLAST. Most of these software packages are maintained through the Debian Med project. In addition, BioNode contains convenient configuration scripts for parallelizing bioinformatics software. Where Debian Med encourages packaging free and open source bioinformatics software through one central project, BioNode encourages creating free and open source VM images, for multiple targets, through one central project. BioNode can be deployed on Windows, OSX, Linux, and in the Cloud. Alongside the downloadable BioNode images, we provide tutorials online, which empower bioinformaticians to install and run BioNode in different environments, as well as information for future initiatives on creating and building such images.
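
    A minimal sketch of the "poor man's parallelization" pattern described above: run an existing command-line tool on many inputs as independent OS processes. The example assumes a `blastp` binary on the PATH; the file names and flags are illustrative only:

        # Launch whole external programs in parallel as separate processes,
        # the same pattern job schedulers and BioNode pipelines build on.
        import subprocess
        from concurrent.futures import ProcessPoolExecutor

        def run_one(fasta_file):
            # Each worker starts the entire external program as its own process.
            cmd = ["blastp", "-query", fasta_file, "-db", "nr",
                   "-out", fasta_file + ".out"]   # flags illustrative only
            return subprocess.run(cmd, check=False).returncode

        inputs = [f"seqs_{i:03d}.fasta" for i in range(32)]
        with ProcessPoolExecutor(max_workers=8) as pool:
            codes = list(pool.map(run_one, inputs))
        print(codes)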

  4. Towards a continuous glucose monitoring system using tunable quantum cascade lasers

    NASA Astrophysics Data System (ADS)

    Haase, Katharina; Müller, Niklas; Petrich, Wolfgang

    2018-02-01

    We present a reagent-free approach for long-term continuous glucose monitoring (CGM) of liquid samples using mid-infrared absorption spectroscopy. This method could constitute an alternative to enzymatic glucose sensors for managing the widespread disease of diabetes. To acquire spectra of the liquid specimen, we use a spectrally tunable external-cavity (EC) quantum cascade laser (QCL) as the radiation source in combination with a fiber-based in vitro sensor setup. With this setup we achieve a glucose sensitivity in pure glucose solutions of 3 mg/dL (RMSEP). Furthermore, the spectral tunability of the EC-QCL enables us to discriminate glucose from other molecules. We exemplify this by detecting glucose among other saccharides with an accuracy of 8 mg/dL (within other monosaccharides, RMSECV) and 14 mg/dL (within other mono- and disaccharides, RMSECV). Moreover, using an evolutionary algorithm, we characterize the significance of each wavenumber for an accurate prediction of glucose among other saccharides. We show that by picking 10 distinct wavenumbers we can achieve accuracies comparable to those obtained with the complete spectrum.
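
    A minimal sketch of evolutionary wavenumber selection in the spirit of the last result, using numpy and scikit-learn; the synthetic spectra, fitness definition, and GA settings are invented for illustration and are not the authors' algorithm:

        # Each individual is a bit mask over the spectrum; fitness is
        # cross-validated prediction quality using only selected wavenumbers.
        import numpy as np
        from sklearn.linear_model import Ridge
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(2)
        X = rng.normal(size=(120, 200))       # 200 wavenumbers, synthetic spectra
        y = X[:, [20, 55, 130]] @ np.array([1.0, -0.5, 0.8]) + rng.normal(0, .1, 120)

        def fitness(mask):
            if mask.sum() == 0:
                return -np.inf
            score = cross_val_score(Ridge(), X[:, mask], y, cv=3,
                                    scoring="neg_mean_squared_error").mean()
            return score - 0.01 * mask.sum()  # penalize large subsets

        pop = rng.random((30, 200)) < 0.05    # start near ~10 wavenumbers each
        for gen in range(40):
            scores = np.array([fitness(m) for m in pop])
            parents = pop[np.argsort(scores)[-10:]]       # truncation selection
            children = parents[rng.integers(0, 10, 30)].copy()
            flip = rng.random(children.shape) < 0.01      # bit-flip mutation
            pop = np.where(flip, ~children, children)
        best = pop[np.argmax([fitness(m) for m in pop])]
        print("selected wavenumbers:", np.flatnonzero(best))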

  5. Improvement of the Performance of an Electrocoagulation Process System Using Fuzzy Control of pH.

    PubMed

    Demirci, Yavuz; Pekel, Lutfiye Canan; Altinten, Ayla; Alpbaz, Mustafa

    2015-12-01

    The removal efficiencies of electrocoagulation (EC) systems are highly dependent on the initial value of pH. If an EC system has an acidic influent, the pH of the effluent increases during the treatment process; conversely, if such a system has an alkaline influent, the pH of the effluent decreases during the treatment process. Thus, changes in the pH of the wastewater affect the efficiency of the EC process. In this study, we investigated the dynamic effects of pH. To evaluate approaches for preventing increases in the pH of the system, the MATLAB/Simulink program was used to develop and evaluate an on-line computer-based system for pH control. The aim of this work was to study Proportional-Integral-Derivative (PID) control and fuzzy control of the pH of a real textile wastewater purification process using EC. The performances and dynamic behaviors of these two control systems were evaluated based on determinations of COD, colour, and turbidity removal efficiencies.
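
    A toy sketch of the PID half of the control problem described above (the study also evaluated fuzzy control, which is not sketched); the one-line "plant" model, gains, and drift rate are invented and bear no relation to the real EC reactor:

        # Discrete PID loop holding effluent pH at a setpoint while the
        # treatment process itself drives the pH upward each step.
        import numpy as np

        Kp, Ki, Kd, dt = 2.0, 0.4, 0.05, 1.0  # illustrative gains, 1 s steps
        setpoint, ph = 7.0, 5.5               # acidic influent, pH drifts up
        integral, prev_err = 0.0, setpoint - ph

        for step in range(60):
            err = setpoint - ph
            integral += err * dt
            u = Kp * err + Ki * integral + Kd * (err - prev_err) / dt
            prev_err = err
            # toy plant: pH rises during treatment; the control action
            # (acid/base dosing) pushes it back toward the setpoint
            ph += 0.05 + 0.1 * np.clip(u, -1, 1)

        print(f"final pH ≈ {ph:.2f}")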

  6. Multiplexed HTS rf SQUID magnetometer array for eddy current testing of aircraft rivet joints

    NASA Astrophysics Data System (ADS)

    Gärtner, S.; Krause, H.-J.; Wolters, N.; Lomparski, D.; Wolf, W.; Schubert, J.; Kreutzbruck, M. v.; Allweins, K.

    2002-05-01

    Using three rf SQUID magnetometers, a multiplexed SQUID array was implemented. The SQUIDs are positioned in line with 7 mm spacing and operated using one feedback electronics unit with sequential readout demodulation at different radio frequencies (rf). The cross-talk between SQUID channels was determined to be negligible. To show the performance of the SQUID array, eddy current (EC) measurements of aluminum aircraft samples in conjunction with a differential (double-D) EC excitation and lock-in readout were carried out. With computer-controlled continuous switching of the SQUIDs during the scan, three EC signal traces of the sample are obtained simultaneously. We performed measurements with an EC excitation frequency of 135 Hz to localize an artificial crack (sawcut flaw) of 20 mm length in an aluminum sheet with 0.6 mm thickness. The flaw was still detected when covered with aluminum of up to 10 mm thickness. In addition, measurements with varying angles between scanning direction and flaw orientation are presented.

  7. Precise on-machine extraction of the surface normal vector using an eddy current sensor array

    NASA Astrophysics Data System (ADS)

    Wang, Yongqing; Lian, Meng; Liu, Haibo; Ying, Yangwei; Sheng, Xianjun

    2016-11-01

    To satisfy the requirements of on-machine measurement of the surface normal during complex surface manufacturing, a highly robust normal vector extraction method using an eddy current (EC) displacement sensor array is developed, the output of which is almost unaffected by surface brightness, machining coolant, and environmental noise. A precise normal vector extraction model based on a triangularly distributed EC sensor array is first established. Calibration of the effects of object surface inclination and coupling interference on the measurement results, and of the relative positions of the EC sensors, is included. A novel apparatus employing three EC sensors and a force transducer was designed, which can be easily integrated into a computer numerical control (CNC) machine tool spindle and/or a robot end effector. Finally, to test the validity and practicability of the proposed method, typical experiments were conducted on specified test pieces, such as an inclined plane and cylindrical and spherical surfaces, using the developed approach and system.
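
    A minimal sketch of the geometric core of such a method, not the paper's calibrated model: three EC displacement sensors at known in-plane positions each measure a stand-off distance, and the normal of the plane through the three contact points is a cross product. The sensor layout and readings below are invented:

        import numpy as np

        # Assumed sensor positions in the probe frame (equilateral triangle, mm)
        r = 20.0
        angles = np.deg2rad([90, 210, 330])
        xy = np.c_[r * np.cos(angles), r * np.sin(angles)]

        d = np.array([5.12, 5.40, 5.31])      # measured stand-off distances (mm)
        pts = np.c_[xy, -d]                   # contact points below the probe

        n = np.cross(pts[1] - pts[0], pts[2] - pts[0])
        n /= np.linalg.norm(n)
        if n[2] < 0:                          # orient the normal toward the probe
            n = -n
        print("surface normal:", n.round(4))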

  8. Li+ solvation and kinetics of Li+-BF4-/PF6- ion pairs in ethylene carbonate. A molecular dynamics study with classical rate theories

    NASA Astrophysics Data System (ADS)

    Chang, Tsun-Mei; Dang, Liem X.

    2017-10-01

    Using our polarizable force-field models and employing classical rate theories of chemical reactions, we examine the ethylene carbonate (EC) exchange process between the first and second solvation shells around Li+ and the dissociation kinetics of the ion pairs Li+-[BF4] and Li+-[PF6] in this solvent. We calculate the exchange rates using transition state theory and correct them with transmission coefficients computed by the reactive flux method; the Impey, Madden, and McDonald approach; and Grote-Hynes theory. We found that the residence times of EC around Li+ ions varied from 60 to 450 ps, depending on the correction method used. The relaxation times changed significantly from Li+-[BF4] to Li+-[PF6] ion pairs in EC. Our results also show that, in addition to affecting the free energy of dissociation in EC, the anion type significantly influences the dissociation kinetics of ion pairing.

  9. Design and control of a prosthetic leg for above-knee amputees operated in semi-active and active modes

    NASA Astrophysics Data System (ADS)

    Park, Jinhyuk; Yoon, Gun-Ha; Kang, Je-Won; Choi, Seung-Bok

    2016-08-01

    This paper proposes a new prosthesis operated in two different modes: the semi-active and active modes. The semi-active mode is achieved with a flow-mode magneto-rheological (MR) damper, while the active mode is obtained from an electronically commutated (EC) motor. The knee joint of the above-knee prosthesis is equipped with both the MR damper and the EC motor. The MR damper generates reaction force by controlling the field-dependent yield stress of the MR fluid, while the EC motor actively controls the knee joint angle during the gait cycle. In this work, the MR damper is designed as a two-end-type flow-mode mechanism without an air chamber, for compactness. On the other hand, in order to predict the desired knee joint angle to be controlled by the EC motor, a polynomial prediction function based on a statistical method is used. A nonlinear proportional-derivative controller integrated with the computed torque method is then designed and applied to both the MR damper and the EC motor to control the knee joint angle. It is demonstrated that the desired knee joint angle is well achieved at different walking velocities on the ground.

  10. MEGA-CC: computing core of molecular evolutionary genetics analysis program for automated and iterative data analysis.

    PubMed

    Kumar, Sudhir; Stecher, Glen; Peterson, Daniel; Tamura, Koichiro

    2012-10-15

    There is a growing need in the research community to apply the molecular evolutionary genetics analysis (MEGA) software tool for batch processing a large number of datasets and to integrate it into analysis workflows. Therefore, we now make available the computing core of the MEGA software as a stand-alone executable (MEGA-CC), along with an analysis prototyper (MEGA-Proto). MEGA-CC provides users with access to all the computational analyses available through MEGA's graphical user interface version. This includes methods for multiple sequence alignment, substitution model selection, evolutionary distance estimation, phylogeny inference, substitution rate and pattern estimation, tests of natural selection and ancestral sequence inference. Additionally, we have upgraded the source code for phylogenetic analysis using the maximum likelihood methods for parallel execution on multiple processors and cores. Here, we describe MEGA-CC and outline the steps for using MEGA-CC in tandem with MEGA-Proto for iterative and automated data analysis. Availability: http://www.megasoftware.net/.

  11. Cogeneration Technology Alternatives Study (CTAS). Volume 6: Computer data. Part 1: Coal-fired nocogeneration process boiler, section A

    NASA Technical Reports Server (NTRS)

    Knightly, W. F.

    1980-01-01

    Various advanced energy conversion systems (ECS) are compared with each other and with current technology systems for their savings in fuel energy, costs, and emissions in individual plants and on a national level. About fifty industrial processes from the largest energy-consuming sectors were used as a basis for matching a similar number of energy conversion systems considered candidates for availability in the 1985-2000 time period. The sectors considered included food, textiles, lumber, paper, chemicals, petroleum, glass, and primary metals. The energy conversion systems included steam and gas turbines, diesels, thermionics, Stirling engines, closed-cycle and steam-injected gas turbines, and fuel cells. Fuels considered were coal, coal- and petroleum-based residual and distillate liquid fuels, and low-Btu gas obtained through the on-site gasification of coal. Computer-generated reports of the fuel consumption and savings, capital costs, economics, and emissions of the cogeneration energy conversion systems (ECSs) heat-and-power matched to the individual industrial processes are presented for coal-fired process boilers. National fuel and emissions savings are also reported for each ECS assuming it alone is implemented.

  12. Novel approach for computing photosynthetically active radiation for productivity modeling using remotely sensed images in the Great Plains, United States

    USGS Publications Warehouse

    Singh, Ramesh K.; Liu, Shu-Guang; Tieszen, Larry L.; Suyker, Andrew E.; Verma, Shashi B.

    2012-01-01

    Gross primary production (GPP) is a key indicator of ecosystem performance, and helps in many decision-making processes related to the environment. We used the eddy covariance-light use efficiency (EC-LUE) model for estimating GPP in the Great Plains, United States, in order to evaluate the performance of this model. We developed a novel algorithm for computing the photosynthetically active radiation (PAR) based on net radiation. A strong correlation (R2 = 0.94, N = 24) was found between daily PAR and Landsat-based mid-day instantaneous net radiation. Though the Moderate Resolution Imaging Spectroradiometer (MODIS) based instantaneous net radiation was in better agreement (R2 = 0.98, N = 24) with the daily measured PAR, there was no statistically significant difference between Landsat-based and MODIS-based PAR. The EC-LUE model validation also confirms the need to consider biological attributes (C3 versus C4 plants) for potential light use efficiency. A universal potential light use efficiency is unable to capture the spatial variation of GPP. It is necessary to use a C3-versus-C4-based land use/land cover map when applying the EC-LUE model to estimate the spatiotemporal distribution of GPP.
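
    A minimal sketch of the empirical regression step described above, with synthetic numbers standing in for the 24 daily observations; only numpy is used, and the slope, units, and noise level are invented:

        # Regress daily PAR on mid-day instantaneous net radiation and
        # report the coefficient of determination R^2.
        import numpy as np

        rng = np.random.default_rng(3)
        rn = rng.uniform(300, 700, 24)        # net radiation, W m^-2 (synthetic)
        par = 0.045 * rn + rng.normal(0, 1.0, 24)   # daily PAR (synthetic)

        slope, intercept = np.polyfit(rn, par, 1)
        pred = slope * rn + intercept
        r2 = 1 - np.sum((par - pred) ** 2) / np.sum((par - par.mean()) ** 2)
        print(f"PAR = {slope:.4f}*Rn + {intercept:.3f},  R^2 = {r2:.3f}")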

  13. Hybrid Pluggable Processing Pipeline (HyP3): A cloud-based infrastructure for generic processing of SAR data

    NASA Astrophysics Data System (ADS)

    Hogenson, K.; Arko, S. A.; Buechler, B.; Hogenson, R.; Herrmann, J.; Geiger, A.

    2016-12-01

    A problem often faced by Earth science researchers is how to scale algorithms that were developed against few datasets and take them to regional or global scales. One significant hurdle can be the processing and storage resources available for such a task, not to mention the administration of those resources. As a processing environment, the cloud offers nearly unlimited potential for compute and storage, with limited administration required. The goal of the Hybrid Pluggable Processing Pipeline (HyP3) project was to demonstrate the utility of the Amazon cloud to process large amounts of data quickly and cost effectively, while remaining generic enough to incorporate new algorithms with limited administration time or expense. Principally built by three undergraduate students at the ASF DAAC, the HyP3 system relies on core Amazon services such as Lambda, the Simple Notification Service (SNS), Relational Database Service (RDS), Elastic Compute Cloud (EC2), Simple Storage Service (S3), and Elastic Beanstalk. The HyP3 user interface was written using Elastic Beanstalk, and the system uses SNS and Lambda to handle creating, instantiating, executing, and terminating EC2 instances automatically. Data are sent to S3 for delivery to customers and removed using standard data lifecycle management rules. In HyP3 all data processing is ephemeral; there are no persistent processes taking compute and storage resources or generating added cost. When complete, HyP3 will leverage the automatic scaling up and down of EC2 compute power to respond to event-driven demand surges correlated with natural disasters or reprocessing efforts. Massive simultaneous processing within EC2 will be able to match the demand spike in ways conventional physical computing power never could, and then tail off, incurring no costs when not needed. This presentation will focus on the development techniques and technologies that were used in developing the HyP3 system. Data and process flow will be shown, highlighting the benefits of the cloud for each step. Finally, the steps for integrating a new processing algorithm will be demonstrated. This is the true power of HyP3: allowing people to upload their own algorithms and execute them at archive-level scales.
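
    A minimal sketch (not ASF's code) of the launch-then-terminate pattern that HyP3 automates, using the boto3 SDK; the region, AMI ID, and instance type are placeholders:

        import boto3

        ec2 = boto3.client("ec2", region_name="us-west-2")

        # Launch a worker for one processing job...
        resp = ec2.run_instances(
            ImageId="ami-0123456789abcdef0",  # placeholder AMI with the algorithm
            InstanceType="c5.xlarge",
            MinCount=1,
            MaxCount=1,
        )
        instance_id = resp["Instances"][0]["InstanceId"]

        # ...and terminate it once the product is delivered to S3, so no
        # persistent process keeps consuming resources or incurring cost.
        ec2.terminate_instances(InstanceIds=[instance_id])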

  14. A finite element analysis modeling tool for solid oxide fuel cell development: coupled electrochemistry, thermal and flow analysis in MARC®

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khaleel, Mohammad A.; Lin, Zijing; Singh, Prabhakar

    2004-05-03

    A 3D simulation tool for modeling solid oxide fuel cells is described. The tool combines the versatility and efficiency of a commercial finite element analysis code, MARC®, with an in-house developed robust and flexible electrochemical (EC) module. Based upon characteristic parameters obtained experimentally and assigned by the user, the EC module calculates the current density distribution, heat generation, and fuel and oxidant species concentration, taking the temperature profile provided by MARC® and operating conditions such as the fuel and oxidant flow rate and the total stack output voltage or current as the input. MARC® performs flow and thermal analyses based on the initial and boundary thermal and flow conditions and the heat generation calculated by the EC module. The main coupling between MARC® and EC is for MARC® to supply the temperature field to EC and for EC to give the heat generation profile to MARC®. The loosely coupled, iterative scheme is advantageous in terms of memory requirement, numerical stability and computational efficiency. The coupling is iterated to self-consistency for a steady-state solution. Sample results for steady states as well as the startup process for stacks with different flow designs are presented to illustrate the modeling capability and numerical performance characteristic of the simulation tool.
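
    A toy sketch of the loosely coupled, iterate-to-self-consistency scheme: alternate a thermal solve (stand-in for MARC®) with an electrochemistry update until neither field changes. Both "solvers" below are invented scalar functions, not the real modules:

        def thermal_solve(heat_gen):
            # toy: stack temperature rises with heat generation
            return 1000.0 + 0.05 * heat_gen

        def ec_module(temperature):
            # toy: heat generation falls as temperature rises
            return 400.0 - 0.2 * (temperature - 1000.0)

        T, q = 1000.0, 300.0                  # initial temperature, heat source
        for it in range(100):
            T_new = thermal_solve(q)
            q_new = ec_module(T_new)
            if abs(T_new - T) < 1e-6 and abs(q_new - q) < 1e-6:
                break                         # self-consistent steady state
            T, q = T_new, q_new
        print(f"converged after {it} iterations: T={T:.3f}, q={q:.3f}")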

  15. On joint subtree distributions under two evolutionary models.

    PubMed

    Wu, Taoyang; Choi, Kwok Pui

    2016-04-01

    In population and evolutionary biology, hypotheses about micro-evolutionary and macro-evolutionary processes are commonly tested by comparing the shape indices of empirical evolutionary trees with those predicted by neutral models. A key ingredient in this approach is the ability to compute and quantify distributions of various tree shape indices under random models of interest. As a step to meet this challenge, in this paper we investigate the joint distribution of cherries and pitchforks (that is, subtrees with two and three leaves) under two widely used null models: the Yule-Harding-Kingman (YHK) model and the proportional to distinguishable arrangements (PDA) model. Based on two novel recursive formulae, we propose a dynamic approach to numerically compute the exact joint distribution (and hence the marginal distributions) for trees of any size. We also obtained insights into the statistical properties of trees generated under these two models, including a constant correlation between the cherry and the pitchfork distributions under the YHK model, and the log-concavity and unimodality of the cherry distributions under both models. In addition, we show that there exists a unique change point for the cherry distributions between these two models. Copyright © 2015 Elsevier Inc. All rights reserved.
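
    A minimal Monte Carlo cross-check on the YHK (Yule) model in the spirit of the paper, not its exact recursions: grow random trees by splitting a uniformly chosen leaf, then count cherries (2-leaf subtrees) and pitchforks (3-leaf subtrees). Tree size and replicate count are arbitrary:

        import random
        from collections import Counter

        def yule_tree(n_leaves, rng):
            tree = {0: []}                    # node -> list of children
            leaves, nxt = [0], 1
            while len(leaves) < n_leaves:
                leaf = leaves.pop(rng.randrange(len(leaves)))
                tree[leaf] = [nxt, nxt + 1]   # split a uniformly chosen leaf
                tree[nxt], tree[nxt + 1] = [], []
                leaves += [nxt, nxt + 1]
                nxt += 2
            return tree

        def subtree_sizes(tree, node=0):
            kids = tree[node]
            if not kids:
                return {node: 1}
            sizes = {}
            for k in kids:
                sizes.update(subtree_sizes(tree, k))
            sizes[node] = sum(sizes[k] for k in kids)
            return sizes

        rng = random.Random(4)
        counts = Counter()
        for _ in range(2000):
            sizes = subtree_sizes(yule_tree(20, rng))
            counts["cherries"] += sum(1 for s in sizes.values() if s == 2)
            counts["pitchforks"] += sum(1 for s in sizes.values() if s == 3)
        print({k: v / 2000 for k, v in counts.items()})   # mean counts per tree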

  16. Galaxy CloudMan: delivering cloud compute clusters.

    PubMed

    Afgan, Enis; Baker, Dannon; Coraor, Nate; Chapman, Brad; Nekrutenko, Anton; Taylor, James

    2010-12-21

    Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is "cloud computing", which, in principle, offers on demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate "as is" use by experimental biologists. We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon's EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. The expected knowledge and associated effort with deploying a compute cluster in the Amazon EC2 cloud is not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge.

  17. Evolutionary Computing Methods for Spectral Retrieval

    NASA Technical Reports Server (NTRS)

    Terrile, Richard; Fink, Wolfgang; Huntsberger, Terrance; Lee, Seugwon; Tisdale, Edwin; VonAllmen, Paul; Tinetti, Geivanna

    2009-01-01

    A methodology for processing spectral images to retrieve information on underlying physical, chemical, and/or biological phenomena is based on evolutionary and related computational methods implemented in software. In a typical case, the solution (the information that one seeks to retrieve) consists of parameters of a mathematical model that represents one or more of the phenomena of interest. The methodology was developed for the initial purpose of retrieving the desired information from spectral image data acquired by remote-sensing instruments aimed at planets (including the Earth). Examples of information desired in such applications include trace gas concentrations, temperature profiles, surface types, day/night fractions, cloud/aerosol fractions, seasons, and viewing angles. The methodology is also potentially useful for retrieving information on chemical and/or biological hazards in terrestrial settings. In this methodology, one utilizes an iterative process that minimizes a fitness function indicative of the degree of dissimilarity between observed and synthetic spectral and angular data. The evolutionary computing methods that lie at the heart of this process yield a population of solutions (sets of the desired parameters) within an accuracy represented by a fitness-function value specified by the user. The evolutionary computing methods (ECM) used in this methodology are Genetic Algorithms and Simulated Annealing, both of which are well-established optimization techniques and have also been described in previous NASA Tech Briefs articles. These are embedded in a conceptual framework, represented in the architecture of the implementing software, that enables automatic retrieval of spectral and angular data and analysis of the retrieved solutions for uniqueness.
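
    A minimal sketch of the simulated-annealing half of such a retrieval loop: minimize a fitness function measuring dissimilarity between observed and synthetic spectra over the model parameters. The forward model, cooling schedule, and parameters below are toys, not the NASA software:

        import numpy as np

        rng = np.random.default_rng(5)
        wavelengths = np.linspace(1.0, 2.0, 100)

        def forward_model(params):            # toy synthetic spectrum
            depth, center, width = params
            return 1.0 - depth * np.exp(-((wavelengths - center) / width) ** 2)

        observed = forward_model([0.3, 1.4, 0.1]) + rng.normal(0, 0.005, 100)

        def fitness(params):                  # dissimilarity to be minimized
            return np.sum((observed - forward_model(params)) ** 2)

        x = np.array([0.5, 1.5, 0.2])         # initial parameter guess
        f, T = fitness(x), 1.0
        for step in range(5000):
            cand = x + rng.normal(0, 0.02, 3) * T
            cand[2] = max(cand[2], 1e-3)      # keep the line width physical
            fc = fitness(cand)
            if fc < f or rng.random() < np.exp(-(fc - f) / T):
                x, f = cand, fc               # accept better (or sometimes worse)
            T *= 0.999                        # geometric cooling schedule
        print("retrieved parameters:", x.round(3))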

  18. PhyloDet: a scalable visualization tool for mapping multiple traits to large evolutionary trees

    PubMed Central

    Lee, Bongshin; Nachmanson, Lev; Robertson, George; Carlson, Jonathan M.; Heckerman, David

    2009-01-01

    Summary: Evolutionary biologists are often interested in finding correlations among biological traits across a number of species, as such correlations may lead to testable hypotheses about the underlying function. Because some species are more closely related than others, computing and visualizing these correlations must be done in the context of the evolutionary tree that relates species. In this note, we introduce PhyloDet (short for PhyloDetective), an evolutionary tree visualization tool that enables biologists to visualize multiple traits mapped to the tree. Availability: http://research.microsoft.com/cue/phylodet/ Contact: bongshin@microsoft.com. PMID:19633096

  19. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    PubMed

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  20. Benchmarking Undedicated Cloud Computing Providers for Analysis of Genomic Datasets

    PubMed Central

    Yazar, Seyhan; Gooden, George E. C.; Mackey, David A.; Hewitt, Alex W.

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5–78.2) for E.coli and 53.5% (95% CI: 34.4–72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5–303.1) and 173.9% (95% CI: 134.6–213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE. PMID:25247298

  1. Network-level architecture and the evolutionary potential of underground metabolism.

    PubMed

    Notebaart, Richard A; Szappanos, Balázs; Kintses, Bálint; Pál, Ferenc; Györkei, Ádám; Bogos, Balázs; Lázár, Viktória; Spohn, Réka; Csörgő, Bálint; Wagner, Allon; Ruppin, Eytan; Pál, Csaba; Papp, Balázs

    2014-08-12

    A central unresolved issue in evolutionary biology is how metabolic innovations emerge. Low-level enzymatic side activities are frequent and can potentially be recruited for new biochemical functions. However, the role of such underground reactions in adaptation toward novel environments has remained largely unknown and out of reach of computational predictions, not least because these issues demand analyses at the level of the entire metabolic network. Here, we provide a comprehensive computational model of the underground metabolism in Escherichia coli. Most underground reactions are not isolated and 45% of them can be fully wired into the existing network and form novel pathways that produce key precursors for cell growth. This observation allowed us to conduct an integrated genome-wide in silico and experimental survey to characterize the evolutionary potential of E. coli to adapt to hundreds of nutrient conditions. We revealed that underground reactions allow growth in new environments when their activity is increased. We estimate that at least ∼20% of the underground reactions that can be connected to the existing network confer a fitness advantage under specific environments. Moreover, our results demonstrate that the genetic basis of evolutionary adaptations via underground metabolism is computationally predictable. The approach used here has potential for various application areas from bioengineering to medical genetics.

  2. Toward a web-based real-time radiation treatment planning system in a cloud computing environment.

    PubMed

    Na, Yong Hum; Suh, Tae-Suk; Kapp, Daniel S; Xing, Lei

    2013-09-21

    To exploit the potential dosimetric advantages of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), an in-depth approach is required to provide efficient computing methods. This needs to incorporate clinically related organ specific constraints, Monte Carlo (MC) dose calculations, and large-scale plan optimization. This paper describes our first steps toward a web-based real-time radiation treatment planning system in a cloud computing environment (CCE). The Amazon Elastic Compute Cloud (EC2) with a master node (named m2.xlarge containing 17.1 GB of memory, two virtual cores with 3.25 EC2 Compute Units each, 420 GB of instance storage, 64-bit platform) is used as the backbone of cloud computing for dose calculation and plan optimization. The master node is able to scale the workers on an 'on-demand' basis. MC dose calculation is employed to generate accurate beamlet dose kernels by parallel tasks. The intensity modulation optimization uses total-variation regularization (TVR) and generates piecewise constant fluence maps for each initial beam direction in a distributed manner over the CCE. The optimized fluence maps are segmented into deliverable apertures. The shape of each aperture is iteratively rectified to be a sequence of arcs using the manufacturer's constraints. The output plan file from the EC2 is sent to the simple storage service. Three de-identified clinical cancer treatment plans have been studied for evaluating the performance of the new planning platform with 6 MV flattening filter free beams (40 × 40 cm(2)) from the Varian TrueBeam(TM) STx linear accelerator. A CCE leads to speed-ups of up to 14-fold for both dose kernel calculations and plan optimizations in the head and neck, lung, and prostate cancer cases considered in this study. The proposed system relies on a CCE that is able to provide an infrastructure for parallel and distributed computing. The resultant plans from the cloud computing are identical to PC-based IMRT and VMAT plans, confirming the reliability of the cloud computing platform. This cloud computing infrastructure has been established for radiation treatment planning. It substantially improves the speed of inverse planning and makes future on-treatment adaptive re-planning possible.
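
    A toy 1-D sketch of the TVR idea named above: a least-squares dose fit plus a total-variation penalty that drives the fluence map piecewise constant, solved here by plain (sub)gradient descent. The dose-influence matrix, penalty weight, and step size are all invented:

        import numpy as np

        rng = np.random.default_rng(6)
        A = rng.random((80, 40))              # toy dose-influence matrix
        target = A @ np.repeat([0.0, 1.0, 0.4, 0.0], 10)  # piecewise-constant truth

        lam, step = 0.5, 1e-3
        x = np.zeros(40)
        for it in range(5000):
            grad = A.T @ (A @ x - target)     # gradient of the data-fit term
            tv_sub = np.sign(np.diff(x))      # subgradient of sum|x_{i+1} - x_i|
            grad[:-1] -= lam * tv_sub
            grad[1:] += lam * tv_sub
            x = np.clip(x - step * grad, 0, None)   # fluence must be non-negative
        print(np.round(x, 2))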

  3. Toward a web-based real-time radiation treatment planning system in a cloud computing environment

    NASA Astrophysics Data System (ADS)

    Hum Na, Yong; Suh, Tae-Suk; Kapp, Daniel S.; Xing, Lei

    2013-09-01

    To exploit the potential dosimetric advantages of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), an in-depth approach is required to provide efficient computing methods. This needs to incorporate clinically related organ specific constraints, Monte Carlo (MC) dose calculations, and large-scale plan optimization. This paper describes our first steps toward a web-based real-time radiation treatment planning system in a cloud computing environment (CCE). The Amazon Elastic Compute Cloud (EC2) with a master node (named m2.xlarge containing 17.1 GB of memory, two virtual cores with 3.25 EC2 Compute Units each, 420 GB of instance storage, 64-bit platform) is used as the backbone of cloud computing for dose calculation and plan optimization. The master node is able to scale the workers on an 'on-demand' basis. MC dose calculation is employed to generate accurate beamlet dose kernels by parallel tasks. The intensity modulation optimization uses total-variation regularization (TVR) and generates piecewise constant fluence maps for each initial beam direction in a distributed manner over the CCE. The optimized fluence maps are segmented into deliverable apertures. The shape of each aperture is iteratively rectified to be a sequence of arcs using the manufacturer's constraints. The output plan file from the EC2 is sent to the simple storage service. Three de-identified clinical cancer treatment plans have been studied for evaluating the performance of the new planning platform with 6 MV flattening filter free beams (40 × 40 cm²) from the Varian TrueBeam™ STx linear accelerator. A CCE leads to speed-ups of up to 14-fold for both dose kernel calculations and plan optimizations in the head and neck, lung, and prostate cancer cases considered in this study. The proposed system relies on a CCE that is able to provide an infrastructure for parallel and distributed computing. The resultant plans from the cloud computing are identical to PC-based IMRT and VMAT plans, confirming the reliability of the cloud computing platform. This cloud computing infrastructure has been established for radiation treatment planning. It substantially improves the speed of inverse planning and makes future on-treatment adaptive re-planning possible.

  4. DomSign: a top-down annotation pipeline to enlarge enzyme space in the protein universe.

    PubMed

    Wang, Tianmin; Mori, Hiroshi; Zhang, Chong; Kurokawa, Ken; Xing, Xin-Hui; Yamada, Takuji

    2015-03-21

    Computational predictions of catalytic function are vital for in-depth understanding of enzymes. Because several novel approaches performing better than the common BLAST tool are rarely applied in research, we hypothesized that there is a large gap between the number of known annotated enzymes and the actual number in the protein universe, which significantly limits our ability to extract additional biologically relevant functional information from the available sequencing data. To reliably expand the enzyme space, we developed DomSign, a highly accurate domain signature-based enzyme functional prediction tool to assign Enzyme Commission (EC) digits. DomSign is a top-down prediction engine that yields results comparable, or superior, to those from many benchmark EC number prediction tools, including BLASTP, when a homolog with an identity >30% is not available in the database. Performance tests showed that DomSign is a highly reliable enzyme EC number annotation tool. After multiple tests, the accuracy is thought to be greater than 90%. Thus, DomSign can be applied to large-scale datasets, with the goal of expanding the enzyme space with high fidelity. Using DomSign, we successfully increased the percentage of EC-tagged enzymes from 12% to 30% in UniProt-TrEMBL. In the Kyoto Encyclopedia of Genes and Genomes bacterial database, the percentage of EC-tagged enzymes for each bacterial genome could be increased from 26.0% to 33.2% on average. Metagenomic mining was also efficient, as exemplified by the application of DomSign to the Human Microbiome Project dataset, recovering nearly one million new EC-labeled enzymes. Our results offer preliminary confirmation of the existence of the hypothesized huge number of "hidden enzymes" in the protein universe, the identification of which could substantially further our understanding of the metabolisms of diverse organisms and also facilitate bioengineering by providing a richer enzyme resource. Furthermore, our results highlight the necessity of using more advanced computational tools than BLAST in protein database annotations to extract additional biologically relevant functional information from the available biological sequences.

  5. An active learning approach for rapid characterization of endothelial cells in human tumors.

    PubMed

    Padmanabhan, Raghav K; Somasundar, Vinay H; Griffith, Sandra D; Zhu, Jianliang; Samoyedny, Drew; Tan, Kay See; Hu, Jiahao; Liao, Xuejun; Carin, Lawrence; Yoon, Sam S; Flaherty, Keith T; Dipaola, Robert S; Heitjan, Daniel F; Lal, Priti; Feldman, Michael D; Roysam, Badrinath; Lee, William M F

    2014-01-01

    Currently, no available pathological or molecular measures of tumor angiogenesis predict response to antiangiogenic therapies used in clinical practice. Recognizing that tumor endothelial cells (EC) and EC activation and survival signaling are the direct targets of these therapies, we sought to develop an automated platform for quantifying activity of critical signaling pathways and other biological events in EC of patient tumors by histopathology. Computer image analysis of EC in highly heterogeneous human tumors by a statistical classifier trained using examples selected by human experts performed poorly due to subjectivity and selection bias. We hypothesized that the analysis can be optimized by a more active process to aid experts in identifying informative training examples. To test this hypothesis, we incorporated a novel active learning (AL) algorithm into FARSIGHT image analysis software that aids the expert by seeking out informative examples for the operator to label. The resulting FARSIGHT-AL system identified EC with specificity and sensitivity consistently greater than 0.9 and outperformed traditional supervised classification algorithms. The system modeled individual operator preferences and generated reproducible results. Using the results of EC classification, we also quantified proliferation (Ki67) and activity in important signal transduction pathways (MAP kinase, STAT3) in immunostained human clear cell renal cell carcinoma and other tumors. FARSIGHT-AL enables characterization of EC in conventionally preserved human tumors in a more automated process suitable for testing and validating in clinical trials. The results of our study support a unique opportunity for quantifying angiogenesis in a manner that can now be tested for its ability to identify novel predictive and response biomarkers.
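
    A minimal sketch of an active-learning loop in the uncertainty-sampling style (not FARSIGHT-AL's actual algorithm): the classifier repeatedly asks the "expert" to label the unlabeled example it is least sure about. Data, model, and budget are invented, with scikit-learn assumed:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(7)
        X = rng.normal(size=(1000, 10))
        true_w = rng.normal(size=10)
        y = (X @ true_w + rng.normal(0, 0.5, 1000) > 0).astype(int)  # oracle

        # Seed set with both classes represented, then a large unlabeled pool
        labeled = list(np.where(y == 0)[0][:5]) + list(np.where(y == 1)[0][:5])
        pool = [i for i in range(1000) if i not in labeled]
        clf = LogisticRegression()

        for round_ in range(30):
            clf.fit(X[labeled], y[labeled])
            proba = clf.predict_proba(X[pool])[:, 1]
            query = pool[int(np.argmin(np.abs(proba - 0.5)))]  # most uncertain
            labeled.append(query)             # "expert" labels this example
            pool.remove(query)

        print("final accuracy:", clf.score(X, y))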

  6. CRITTERS! A Realistic Simulation for Teaching Evolutionary Biology

    ERIC Educational Resources Information Center

    Latham, Luke G., II; Scully, Erik P.

    2008-01-01

    Evolutionary processes can be studied in nature and in the laboratory, but time and financial constraints result in few opportunities for undergraduate and high school students to explore the agents of genetic change in populations. One alternative to time-consuming and expensive teaching laboratories is the use of computer simulations. We…

  7. Speeding Up Ecological and Evolutionary Computations in R; Essentials of High Performance Computing for Biologists

    PubMed Central

    Visser, Marco D.; McMahon, Sean M.; Merow, Cory; Dixon, Philip M.; Record, Sydne; Jongejans, Eelke

    2015-01-01

    Computation has become a critical component of research in biology. A risk has emerged that computational and programming challenges may limit research scope, depth, and quality. We review various solutions to common computational efficiency problems in ecological and evolutionary research. Our review pulls together material that is currently scattered across many sources and emphasizes those techniques that are especially effective for typical ecological and environmental problems. We demonstrate how straightforward it can be to write efficient code and implement techniques such as profiling or parallel computing. We supply a newly developed R package (aprof) that helps to identify computational bottlenecks in R code and determine whether optimization can be effective. Our review is complemented by a practical set of examples and detailed Supporting Information material (S1–S3 Texts) that demonstrate large improvements in computational speed (ranging from 10.5 times to 14,000 times faster). By improving computational efficiency, biologists can feasibly solve more complex tasks, ask more ambitious questions, and include more sophisticated analyses in their research. PMID:25811842

  8. An Adaptive Evolutionary Algorithm for Traveling Salesman Problem with Precedence Constraints

    PubMed Central

    Sung, Jinmo; Jeong, Bongju

    2014-01-01

    The traveling salesman problem with precedence constraints is one of the most notorious problems in terms of the efficiency of its solution approaches, even though it has a very wide range of industrial applications. We propose a new evolutionary algorithm that efficiently obtains good solutions by improving the search process. Our genetic operators guarantee the feasibility of solutions over the generations of the population, which significantly improves the computational efficiency even when combined with our flexible adaptive searching strategy. The efficiency of the algorithm is investigated by computational experiments. PMID:24701158

  9. An adaptive evolutionary algorithm for traveling salesman problem with precedence constraints.

    PubMed

    Sung, Jinmo; Jeong, Bongju

    2014-01-01

    The traveling salesman problem with precedence constraints is one of the most notorious problems in terms of the efficiency of its solution approaches, even though it has a very wide range of industrial applications. We propose a new evolutionary algorithm that efficiently obtains good solutions by improving the search process. Our genetic operators guarantee the feasibility of solutions over the generations of the population, which significantly improves the computational efficiency even when combined with our flexible adaptive searching strategy. The efficiency of the algorithm is investigated by computational experiments.
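
    A minimal sketch of the feasibility-preserving idea (not the authors' operators): sample and mutate only tours that respect the precedence relation, so no repair step is ever needed. The instance, precedence pairs, and evolutionary loop below are invented:

        import random

        n = 8
        prec = {(0, 3), (1, 4), (2, 5), (4, 6)}   # (a, b): visit a before b
        rng = random.Random(8)
        dist = [[abs(i - j) + 1 for j in range(n)] for i in range(n)]

        def random_feasible_tour():
            done, tour = set(), []
            while len(tour) < n:
                ready = [v for v in range(n) if v not in done
                         and all(a in done for a, b in prec if b == v)]
                v = rng.choice(ready)             # only precedence-ready cities
                done.add(v); tour.append(v)
            return tour

        def feasible_swap(tour):
            # swap two positions only if every precedence pair still holds
            t = tour[:]
            i, j = sorted(rng.sample(range(n), 2))
            t[i], t[j] = t[j], t[i]
            pos = {v: k for k, v in enumerate(t)}
            return t if all(pos[a] < pos[b] for a, b in prec) else tour

        def length(t):
            return sum(dist[t[k]][t[k + 1]] for k in range(n - 1))

        pop = [random_feasible_tour() for _ in range(40)]
        for gen in range(200):
            pop.sort(key=length)                  # elitist truncation selection
            pop = pop[:20] + [feasible_swap(rng.choice(pop[:20])) for _ in range(20)]
        pop.sort(key=length)
        print("best tour:", pop[0], "length:", length(pop[0]))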

  10. Cancer Evolution: Mathematical Models and Computational Inference

    PubMed Central

    Beerenwinkel, Niko; Schwarz, Roland F.; Gerstung, Moritz; Markowetz, Florian

    2015-01-01

    Cancer is a somatic evolutionary process characterized by the accumulation of mutations, which contribute to tumor growth, clinical progression, immune escape, and drug resistance development. Evolutionary theory can be used to analyze the dynamics of tumor cell populations and to make inference about the evolutionary history of a tumor from molecular data. We review recent approaches to modeling the evolution of cancer, including population dynamics models of tumor initiation and progression, phylogenetic methods to model the evolutionary relationship between tumor subclones, and probabilistic graphical models to describe dependencies among mutations. Evolutionary modeling helps to understand how tumors arise and will also play an increasingly important prognostic role in predicting disease progression and the outcome of medical interventions, such as targeted therapy. PMID:25293804

  11. Mean-Potential Law in Evolutionary Games

    NASA Astrophysics Data System (ADS)

    Nałecz-Jawecki, Paweł; Miekisz, Jacek

    2018-01-01

    The Letter presents a novel way to connect random walks, stochastic differential equations, and evolutionary game theory. We introduce a new concept of a potential function for discrete-space stochastic systems, based on a correspondence between one-dimensional stochastic differential equations and random walks, which may be exact not only in the continuous limit but also in finite-state spaces. Our method is useful for the computation of fixation probabilities in discrete stochastic dynamical systems with two absorbing states. We apply it to evolutionary games, formulating two simple and intuitive criteria for evolutionary stability of pure Nash equilibria in finite populations. In particular, we show that the 1/3 law of evolutionary games, introduced by Nowak et al. [Nature, 2004], follows from a more general mean-potential law.
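
    A minimal sketch of the kind of quantity such potential methods compute analytically: the fixation probability of a single A-mutant in a frequency-dependent Moran process, here estimated by brute-force simulation of the embedded birth-death chain and compared against the neutral baseline 1/N. The payoff matrix and selection strength are illustrative (chosen so the interior equilibrium x* = (d-b)/(a-b-c+d) = 0.25 < 1/3, where the 1/3 law predicts selection favors the invader under weak selection):

        import numpy as np

        N, w = 50, 0.1                        # population size, selection strength
        a, b, c, d = 5.0, 1.0, 2.0, 2.0       # payoffs: A vs A/B, B vs A/B
        rng = np.random.default_rng(9)

        def one_run():
            i = 1                             # one A mutant among N-1 B players
            while 0 < i < N:
                fA = (a * (i - 1) + b * (N - i)) / (N - 1)
                fB = (c * i + d * (N - i - 1)) / (N - 1)
                wA, wB = 1 - w + w * fA, 1 - w + w * fB
                # transition rates of the Moran birth-death chain
                t_plus = (i * wA / (i * wA + (N - i) * wB)) * (N - i) / N
                t_minus = ((N - i) * wB / (i * wA + (N - i) * wB)) * i / N
                i += 1 if rng.random() < t_plus / (t_plus + t_minus) else -1
            return i == N                     # True if the mutant fixed

        runs = 20000
        rho = sum(one_run() for _ in range(runs)) / runs
        print(f"simulated fixation prob {rho:.4f} vs neutral 1/N = {1/N:.4f}")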

  12. Open Issues in Evolutionary Robotics.

    PubMed

    Silva, Fernando; Duarte, Miguel; Correia, Luís; Oliveira, Sancho Moura; Christensen, Anders Lyhne

    2016-01-01

    One of the long-term goals in evolutionary robotics is to be able to automatically synthesize controllers for real autonomous robots based only on a task specification. While a number of studies have shown the applicability of evolutionary robotics techniques for the synthesis of behavioral control, researchers have consistently been faced with a number of issues preventing the widespread adoption of evolutionary robotics for engineering purposes. In this article, we review and discuss the open issues in evolutionary robotics. First, we analyze the benefits and challenges of simulation-based evolution and subsequent deployment of controllers versus evolution on real robotic hardware. Second, we discuss specific evolutionary computation issues that have plagued evolutionary robotics: (1) the bootstrap problem, (2) deception, and (3) the role of genomic encoding and genotype-phenotype mapping in the evolution of controllers for complex tasks. Finally, we address the absence of standard research practices in the field. We also discuss promising avenues of research. Our underlying motivation is the reduction of the current gap between evolutionary robotics and mainstream robotics, and the establishment of evolutionary robotics as a canonical approach for the engineering of autonomous robots.

  13. Toward a unifying framework for evolutionary processes.

    PubMed

    Paixão, Tiago; Badkobeh, Golnaz; Barton, Nick; Çörüş, Doğan; Dang, Duc-Cuong; Friedrich, Tobias; Lehre, Per Kristian; Sudholt, Dirk; Sutton, Andrew M; Trubenová, Barbora

    2015-10-21

    The theories of population genetics and evolutionary computation have been evolving separately for nearly 30 years. Many results have been independently obtained in both fields and many others are unique to their respective field. We aim to bridge this gap by developing a unifying framework for evolutionary processes that allows both evolutionary algorithms and population genetics models to be cast in the same formal framework. The framework we present here decomposes the evolutionary process into its several components in order to facilitate the identification of similarities between different models. In particular, we propose a classification of evolutionary operators based on the defining properties of the different components. We cast several commonly used operators from both fields into this common framework. Using this, we map different evolutionary and genetic algorithms to different evolutionary regimes and identify candidates with the most potential for the translation of results between the fields. This provides a unified description of evolutionary processes and represents a stepping stone towards new tools and results for both fields. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Galaxy CloudMan: delivering cloud compute clusters

    PubMed Central

    2010-01-01

    Background Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is “cloud computing”, which, in principle, offers on demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate “as is” use by experimental biologists. Results We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon’s EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. Conclusions The expected knowledge and associated effort with deploying a compute cluster in the Amazon EC2 cloud is not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge. PMID:21210983

  15. Highly parameterized model calibration with cloud computing: an example of regional flow model calibration in northeast Alberta, Canada

    NASA Astrophysics Data System (ADS)

    Hayley, Kevin; Schumacher, J.; MacMillan, G. J.; Boutin, L. C.

    2014-05-01

    Expanding groundwater datasets collected by automated sensors, and improved groundwater databases, have caused a rapid increase in calibration data available for groundwater modeling projects. Improved methods of subsurface characterization have increased the need for model complexity to represent geological and hydrogeological interpretations. The larger calibration datasets and the need for meaningful predictive uncertainty analysis have both increased the degree of parameterization necessary during model calibration. Due to these competing demands, modern groundwater modeling efforts require a massive degree of parallelization in order to remain computationally tractable. A methodology for the calibration of highly parameterized, computationally expensive models using the Amazon EC2 cloud computing service is presented. The calibration of a regional-scale model of groundwater flow in Alberta, Canada, is provided as an example. The model covers a 30,865-km2 domain and includes 28 hydrostratigraphic units. Aquifer properties were calibrated to more than 1,500 static hydraulic head measurements and 10 years of measurements during industrial groundwater use. Three regionally extensive aquifers were parameterized (with spatially variable hydraulic conductivity fields), as was the areal recharge boundary condition, leading to 450 adjustable parameters in total. The PEST-based model calibration was parallelized on up to 250 computing nodes located on Amazon's EC2 servers.
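
    The parallelization pattern described here, many candidate parameter sets evaluated concurrently against observed heads, can be sketched in a few lines; the forward model below is a stand-in, not the actual groundwater model or the PEST interface.

      import numpy as np
      from multiprocessing import Pool

      rng = np.random.default_rng(0)
      observed = rng.normal(300.0, 5.0, size=1500)   # synthetic heads (m)

      def objective(params):
          # Stand-in forward model; in a real calibration each call is an
          # expensive groundwater-flow run farmed out to a compute node.
          simulated = params[0] + params[1] * np.linspace(0.0, 1.0, observed.size)
          return float(np.sum((simulated - observed) ** 2))

      if __name__ == "__main__":
          candidates = [rng.uniform(0.0, 400.0, size=2) for _ in range(250)]
          with Pool() as pool:                       # one worker per core
              scores = pool.map(objective, candidates)
          print(f"best sum of squared residuals: {min(scores):.1f}")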

  16. Characterising Complex Enzyme Reaction Data

    PubMed Central

    Rahman, Syed Asad; Thornton, Janet M.

    2016-01-01

    The relationship between enzyme-catalysed reactions and the Enzyme Commission (EC) number, the widely accepted classification scheme used to characterise enzyme activity, is complex and, with the rapid increase in our knowledge of the reactions catalysed by enzymes, needs revisiting. We present a manual and computational analysis to investigate this complexity and found that almost one-third of all known EC numbers are linked to more than one reaction in the secondary reaction databases (e.g., KEGG). Although this complexity is often resolved by defining generic, alternative and partial reactions, we have also found individual EC numbers with more than one reaction catalysing different types of bond changes. This analysis adds a new dimension to our understanding of enzyme function and might be useful for the accurate annotation of enzyme function and for studying the changes in enzyme function during evolution. PMID:26840640
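
    The one-to-many bookkeeping underlying this analysis is simple to illustrate; the sketch below uses invented reaction identifiers, not actual KEGG entries, to count EC numbers linked to more than one reaction.

      # Hypothetical EC-number-to-reaction mapping (IDs invented):
      ec_to_reactions = {
          "1.1.1.1": ["R00623", "R00754"],
          "2.7.1.1": ["R00299"],
          "3.5.1.5": ["R00131", "R00774", "R01086"],
          "4.2.1.2": ["R01082"],
      }

      multi = {ec: r for ec, r in ec_to_reactions.items() if len(r) > 1}
      share = len(multi) / len(ec_to_reactions)
      print(f"{share:.0%} of these EC numbers map to more than one reaction")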

  17. Anesthetic Binding in a Pentameric Ligand-Gated Ion Channel: GLIC

    PubMed Central

    Chen, Qiang; Cheng, Mary Hongying; Xu, Yan; Tang, Pei

    2010-01-01

    Cys-loop receptors are molecular targets of general anesthetics, but the knowledge of anesthetic binding to these proteins remains limited. Here we investigate anesthetic binding to the bacterial Gloeobacter violaceus pentameric ligand-gated ion channel (GLIC), a structural homolog of cys-loop receptors, using an experimental and computational hybrid approach. Tryptophan fluorescence quenching experiments showed halothane and thiopental binding at three tryptophan-associated sites in the extracellular (EC) domain, transmembrane (TM) domain, and EC-TM interface of GLIC. An additional binding site at the EC-TM interface was predicted by docking analysis and validated by quenching experiments on the N200W GLIC mutant. The binding affinities (KD) of 2.3 ± 0.1 mM and 0.10 ± 0.01 mM were derived from the fluorescence quenching data of halothane and thiopental, respectively. Docking these anesthetics to the original GLIC crystal structure and the structures relaxed by molecular dynamics simulations revealed intrasubunit sites for most halothane binding and intersubunit sites for thiopental binding. Tryptophans were within reach of both intra- and intersubunit binding sites. Multiple molecular dynamics simulations on GLIC in the presence of halothane at different sites suggested that anesthetic binding at the EC-TM interface disrupted the critical interactions for channel gating, altered motion of the TM23 linker, and destabilized the open-channel conformation that can lead to inhibition of GLIC channel current. The study has not only provided insights into anesthetic binding in GLIC, but also demonstrated a successful fusion of experiments and computations for understanding anesthetic actions in complex proteins. PMID:20858424
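
    For context, dissociation constants of this kind are typically extracted by fitting a single-site binding model to the quenching data (a standard form, assumed here; the abstract does not state the exact fitting function):

      \frac{F_0 - F}{F_0 - F_{\infty}} = \frac{[A]}{K_D + [A]}

    where F_0 is the fluorescence without anesthetic, F the fluorescence at anesthetic concentration [A], and F_infinity the fully quenched limit.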

  18. Comparison of the triple-point temperatures of ^20Ne, ^22Ne and normal Ne

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakano, T.; Tamura, O.; Nagao, K.

    2013-09-11

    At the National Metrology Institute of Japan (NMIJ), the triple points of ^20Ne and ^22Ne were realized using modular sealed cells, Ec3Ne20 and Ec8Ne22, made by the Istituto Nazionale di Ricerca Metrologica (INRiM) in Italy. The difference of the triple-point temperatures of ^20Ne and ^22Ne was estimated by using the sub-range of standard platinum resistance thermometers (SPRTs) calibrated by NMIJ on the International Temperature Scale of 1990 (ITS-90). The melting curves obtained with the Ec3Ne20 and Ec8Ne22 cells show narrow widths (0.1 mK) over a wide range of the inverse of the melted fraction (1/F) from 1/F=1 to 1/F=10. The liquidus point T_tp estimated by the melting curves from F∼0.5 to F∼0.85 using the Ec8Ne22 cell is 0.146 29 (4) K higher than that using the Ec3Ne20 cell, which is in good agreement with that observed by INRiM using the same cells. After correction of the effect of impurities and other isotopes for the Ec3Ne20 and Ec8Ne22 cells, the difference of T_tp between pure ^20Ne and pure ^22Ne is estimated to be 0.146 61 (4) K, which is consistent with the recent results reported elsewhere. The sub-ranges of SPRTs computed by using the triple point of ^20Ne or ^22Ne realized by the Ec3Ne20 cell or the Ec8Ne22 cell in place of the triple point of Ne for the defining fixed point of the ITS-90 are in good agreement with those realized on the basis of the ITS-90 at NMIJ within 0.03 mK, which is much smaller than the non-uniqueness and the sub-range inconsistency of SPRTs.
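
    Melting curves of this kind are commonly analysed with the first cryoscopic relation, which is linear in the inverse melted fraction 1/F (a standard assumption in fixed-point metrology, not a formula quoted from the abstract):

      T(F) = T_{\mathrm{pure}} - \frac{x_1}{A} \cdot \frac{1}{F}

    where x_1 is the impurity mole fraction and A the first cryoscopic constant; the liquidus point T_tp is the temperature read at 1/F = 1, and a narrow width over 1/F = 1 to 10 indicates a very pure sample.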

  19. Combined Quantum Chemical/Raman Spectroscopic Analyses of Li+ Cation Solvation: Cyclic Carbonate Solvents - Ethylene Carbonate and Propylene Carbonate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, Joshua L.; Borodin, Oleg; Seo, D. M.

    2014-12-01

    Combined computational/Raman spectroscopic analyses of ethylene carbonate (EC) and propylene carbonate (PC) solvation interactions with lithium salts are reported. It is proposed that previously reported Raman analyses of (EC)n-LiX mixtures have utilized faulty assumptions. In the present studies, density functional theory (DFT) calculations have provided corrections in terms of both the scaling factors for the solvent's Raman band intensity variations and information about band overlap. By accounting for these factors, the solvation numbers obtained from two different EC solvent bands are in excellent agreement with one another. The same analysis for PC, however, was found to be quite challenging. Commercially available PC is a racemic mixture of (S)- and (R)-PC isomers. Based upon the quantum chemistry calculations, each of these solvent isomers may exist as multiple conformers due to a low energy barrier for ring inversion, making deconvolution of the Raman bands daunting and inherently prone to significant error. Thus, Raman spectroscopy is able to accurately determine the extent of the EC...Li+ cation solvation interactions using the provided methodology, but a similar analysis of PC...Li+ cation solvation results in a significant underestimation of the actual solvation numbers.
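
    A common estimator for the apparent solvation number N from such band analyses takes the form below (a generic expression, assumed here; the paper's DFT-based procedure corrects the intensity scaling factor and accounts for band overlap):

      N = \frac{A_c / s}{A_c / s + A_f} \cdot \frac{n_{\mathrm{EC}}}{n_{\mathrm{Li^+}}}

    where A_c and A_f are the integrated areas of the coordinated and free solvent bands, s is the relative Raman scattering activity of the coordinated band, and n_EC/n_Li+ is the solvent-to-salt molar ratio.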

  20. Computation of direct and inverse mutations with the SEGM web server (Stochastic Evolution of Genetic Motifs): an application to splice sites of human genome introns.

    PubMed

    Benard, Emmanuel; Michel, Christian J

    2009-08-01

    We present here the SEGM web server (Stochastic Evolution of Genetic Motifs) for studying the evolution of genetic motifs both in the direct evolutionary sense (past-present) and in the inverse evolutionary sense (present-past). The genetic motifs studied can be nucleotides, dinucleotides and trinucleotides. As an example of an application of SEGM and to illustrate its functionalities, we give an analysis of inverse mutations of splice sites of human genome introns. SEGM is freely accessible at http://lsiit-bioinfo.u-strasbg.fr:8080/webMathematica/SEGM/SEGM.html directly or via the web site http://dpt-info.u-strasbg.fr/~michel/. To our knowledge, the SEGM web server is to date the only computational biology software to take this evolutionary approach.

  1. Detection of CCNE1/URI (19q12) amplification by in situ hybridisation is common in high grade and type II endometrial cancer

    PubMed Central

    Noske, Aurelia; Brandt, Simone; Valtcheva, Nadejda; Wagner, Ulrich; Zhong, Qing; Bellini, Elisa; Fink, Daniel; Obermann, Ellen C.; Moch, Holger; Wild, Peter J.

    2017-01-01

    One TCGA subgroup of endometrial cancer (EC) is characterised by extensive genomic DNA copy number alterations. CCNE1 located at 19q12 is frequently amplified in EC and a target for anti-cancer therapy. The relevance of URI, also located at 19q12, is unknown. To evaluate the prevalence of 19q12 (CCNE1/URI) in EC, we investigated different histologic types by in situ hybridisation (ISH) and copy number assay. We applied a previously established 19q12 ISH for the detection of CCNE1/URI copy numbers in EC (n = 270) using conventional bright field microscopy. In a subset (n = 21), 19q12 amplification status was validated by OncoScan assay. Manual ISH was controlled by a recently developed computational ISHProfiler algorithm. Associations of 19q12 status with Cyclin E1, URI and p53 expression, and clinico-pathological parameters were tested. Amplification of 19q12 (CCNE1/URI) was found in 10.4% (28/270) and was significantly associated with type II EC (high grade and non-endometrioid; p < 0.0001), advanced FIGO stage (p = 0.001), high Cyclin E1 expression (p = 0.008) and aberrant p53 expression (p = 0.04). 19q12 ISH data were confirmed by OncoScan and computational ISHProfiler techniques. The 19q12 in situ hybridisation is a feasible and robust biomarker assay in molecular pathology. Amplification of CCNE1/URI predominantly occurred in type II endometrial cancer. Prospective clinical trials are warranted to assess the utility of combined 19q12 amplification and Cyclin E1/URI protein expression analysis for the prediction of therapeutic response to chemotherapy and/or cyclin-dependent kinase inhibitors in patients with endometrial cancer. PMID:27582547

  2. Biology Needs Evolutionary Software Tools: Let’s Build Them Right

    PubMed Central

    Galaxy Team; Goecks, Jeremy; Taylor, James

    2018-01-01

    Research in population genetics and evolutionary biology has always provided a computational backbone for the life sciences as a whole. Today, evolutionary and population biology reasoning is essential for the interpretation of the large, complex datasets that are characteristic of all domains of today's life sciences, ranging from cancer biology to microbial ecology. This situation makes the algorithms and software tools developed by our community more important than ever before. It also means that we, the developers of software tools for molecular evolutionary analyses, have a shared responsibility to make these tools accessible using modern technological developments, as well as to provide adequate documentation and training. PMID:29688462

  3. Li+ solvation and kinetics of Li+–BF4-/PF6- ion pairs in ethylene carbonate. A molecular dynamics study with classical rate theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Tsun-Mei; Dang, Liem X.

    Using our polarizable force-field models and employing classical rate theories of chemical reactions, we examine in this paper the ethylene carbonate (EC) exchange process between the first and second solvation shells around Li+ and the dissociation kinetics of the ion pairs Li+–[BF4] and Li+–[PF6] in this solvent. We calculate the exchange rates using transition state theory and correct them with transmission coefficients computed by the reactive flux, Impey, Madden, and McDonald approaches, and Grote-Hynes theory. We found that the residence times of EC around Li+ ions varied from 60 to 450 ps, depending on the correction method used. We found that the relaxation times changed significantly from Li+–[BF4] to Li+–[PF6] ion pairs in EC. Finally, our results also show that, in addition to affecting the free energy of dissociation in EC, the anion type also significantly influences the dissociation kinetics of ion pairing.

  4. Li+ solvation and kinetics of Li+–BF4-/PF6- ion pairs in ethylene carbonate. A molecular dynamics study with classical rate theories

    DOE PAGES

    Chang, Tsun-Mei; Dang, Liem X.

    2017-07-19

    Using our polarizable force-field models and employing classical rate theories of chemical reactions, we examine in this paper the ethylene carbonate (EC) exchange process between the first and second solvation shells around Li+ and the dissociation kinetics of the ion pairs Li+–[BF4] and Li+–[PF6] in this solvent. We calculate the exchange rates using transition state theory and correct them with transmission coefficients computed by the reactive flux, Impey, Madden, and McDonald approaches, and Grote-Hynes theory. We found that the residence times of EC around Li+ ions varied from 60 to 450 ps, depending on the correction method used. We found that the relaxation times changed significantly from Li+–[BF4] to Li+–[PF6] ion pairs in EC. Finally, our results also show that, in addition to affecting the free energy of dissociation in EC, the anion type also significantly influences the dissociation kinetics of ion pairing.

  5. Evolving Better Cars: Teaching Evolution by Natural Selection with a Digital Inquiry Activity

    ERIC Educational Resources Information Center

    Royer, Anne M.; Schultheis, Elizabeth H.

    2014-01-01

    Evolutionary experiments are usually difficult to perform in the classroom because of the large sizes and long timescales of experiments testing evolutionary hypotheses. Computer applications give students a window to observe evolution in action, allowing them to gain comfort with the process of natural selection and facilitating inquiry…

  6. Memetic Algorithms, Domain Knowledge, and Financial Investing

    ERIC Educational Resources Information Center

    Du, Jie

    2012-01-01

    While the question of how to use human knowledge to guide evolutionary search is long-recognized, much remains to be done to answer this question adequately. This dissertation aims to further answer this question by exploring the role of domain knowledge in evolutionary computation as applied to real-world, complex problems, such as financial…

  7. Bipartite graphs as models of population structures in evolutionary multiplayer games.

    PubMed

    Peña, Jorge; Rochat, Yannick

    2012-01-01

    By combining evolutionary game theory and graph theory, "games on graphs" study the evolutionary dynamics of frequency-dependent selection in population structures modeled as geographical or social networks. Networks are usually represented by means of unipartite graphs, and social interactions by two-person games such as the famous prisoner's dilemma. Unipartite graphs have also been used for modeling interactions going beyond pairwise interactions. In this paper, we argue that bipartite graphs are a better alternative to unipartite graphs for describing population structures in evolutionary multiplayer games. To illustrate this point, we make use of bipartite graphs to investigate, by means of computer simulations, the evolution of cooperation under the conventional and the distributed N-person prisoner's dilemma. We show that several implicit assumptions arising from the standard approach based on unipartite graphs (such as the definition of replacement neighborhoods, the intertwining of individual and group diversity, and the large overlap of interaction neighborhoods) can have a large impact on the resulting evolutionary dynamics. Our work provides a clear example of the importance of construction procedures in games on graphs, of the suitability of bigraphs and hypergraphs for computational modeling, and of the importance of concepts from social network analysis such as centrality, centralization and bipartite clustering for the understanding of dynamical processes occurring on networked population structures.
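
    The alternative representation argued for here is easy to set up; the sketch below (an arbitrary toy example, not one of the paper's simulated structures) builds a bipartite graph of individuals and the groups within which they play, then projects it to recover interaction neighbourhoods.

      import networkx as nx
      from networkx.algorithms import bipartite

      B = nx.Graph()
      B.add_nodes_from(["i1", "i2", "i3", "i4"], bipartite=0)  # individuals
      B.add_nodes_from(["g1", "g2"], bipartite=1)              # interaction groups
      B.add_edges_from([("i1", "g1"), ("i2", "g1"), ("i3", "g1"),
                        ("i3", "g2"), ("i4", "g2")])           # group membership

      # Interaction neighbourhoods follow from shared group membership:
      proj = bipartite.projected_graph(B, ["i1", "i2", "i3", "i4"])
      print(sorted(proj.edges()))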

  8. Resistance and relatedness on an evolutionary graph

    PubMed Central

    Maciejewski, Wes

    2012-01-01

    When investigating evolution in structured populations, it is often convenient to consider the population as an evolutionary graph, with individuals as nodes and the individuals with whom they may interact as edges. There has, in recent years, been a surge of interest in evolutionary graphs, especially in the study of the evolution of social behaviours. An inclusive fitness framework is best suited for this type of study. A central requirement for an inclusive fitness analysis is an expression for the genetic similarity between individuals residing on the graph. This has been a major hindrance for work in this area, as highly technical mathematics is often required. Here, I derive a result that links genetic relatedness between haploid individuals on an evolutionary graph to the resistance between vertices on a corresponding electrical network. An example that demonstrates the potential computational advantage of this result over contemporary approaches is provided. This result offers more, however, to the study of population genetics than strictly computationally efficient methods. By establishing a link between gene transfer and electric circuit theory, conceptualizations of the latter can enhance understanding of the former. PMID:21849384
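
    The correspondence can be exercised directly with networkx, which implements effective resistance; the ring population below is an arbitrary toy structure, not an example from the paper.

      import networkx as nx

      G = nx.cycle_graph(6)                  # six haploid individuals in a ring
      r = nx.resistance_distance(G, 0, 3)    # effective resistance between 0 and 3
      print(f"effective resistance between nodes 0 and 3: {r:.3f}")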

  9. An Evolutionary Algorithm for Fast Intensity Based Image Matching Between Optical and SAR Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Fischer, Peter; Schuegraf, Philipp; Merkle, Nina; Storch, Tobias

    2018-04-01

    This paper presents a hybrid evolutionary algorithm for fast intensity-based matching between satellite imagery from SAR and very high-resolution (VHR) optical sensor systems. The precise and accurate co-registration of image time series and images of different sensors is a key task in multi-sensor image processing scenarios. The necessary preprocessing step of image matching and tie-point detection is divided into a search problem and a similarity measurement. Within this paper we evaluate the use of an evolutionary search strategy for establishing the spatial correspondence between satellite imagery of optical and radar sensors. The aim of the proposed algorithm is to decrease the computational costs during the search process by formulating the search as an optimization problem. Based upon the canonical evolutionary algorithm, the proposed algorithm is adapted for SAR/optical imagery intensity-based matching. Extensions are drawn using techniques like hybridization (e.g. local search) and others to lower the number of objective function calls and refine the result. The algorithm significantly decreases the computational costs whilst finding the optimal solution in a reliable way.
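
    The search/similarity split described above can be sketched with a toy (1+8) evolution strategy over a two-dimensional translation; the images are smooth synthetic stand-ins, and real SAR/optical matching uses far richer similarity measures.

      import numpy as np

      rng = np.random.default_rng(1)
      y, x = np.mgrid[0:64, 0:64]
      ref = np.exp(-((x - 32.0) ** 2 + (y - 20.0) ** 2) / 100.0)  # "optical"
      moving = np.roll(ref, (5, -3), axis=(0, 1))                 # "sensed"

      def similarity(shift):
          """Correlation between the reference and the back-shifted image."""
          back = np.roll(moving, (-shift[0], -shift[1]), axis=(0, 1))
          return float(np.corrcoef(ref.ravel(), back.ravel())[0, 1])

      parent = (0, 0)
      for _ in range(60):                    # elitist (1+8) evolution strategy
          offspring = [(parent[0] + int(rng.integers(-2, 3)),
                        parent[1] + int(rng.integers(-2, 3))) for _ in range(8)]
          parent = max(offspring + [parent], key=similarity)
      print(parent, round(similarity(parent), 3))  # converges to (5, -3)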

  10. An evolutionary firefly algorithm for the estimation of nonlinear biological model parameters.

    PubMed

    Abdullah, Afnizanfaizal; Deris, Safaai; Anwar, Sohail; Arjunan, Satya N V

    2013-01-01

    The development of accurate computational models of biological processes is fundamental to computational systems biology. These models are usually represented by mathematical expressions that rely heavily on the system parameters. The measurement of these parameters is often difficult. Therefore, they are commonly estimated by fitting the predicted model to the experimental data using optimization methods. The complexity and nonlinearity of the biological processes pose a significant challenge, however, to the development of accurate and fast optimization methods. We introduce a new hybrid optimization method incorporating the Firefly Algorithm and the evolutionary operation of the Differential Evolution method. The proposed method improves solutions by neighbourhood search using evolutionary procedures. Testing our method on models for the arginine catabolism and the negative feedback loop of the p53 signalling pathway, we found that it estimated the parameters with high accuracy and within a reasonable computation time compared to well-known approaches, including Particle Swarm Optimization, Nelder-Mead, and Firefly Algorithm. We have also verified the reliability of the parameters estimated by the method using an a posteriori practical identifiability test.
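
    The canonical firefly movement rule that such hybrids build upon moves solution x_i toward a brighter solution x_j (the generic form of the algorithm, not the paper's exact hybrid update):

      x_i \leftarrow x_i + \beta_0 e^{-\gamma r_{ij}^2} (x_j - x_i) + \alpha \epsilon_i

    where r_ij is the distance between the two candidate solutions, beta_0 the attractiveness at zero distance, gamma the light-absorption coefficient, and epsilon_i a random perturbation; the method described above additionally refines this neighbourhood search with Differential Evolution operators.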

  11. An Evolutionary Firefly Algorithm for the Estimation of Nonlinear Biological Model Parameters

    PubMed Central

    Abdullah, Afnizanfaizal; Deris, Safaai; Anwar, Sohail; Arjunan, Satya N. V.

    2013-01-01

    The development of accurate computational models of biological processes is fundamental to computational systems biology. These models are usually represented by mathematical expressions that rely heavily on the system parameters. The measurement of these parameters is often difficult. Therefore, they are commonly estimated by fitting the predicted model to the experimental data using optimization methods. The complexity and nonlinearity of the biological processes pose a significant challenge, however, to the development of accurate and fast optimization methods. We introduce a new hybrid optimization method incorporating the Firefly Algorithm and the evolutionary operation of the Differential Evolution method. The proposed method improves solutions by neighbourhood search using evolutionary procedures. Testing our method on models for the arginine catabolism and the negative feedback loop of the p53 signalling pathway, we found that it estimated the parameters with high accuracy and within a reasonable computation time compared to well-known approaches, including Particle Swarm Optimization, Nelder-Mead, and Firefly Algorithm. We have also verified the reliability of the parameters estimated by the method using an a posteriori practical identifiability test. PMID:23469172

  12. RF Magnetron Sputtering Deposited W/Ti Thin Film For Smart Window Applications

    NASA Astrophysics Data System (ADS)

    Oksuz, Lutfi; Kiristi, Melek; Bozduman, Ferhat; Uygun Oksuz, Aysegul

    2014-10-01

    Electrochromic (EC) devices can reversibly and persistently change their optical properties in the visible region (400-800 nm) upon charge insertion/extraction according to the applied voltage. A complementary-type EC device contains two electrochromic layers, one anodically coloring, such as vanadium oxide (V2O5), and the other cathodically coloring, such as tungsten oxide (WO3), separated by an ionically conducting layer (electrolyte). The use of a solid electrolyte such as Nafion eliminates the need for containment of a liquid electrolyte, which simplifies the cell design as well as improves safety and durability. In this work, the EC device was fabricated on an ITO/glass slide. The WO3-TiO2 thin film was deposited by reactive RF magnetron sputtering using a 2-in W/Ti (9:1 wt%) target with a purity of 99.9% in a mixture of argon and oxygen gases. As a counter electrode layer, a V2O5 film was deposited on an ITO/glass substrate using a V2O3 target under the same reactive RF magnetron sputtering conditions. Modified Nafion was used as the electrolyte to complete the EC device. The transmittance spectra of the complementary EC device were measured by optical spectrophotometry while a voltage of +/-3 V was applied to the EC device by a computer-controlled system. The surface morphology of the films was characterized by scanning electron microscopy (SEM) and atomic force microscopy (AFM). The cyclic voltammetry (CV) of the EC device was performed by sweeping the potential between +/-3 V at a scan rate of 50 mV/s.

  13. The dynamic programming high-order Dynamic Bayesian Networks learning for identifying effective connectivity in human brain from fMRI.

    PubMed

    Dang, Shilpa; Chaudhury, Santanu; Lall, Brejesh; Roy, Prasun Kumar

    2017-06-15

    Determination of effective connectivity (EC) among brain regions using fMRI is helpful in understanding the underlying neural mechanisms. Dynamic Bayesian Networks (DBNs) are an appropriate class of probabilistic graphical temporal models that have been used in the past to model EC from fMRI, specifically of order one. High-order DBNs (HO-DBNs) have still not been explored for fMRI data. A fundamental problem in the structure-learning of HO-DBNs is the high computational burden and low accuracy of the existing heuristic search techniques used for EC detection from fMRI. In this paper, we propose using the dynamic programming (DP) principle, together with properties of the scoring function, to reduce the search space for structure-learning of HO-DBNs and, finally, for identifying EC from fMRI, which to the best of our knowledge has not been done before. The proposed exact search-&-score learning approach, HO-DBN-DP, is an extension of a technique originally devised for learning a BN's structure from static data (Singh and Moore, 2005). Its effectiveness in structure-learning is shown on a synthetic fMRI dataset. The algorithm reaches the globally optimal solution with appreciably lower time-complexity than the static counterpart, owing to the integration of these properties; a proof of optimality is provided. The results demonstrate that HO-DBN-DP is more accurate and faster than the structure-learning algorithms currently used for identifying EC from fMRI. The EC estimated from real data by HO-DBN-DP is more consistent with the previous literature than that obtained with the classical Granger causality method. Hence, the DP algorithm can be employed for reliable EC estimates from experimental fMRI data. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. EvolQG - An R package for evolutionary quantitative genetics

    PubMed Central

    Melo, Diogo; Garcia, Guilherme; Hubbe, Alex; Assis, Ana Paula; Marroig, Gabriel

    2016-01-01

    We present an open source package for performing evolutionary quantitative genetics analyses in the R environment for statistical computing. Evolutionary theory shows that evolution depends critically on the available variation in a given population. When dealing with many quantitative traits this variation is expressed in the form of a covariance matrix, particularly the additive genetic covariance matrix or sometimes the phenotypic matrix, when the genetic matrix is unavailable and there is evidence the phenotypic matrix is sufficiently similar to the genetic matrix. Given this mathematical representation of available variation, the EvolQG package provides functions for calculation of relevant evolutionary statistics; estimation of sampling error; corrections for this error; matrix comparison via correlations, distances and matrix decomposition; analysis of modularity patterns; and functions for testing evolutionary hypotheses on taxa diversification. PMID:27785352

  15. Mean-Potential Law in Evolutionary Games.

    PubMed

    Nałęcz-Jawecki, Paweł; Miękisz, Jacek

    2018-01-12

    The Letter presents a novel way to connect random walks, stochastic differential equations, and evolutionary game theory. We introduce a new concept of a potential function for discrete-space stochastic systems. It is based on a correspondence between one-dimensional stochastic differential equations and random walks, which may be exact not only in the continuous limit but also in finite-state spaces. Our method is useful for computation of fixation probabilities in discrete stochastic dynamical systems with two absorbing states. We apply it to evolutionary games, formulating two simple and intuitive criteria for evolutionary stability of pure Nash equilibria in finite populations. In particular, we show that the 1/3 law of evolutionary games, introduced by Nowak et al. [Nature, 2004], follows from a more general mean-potential law.
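
    The quantities targeted by such a method are the standard absorption probabilities of a birth-death chain; for a mutant present in i copies in a population of size N, with backward-to-forward transition ratio gamma_k, the textbook fixation probability (a standard result, not a formula from the Letter) is

      \phi_i = \frac{1 + \sum_{j=1}^{i-1} \prod_{k=1}^{j} \gamma_k}{1 + \sum_{j=1}^{N-1} \prod_{k=1}^{j} \gamma_k}

    and the 1/3 law concerns the behaviour of phi_1 under weak selection in 2x2 games.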

  16. IOS and ECS line coupling calculation for the CO-He system - Influence on the vibration-rotation band shapes

    NASA Technical Reports Server (NTRS)

    Boissoles, J.; Boulet, C.; Robert, D.; Green, S.

    1987-01-01

    Line coupling coefficients resulting from rotational excitation of CO perturbed by He are computed within the infinite order sudden approximation (IOSA) and within the energy corrected sudden approximation (ECSA). The influence of this line coupling on the 1-0 CO-He vibration-rotation band shape is then computed for the case of weakly overlapping lines in the 292-78 K temperature range. The IOS and ECS results differ only at 78 K, and then only slightly, at high frequencies. Comparison with an additive superposition of Lorentzian lines shows strong modifications in the troughs between the lines. These calculated modifications are in excellent quantitative agreement with recent experimental data for all the temperatures considered. The applicability of previous approaches to the CO-He system, based on either the strong collision model or the exponential energy gap law, is also discussed.

  17. Embodiment and Human Development.

    PubMed

    Marshall, Peter J

    2016-12-01

    We are recognizing increasingly that the study of cognitive, social, and emotional processes must account for their embodiment in living, acting beings. The related field of embodied cognition (EC) has coalesced around dissatisfaction with the lack of attention to the body in cognitive science. For developmental scientists, the emphasis in the literature on adult EC on the role of the body in cognition may not seem particularly novel, given that bodily action was central to Piaget's theory of cognitive development. However, as the influence of the Piagetian account waned, developmental notions of embodiment were shelved in favor of mechanical computational approaches. In this article, I argue that by reconsidering embodiment, we can address a key issue with computational accounts: how meaning is constructed by the developing person. I also suggest that the process-relational approach to developmental systems can provide a system of concepts for framing a fully embodied, integrative developmental science.

  18. Embodiment and Human Development

    PubMed Central

    Marshall, Peter J.

    2016-01-01

    We are recognizing increasingly that the study of cognitive, social, and emotional processes must account for their embodiment in living, acting beings. The related field of embodied cognition (EC) has coalesced around dissatisfaction with the lack of attention to the body in cognitive science. For developmental scientists, the emphasis in the literature on adult EC on the role of the body in cognition may not seem particularly novel, given that bodily action was central to Piaget’s theory of cognitive development. However, as the influence of the Piagetian account waned, developmental notions of embodiment were shelved in favor of mechanical computational approaches. In this article, I argue that by reconsidering embodiment, we can address a key issue with computational accounts: how meaning is constructed by the developing person. I also suggest that the process-relational approach to developmental systems can provide a system of concepts for framing a fully embodied, integrative developmental science. PMID:27833651

  19. Evolutionary versatility of eukaryotic protein domains revealed by their bigram networks

    PubMed Central

    2011-01-01

    Background Protein domains are globular structures of independently folded polypeptides that exert catalytic or binding activities. Their sequences are recognized as evolutionary units that, through genome recombination, constitute protein repertoires of linkage patterns. Via mutations, domains acquire modified functions that contribute to the fitness of cells and organisms. Recent studies have addressed the evolutionary selection that may have shaped the functions of individual domains and the emergence of particular domain combinations, which led to new cellular functions in multi-cellular animals. This study focuses on modeling domain linkage globally and investigates evolutionary implications that may be revealed by novel computational analysis. Results A survey of 77 completely sequenced eukaryotic genomes implies a potential hierarchical and modular organization of biological functions in most living organisms. Domains in a genome or multiple genomes are modeled as a network of hetero-duplex covalent linkages, termed bigrams. A novel computational technique is introduced to decompose such networks, whereby the notion of domain "networking versatility" is derived and measured. The most and least "versatile" domains (termed "core domains" and "peripheral domains" respectively) are examined both computationally via sequence conservation measures and experimentally using selected domains. Our study suggests that such a versatility measure extracted from the bigram networks correlates with the adaptivity of domains during evolution, where the network core domains are highly adaptive, in marked contrast to the network peripheral domains. Conclusions Domain recombination has played a major part in the evolution of eukaryotes, contributing to genome complexity. From a system point of view, as the result of selection and constant refinement, networks of domain linkage are structured in a hierarchical modular fashion. Domains with a high degree of networking versatility appear to be evolutionarily adaptive, potentially through functional innovations. Domain bigram networks are informative as a model of biological functions. The networking versatility indices extracted from such networks for individual domains reflect the strength of evolutionary selection that the domains have experienced. PMID:21849086

  20. Evolutionary versatility of eukaryotic protein domains revealed by their bigram networks.

    PubMed

    Xie, Xueying; Jin, Jing; Mao, Yongyi

    2011-08-18

    Protein domains are globular structures of independently folded polypeptides that exert catalytic or binding activities. Their sequences are recognized as evolutionary units that, through genome recombination, constitute protein repertoires of linkage patterns. Via mutations, domains acquire modified functions that contribute to the fitness of cells and organisms. Recent studies have addressed the evolutionary selection that may have shaped the functions of individual domains and the emergence of particular domain combinations, which led to new cellular functions in multi-cellular animals. This study focuses on modeling domain linkage globally and investigates evolutionary implications that may be revealed by novel computational analysis. A survey of 77 completely sequenced eukaryotic genomes implies a potential hierarchical and modular organization of biological functions in most living organisms. Domains in a genome or multiple genomes are modeled as a network of hetero-duplex covalent linkages, termed bigrams. A novel computational technique is introduced to decompose such networks, whereby the notion of domain "networking versatility" is derived and measured. The most and least "versatile" domains (termed "core domains" and "peripheral domains" respectively) are examined both computationally via sequence conservation measures and experimentally using selected domains. Our study suggests that such a versatility measure extracted from the bigram networks correlates with the adaptivity of domains during evolution, where the network core domains are highly adaptive, in marked contrast to the network peripheral domains. Domain recombination has played a major part in the evolution of eukaryotes, contributing to genome complexity. From a system point of view, as the result of selection and constant refinement, networks of domain linkage are structured in a hierarchical modular fashion. Domains with a high degree of networking versatility appear to be evolutionarily adaptive, potentially through functional innovations. Domain bigram networks are informative as a model of biological functions. The networking versatility indices extracted from such networks for individual domains reflect the strength of evolutionary selection that the domains have experienced.

  1. Dielectric relaxation of ethylene carbonate and propylene carbonate from molecular dynamics simulations

    DOE PAGES

    Chaudhari, Mangesh I.; You, Xinli; Pratt, Lawrence R.; ...

    2015-11-24

    Ethylene carbonate (EC) and propylene carbonate (PC) are widely used solvents in lithium (Li)-ion batteries and supercapacitors. Ion dissolution and diffusion in those media are correlated with the solvents' dielectric responses. Here, we use all-atom molecular dynamics simulations of the pure solvents to calculate dielectric constants, relaxation times, and molecular mobilities. The computed results are compared with the limited available experiments as a step toward more exhaustive studies of these important characteristics. The observed agreement is encouraging and provides guidance for further validation of force-field simulation models for EC and PC solvents.
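
    The usual route to the static dielectric constant in such simulations is the fluctuation formula for conducting ("tin-foil") boundary conditions (a standard relation, assumed here; the abstract does not specify the estimator):

      \varepsilon_s = \varepsilon_\infty + \frac{\langle \mathbf{M}^2 \rangle - \langle \mathbf{M} \rangle^2}{3 \varepsilon_0 V k_B T}

    where M is the total dipole moment of the simulation cell, V its volume and T its temperature; the relaxation time then follows from fitting the decay of the autocorrelation of M(t).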

  2. Spiny Mice Modulate Eumelanin to Pheomelanin Ratio to Achieve Cryptic Coloration in “Evolution Canyon,” Israel

    PubMed Central

    Singaravelan, Natarajan; Pavlicek, Tomas; Beharav, Alex; Wakamatsu, Kazumasa; Ito, Shosuke; Nevo, Eviatar

    2010-01-01

    Background Coat coloration in mammals is an explicit adaptation through natural selection. Camouflaging with the environment is the foremost evolutionary drive in explaining overall coloration. Decades of enquiry into this topic have been limited to repetitive coat color measurements to correlate the morphs with background/habitat blending. This led to an overwhelming endorsement of concealing coloration as a local phenotypic adaptation, primarily in rodents, to evade predators. However, most such studies overlooked how rodents actually achieve such cryptic coloration. Cryptic coloration could be attained only through optimization between the yellow- to brown-colored “pheomelanin” and the gray- to black-colored “eumelanin” in the hairs. However, no study has explored this conjecture yet. “Evolution Canyon” (EC) in Israel is a natural microscale laboratory where the relationship between organism and environment can be explored. EC comprises an “African” slope (AS), which exhibits a yellow-brownish background habitat, and a “European” slope (ES), exhibiting a dark grayish habitat; both slopes harbor spiny mice (Acomys cahirinus). Here, we examine how hair melanin content of spiny mice living on the opposing slopes of EC evolves toward blending with their respective background habitat. Methodology/Principal Findings We measured hair-melanin (both eumelanin and pheomelanin) contents of 30 spiny mice from the EC using high-performance liquid chromatography (HPLC) that detects specific degradation products of eumelanin and pheomelanin. The melanin pattern of A. cahirinus approximates the background color of the slope on which they dwell. Pheomelanin is slightly (insignificantly) higher in individuals found on the AS to match the brownish background, whereas individuals of the ES had significantly greater eumelanin content to mimic the dark grayish background. This is further substantiated by a significantly higher eumelanin to pheomelanin ratio on the ES than on the AS. Conclusion/Significance It appears that rodents adaptively modulate eumelanin and pheomelanin contents to achieve cryptic coloration in contrasting habitats even at a microscale. PMID:20090935

  3. EarthCube as an information resource marketplace; the GEAR Project conceptual design

    NASA Astrophysics Data System (ADS)

    Richard, S. M.; Zaslavsky, I.; Gupta, A.; Valentine, D.

    2015-12-01

    Geoscience Architecture for Research (GEAR) is approaching EarthCube design as a complex and evolving socio-technical federation of systems. EarthCube is intended to support the science research enterprise, for which there is no centralized command and control, requirements are a moving target, the function and behavior of the system must evolve and adapt as new scientific paradigms emerge, and system participants are conducting research that inherently implies seeking new ways of doing things. EarthCube must address evolving user requirements and enable domain and project systems developed under different management and for different purposes to work together. The EC architecture must focus on creating a technical environment that enables new capabilities by combining existing and newly developed resources in various ways, and encourages development of new resource designs intended for re-use and interoperability. In a sense, instead of a single architecture design, GEAR provides a way to accommodate multiple designs tuned to different tasks. This agile, adaptive, evolutionary software development style is based on a continuously updated portfolio of compatible components that enable new sub-system architecture. System users make decisions about which components to use in this marketplace based on performance, satisfaction, and impact metrics collected continuously to evaluate components, determine priorities, and guide resource allocation decisions by the system governance agency. EC is designed as a federation of independent systems, and although the coordinator of the EC system may be named an enterprise architect, the focus of the role needs to be organizing resources, assessing their readiness for interoperability with the existing EC component inventory, managing dependencies between transient subsystems, developing mechanisms of stakeholder engagement and inclusion, and negotiating standard interfaces, rather than actually specifying components. Composition of components will be developed by projects that involve both domain scientists and CI experts for specific research problems. We believe an agile, marketplace-type approach is an essential architectural strategy for EarthCube.

  4. Comparison of buried sand ridges and regressive sand ridges on the outer shelf of the East China Sea

    NASA Astrophysics Data System (ADS)

    Wu, Ziyin; Jin, Xianglong; Zhou, Jieqiong; Zhao, Dineng; Shang, Jihong; Li, Shoujun; Cao, Zhenyi; Liang, Yuyang

    2017-06-01

    Based on multi-beam echo soundings and high-resolution single-channel seismic profiles, linear sand ridges in U14 and U2 on the East China Sea (ECS) shelf are identified and compared in detail. Linear sand ridges in U14 are buried sand ridges, which are 90 m below the seafloor. It is presumed that these buried sand ridges belong to the transgressive systems tract (TST) formed 320-200 ka ago and that their top interface is the maximal flooding surface (MFS). Linear sand ridges in U2 are regressive sand ridges. It is presumed that these buried sand ridges belong to the TST of the last glacial maximum (LGM) and that their top interface is the MFS of the LGM. Four sub-stage sand ridges of U2 are discerned from the high-resolution single-channel seismic profile and four strikes of regressive sand ridges are distinguished from the submarine topographic map based on the multi-beam echo soundings. These multi-stage and multi-strike linear sand ridges are the response of, and evidence for, the evolution of submarine topography with respect to sea-level fluctuations since the LGM. Although the difference in the age of formation between U14 and U2 is 200 ka and their sequences are 90 m apart, the general strikes of the sand ridges are similar. This indicates that the basic configuration of tidal waves on the ECS shelf has been stable for the last 200 ka. A basic evolutionary model of the strata of the ECS shelf is proposed, in which sea-level change is the controlling factor. During the sea-level change of about 100 ka, five to six strata are developed and the sand ridges develop in the TST. A similar story of the evolution of paleo-topography on the ECS shelf has been repeated during the last 300 ka.

  5. Deploying Crowd-Sourced Formal Verification Systems in a DoD Network

    DTIC Science & Technology

    2013-09-01

    In 2014 cyber attacks on critical infrastructure are expected to increase... CSFV systems on the Internet, possibly using cloud infrastructure (Dean, 2013). By using Amazon Compute Cloud (EC2) systems, DARPA will use ordinary... through standard access methods. Those clients could be mobile phones, laptops, netbooks, tablet computers or personal digital assistants (PDAs) (Smoot

  6. Evolutionary computing for the design search and optimization of space vehicle power subsystems

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Klimeck, Gerhard; Hanks, David; Hua, Hook

    2004-01-01

    Evolutionary computing has proven to be a straightforward and robust approach for optimizing a wide range of difficult analysis and design problems. This paper discusses the application of these techniques to an existing space vehicle power subsystem resource and performance analysis simulation in a parallel processing environment. Our preliminary results demonstrate that this approach has the potential to improve the space system trade study process by allowing engineers to statistically weight subsystem goals of mass, cost and performance and then automatically size power elements based on anticipated performance of the subsystem rather than on worst-case estimates.
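
    The statistically weighted trade study described here reduces, in its simplest form, to a scalarized fitness over mass, cost and performance; the sketch below uses made-up weights and a toy two-parameter sizing model, not the actual simulation from the paper.

      import random

      WEIGHTS = {"mass": 0.4, "cost": 0.3, "performance": 0.3}   # assumed weights

      def fitness(candidate):
          """Scalarized score for a (solar_array_m2, battery_Ah) candidate."""
          area, capacity = candidate
          mass = 4.0 * area + 0.5 * capacity                # kg, toy coefficients
          cost = 10.0 * area + 2.0 * capacity               # k$, toy coefficients
          performance = min(area * 300.0, capacity * 28.0)  # deliverable W, toy
          return (WEIGHTS["performance"] * performance
                  - WEIGHTS["mass"] * mass
                  - WEIGHTS["cost"] * cost)

      pop = [(random.uniform(1, 20), random.uniform(10, 200)) for _ in range(50)]
      for _ in range(100):
          pop = sorted(pop, key=fitness, reverse=True)[:25]           # select
          pop += [(min(20.0, a * random.gauss(1.0, 0.1)),
                   min(200.0, c * random.gauss(1.0, 0.1)))
                  for a, c in pop]                                    # mutate
      print(max(pop, key=fitness))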

  7. Multiscale Kinetic Modeling Reveals an Ensemble of Cl–/H+ Exchange Pathways in ClC-ec1 Antiporter

    PubMed Central

    2018-01-01

    Despite several years of research, the ion exchange mechanisms in chloride/proton antiporters and many other coupled transporters are not yet understood at the molecular level. Here, we present a novel approach to kinetic modeling and apply it to ion exchange in ClC-ec1. Our multiscale kinetic model is developed by (1) calculating the state-to-state rate coefficients with reactive and polarizable molecular dynamics simulations, (2) optimizing these rates in a global kinetic network, and (3) predicting new electrophysiological results. The model shows that the robust Cl:H exchange ratio (2.2:1) can indeed arise from kinetic coupling without large protein conformational changes, indicating a possible facile evolutionary connection to chloride channels. The E148 amino acid residue is shown to couple chloride and proton transport through protonation-dependent blockage of the central anion binding site and an anion-dependent pKa value, which influences proton transport. The results demonstrate how an ensemble of different exchange pathways, as opposed to a single series of transitions, culminates in the macroscopic observables of the antiporter, such as transport rates, chloride/proton stoichiometry, and pH dependence. PMID:29332400

  8. Comparative analyses of fungicide sensitivity and SSR marker variations indicate a low risk of developing azoxystrobin resistance in Phytophthora infestans

    PubMed Central

    Qin, Chun-Fang; He, Meng-Han; Chen, Feng-Ping; Zhu, Wen; Yang, Li-Na; Wu, E-Jiao; Guo, Zheng-Liang; Shang, Li-Ping; Zhan, Jiasui

    2016-01-01

    Knowledge of the evolution of fungicide resistance is important in securing sustainable disease management in agricultural systems. In this study, we analyzed and compared the spatial distribution of genetic variation in azoxystrobin sensitivity and SSR markers in 140 Phytophthora infestans isolates sampled from seven geographic locations in China. Sensitivity to azoxystrobin and its genetic variation in the pathogen populations was measured by the relative growth rate (RGR) at four fungicide concentrations and determination of the effective concentration for 50% inhibition (EC50). We found that all isolates in the current study were sensitive to azoxystrobin and their EC50 was similar to that detected from a European population about 20 years ago, suggesting the risk of developing azoxystrobin resistance in P. infestans populations is low. Further analyses indicate that reduced genetic variation and high fitness cost in resistant mutations are the likely causes for the low evolutionary likelihood of developing azoxystrobin resistance in the pathogen. We also found a negative correlation between azoxystrobin tolerance in P. infestans populations and the mean annual temperature of collection sites, suggesting that global warming may increase the efficiency of using the fungicide to control late blight. PMID:26853908

  9. Comparative analyses of fungicide sensitivity and SSR marker variations indicate a low risk of developing azoxystrobin resistance in Phytophthora infestans.

    PubMed

    Qin, Chun-Fang; He, Meng-Han; Chen, Feng-Ping; Zhu, Wen; Yang, Li-Na; Wu, E-Jiao; Guo, Zheng-Liang; Shang, Li-Ping; Zhan, Jiasui

    2016-02-08

    Knowledge of the evolution of fungicide resistance is important in securing sustainable disease management in agricultural systems. In this study, we analyzed and compared the spatial distribution of genetic variation in azoxystrobin sensitivity and SSR markers in 140 Phytophthora infestans isolates sampled from seven geographic locations in China. Sensitivity to azoxystrobin and its genetic variation in the pathogen populations was measured by the relative growth rate (RGR) at four fungicide concentrations and determination of the effective concentration for 50% inhibition (EC50). We found that all isolates in the current study were sensitive to azoxystrobin and their EC50 was similar to that detected from a European population about 20 years ago, suggesting the risk of developing azoxystrobin resistance in P. infestans populations is low. Further analyses indicate that reduced genetic variation and high fitness cost in resistant mutations are the likely causes for the low evolutionary likelihood of developing azoxystrobin resistance in the pathogen. We also found a negative correlation between azoxystrobin tolerance in P. infestans populations and the mean annual temperature of collection sites, suggesting that global warming may increase the efficiency of using the fungicide to control late blight.
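
    EC50 estimation of this kind is commonly done by fitting a four-parameter logistic (Hill) curve to dose-response data; in the hedged sketch below the concentrations and growth responses are invented for illustration.

      import numpy as np
      from scipy.optimize import curve_fit

      def hill(c, bottom, top, ec50, n):
          """Four-parameter logistic dose-response curve."""
          return bottom + (top - bottom) / (1.0 + (c / ec50) ** n)

      conc = np.array([0.01, 0.1, 1.0, 10.0, 100.0])   # fungicide, ug/mL (invented)
      rgr = np.array([0.98, 0.90, 0.55, 0.15, 0.05])   # relative growth (invented)

      popt, _ = curve_fit(hill, conc, rgr, p0=[0.0, 1.0, 1.0, 1.0])
      print(f"estimated EC50 = {popt[2]:.2f} ug/mL")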

  10. Making Classical Ground State Spin Computing Fault-Tolerant

    DTIC Science & Technology

    2010-06-24

    approaches to perebor (brute-force searches) algorithms," IEEE Annals of the History of Computing, 6, 384–400 (1984). [24] D. Bacon and S. T. Flammia, "Adiabatic gate teleportation," Phys. Rev. Lett., 103, 120504 (2009). [25] D. Bacon and S. T. Flammia, "Adiabatic cluster state quantum computing..." v1 [cond-mat.stat-mech] 22 Jun 2010

  11. VEGF-Induced Expression of miR-17–92 Cluster in Endothelial Cells Is Mediated by ERK/ELK1 Activation and Regulates Angiogenesis

    PubMed Central

    Chamorro-Jorganes, Aránzazu; Lee, Monica Y.; Araldi, Elisa; Landskroner-Eiger, Shira; Fernández-Fuertes, Marta; Sahraei, Mahnaz; Quiles del Rey, Maria; van Solingen, Coen; Yu, Jun; Fernández-Hernando, Carlos; Sessa, William C.

    2016-01-01

    Rationale: Several lines of evidence indicate that the regulation of microRNA (miRNA) levels by different stimuli may contribute to the modulation of stimulus-induced responses. The miR-17–92 cluster has been linked to tumor development and angiogenesis, but its role in vascular endothelial growth factor–induced endothelial cell (EC) functions is unclear and its regulation is unknown. Objective: The purpose of this study was to elucidate the mechanism by which VEGF regulates the expression of miR-17–92 cluster in ECs and determine its contribution to the regulation of endothelial angiogenic functions, both in vitro and in vivo. This was done by analyzing the effect of postnatal inactivation of miR-17–92 cluster in the endothelium (miR-17–92 iEC-KO mice) on developmental retinal angiogenesis, VEGF-induced ear angiogenesis, and tumor angiogenesis. Methods and Results: Here, we show that Erk/Elk1 activation on VEGF stimulation of ECs is responsible for Elk-1-mediated transcription activation (chromatin immunoprecipitation analysis) of the miR-17–92 cluster. Furthermore, we demonstrate that VEGF-mediated upregulation of the miR-17–92 cluster in vitro is necessary for EC proliferation and angiogenic sprouting. Finally, we provide genetic evidence that miR-17–92 iEC-KO mice have blunted physiological retinal angiogenesis during development and diminished VEGF-induced ear angiogenesis and tumor angiogenesis. Computational analysis and rescue experiments show that PTEN (phosphatase and tensin homolog) is a target of the miR-17–92 cluster and is a crucial mediator of miR-17-92–induced EC proliferation. However, the angiogenic transcriptional program is reduced when miR-17–92 is inhibited. Conclusions: Taken together, our results indicate that VEGF-induced miR-17–92 cluster expression contributes to the angiogenic switch of ECs and participates in the regulation of angiogenesis. PMID:26472816

  12. Choosing surgical lighting in the LED era.

    PubMed

    Knulst, Arjan J; Stassen, Laurents P S; Grimbergen, Cornelis A; Dankelman, Jenny

    2009-12-01

    The aim of this study is to evaluate the illumination characteristics of LED lights objectively to ease the selection of surgical lighting. The illuminance distributions of 5 main and 4 auxiliary lights were measured in 8 clinically relevant scenarios. For each light and scenario, the maximum illuminance E(c) (klux) and the size of the light field d(10) (mm) were computed. The results showed that large variations existed in both E(c) (25-160 klux) and d(10) (109-300 mm); that using auxiliary lights reduced E(c) and d(10) by up to 80% and 30%, respectively; that segmented lights produced uneven light distributions; and that with colored LED lights, shadow edges on the surgical field became colored. Objective illuminance measurements show a wide variation between lights and a superiority of main over auxiliary lights. Uneven light distributions and colored shadows indicate that LED lights still need to converge toward an optimal design.

  13. The Evolution of Biological Complexity in Digital Organisms

    NASA Astrophysics Data System (ADS)

    Ofria, Charles

    2013-03-01

    When Darwin first proposed his theory of evolution by natural selection, he realized that it had a problem explaining the origins of traits of "extreme perfection and complication" such as the vertebrate eye. Critics of Darwin's theory have latched onto this perceived flaw as proof that Darwinian evolution is impossible. In anticipation of this issue, Darwin described the perfect data needed to understand this process, but lamented that such data are "scarcely ever possible" to obtain. In this talk, I will discuss research where we use populations of digital organisms (self-replicating and evolving computer programs) to elucidate the genetic and evolutionary processes by which new, highly complex traits arise, drawing inspiration directly from Darwin's wistful thinking and hypotheses. During the process of evolution in these fully transparent computational environments we can measure the incorporation of new information into the genome, a process akin to a natural Maxwell's demon, and identify the original source of any such information. We show that, as Darwin predicted, much of the information used to encode a complex trait was already in the genome as part of simpler evolved traits, and that many routes must be possible for a new complex trait to have a high probability of successfully evolving. In even more extreme examples of the evolution of complexity, we are now using these same principles to examine the evolutionary dynamics that drive major transitions in evolution, that is, transitions to higher levels of organization, which are some of the most complex evolutionary events to occur in nature. Finally, I will explore some of the implications of this research for other aspects of evolutionary biology, as well as ways that these evolutionary principles can be applied toward solving computational and engineering problems.

  14. Artificial intelligence in peer review: How can evolutionary computation support journal editors?

    PubMed

    Mrowinski, Maciej J; Fronczak, Piotr; Fronczak, Agata; Ausloos, Marcel; Nedic, Olgica

    2017-01-01

    With the volume of manuscripts submitted for publication growing every year, the deficiencies of peer review (e.g. long review times) are becoming more apparent. Editorial strategies, sets of guidelines designed to speed up the process and reduce editors' workloads, are treated as trade secrets by publishing houses and are not shared publicly. To improve the effectiveness of their strategies, editors in small publishing groups are faced with undertaking an iterative trial-and-error approach. We show that Cartesian Genetic Programming, a nature-inspired evolutionary algorithm, can dramatically improve editorial strategies. The artificially evolved strategy reduced the duration of the peer review process by 30%, without increasing the pool of reviewers (in comparison to a typical human-developed strategy). Evolutionary computation has typically been used in technological processes or biological ecosystems. Our results demonstrate that genetic programs can improve real-world social systems that are usually much harder to understand and control than physical systems.
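
    Cartesian Genetic Programming itself is compact enough to sketch. The toy below shows the representation (a fixed grid of function nodes addressed by index) and the (1+4) evolution strategy typically paired with CGP. The arithmetic fitness function is a stand-in for the paper's simulated peer-review objective, which is not public.

```python
import random, operator

# Minimal Cartesian Genetic Programming: single row, feed-forward wiring.
FUNCS = [operator.add, operator.sub, operator.mul,
         lambda a, b: a if abs(b) < 1 else a / b]  # protected division

N_IN, N_NODES = 2, 10

def random_genome():
    genome = []
    for i in range(N_IN, N_IN + N_NODES):
        genome.append((random.randrange(len(FUNCS)),
                       random.randrange(i),   # connect to terminals or earlier nodes
                       random.randrange(i)))
    genome.append(random.randrange(N_IN + N_NODES))  # output gene
    return genome

def evaluate(genome, inputs):
    vals = list(inputs)
    for f, a, b in genome[:-1]:
        vals.append(FUNCS[f](vals[a], vals[b]))
    return vals[genome[-1]]

def mutate(genome, rate=0.15):
    child = []
    for i, (f, a, b) in enumerate(genome[:-1]):
        if random.random() < rate:
            f = random.randrange(len(FUNCS))
            a = random.randrange(N_IN + i)
            b = random.randrange(N_IN + i)
        child.append((f, a, b))
    out = genome[-1]
    if random.random() < rate:
        out = random.randrange(N_IN + N_NODES)
    child.append(out)
    return child

def error(g):  # toy target x*y + x, standing in for simulated review duration
    pts = [(x, y) for x in range(-3, 4) for y in range(-3, 4)]
    return sum((evaluate(g, (x, y)) - (x * y + x)) ** 2 for x, y in pts)

parent = random_genome()
for gen in range(2000):                      # (1+4) evolution strategy
    kids = [mutate(parent) for _ in range(4)]
    parent = min(kids + [parent], key=error)
print("best error:", error(parent))
```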

  15. Evolutionary fuzzy modeling human diagnostic decisions.

    PubMed

    Peña-Reyes, Carlos Andrés

    2004-05-01

    Fuzzy CoCo is a methodology, combining fuzzy logic and evolutionary computation, for constructing systems able to accurately predict the outcome of a human decision-making process, while providing an understandable explanation of the underlying reasoning. Fuzzy logic provides a formal framework for constructing systems exhibiting both good numeric performance (accuracy) and linguistic representation (interpretability). However, fuzzy modeling--meaning the construction of fuzzy systems--is an arduous task, demanding the identification of many parameters. To solve it, we use evolutionary computation techniques (specifically cooperative coevolution), which are widely used to search for adequate solutions in complex spaces. We have successfully applied the algorithm to model the decision processes involved in two breast cancer diagnostic problems, the WBCD problem and the Catalonia mammography interpretation problem, obtaining systems of both high performance and high interpretability. For the Catalonia problem, an evolved system was embedded within a Web-based tool, called COBRA, for aiding radiologists in mammography interpretation.
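
    The cooperative-coevolution idea at the core of Fuzzy CoCo can be illustrated generically: two populations evolve complementary pieces of a candidate solution (in Fuzzy CoCo, membership functions versus rules) and each individual is scored by pairing it with the best known collaborator from the other population. Everything below, including the toy objective, population sizes, and mutation scheme, is an assumption for illustration, not the published algorithm.

```python
import random

random.seed(2)
DIM = 8                       # total parameters; each population owns half

def fitness(full):            # toy objective: maximize -(sum of squares)
    return -sum(x * x for x in full)

def new_ind():
    return [random.uniform(-5, 5) for _ in range(DIM // 2)]

pops = [[new_ind() for _ in range(20)] for _ in range(2)]
best = [max(p, key=lambda ind: fitness(ind + ind)) for p in pops]  # crude init

for gen in range(200):
    for side in (0, 1):
        def score(ind):       # pair with the best collaborator from the other side
            full = ind + best[1] if side == 0 else best[0] + ind
            return fitness(full)
        pops[side].sort(key=score, reverse=True)
        best[side] = pops[side][0]
        parents = pops[side][:10]                 # truncation selection
        pops[side] = parents + [
            [x + random.gauss(0, 0.3) for x in random.choice(parents)]
            for _ in range(10)                    # Gaussian-mutation offspring
        ]

print("best combined fitness:", fitness(best[0] + best[1]))
```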

  16. Understanding Evolutionary Potential in Virtual CPU Instruction Set Architectures

    PubMed Central

    Bryson, David M.; Ofria, Charles

    2013-01-01

    We investigate fundamental decisions in the design of instruction set architectures for linear genetic programs that are used as both model systems in evolutionary biology and underlying solution representations in evolutionary computation. We subjected digital organisms with each tested architecture to seven different computational environments designed to present a range of evolutionary challenges. Our goal was to engineer a general purpose architecture that would be effective under a broad range of evolutionary conditions. We evaluated six different types of architectural features for the virtual CPUs: (1) genetic flexibility: we allowed digital organisms to more precisely modify the function of genetic instructions, (2) memory: we provided an increased number of registers in the virtual CPUs, (3) decoupled sensors and actuators: we separated input and output operations to enable greater control over data flow. We also tested a variety of methods to regulate expression: (4) explicit labels that allow programs to dynamically refer to specific genome positions, (5) position-relative search instructions, and (6) multiple new flow control instructions, including conditionals and jumps. Each of these features also adds complication to the instruction set and risks slowing evolution due to epistatic interactions. Two features (multiple argument specification and separated I/O) demonstrated substantial improvements in the majority of test environments, along with versions of each of the remaining architecture modifications that show significant improvements in multiple environments. However, some tested modifications were detrimental, though most exhibit no systematic effects on evolutionary potential, highlighting the robustness of digital evolution. Combined, these observations enhance our understanding of how instruction architecture impacts evolutionary potential, enabling the creation of architectures that support more rapid evolution of complex solutions to a broad range of challenges. PMID:24376669

  17. A framework for evolutionary systems biology

    PubMed Central

    Loewe, Laurence

    2009-01-01

    Background Many difficult problems in evolutionary genomics are related to mutations that have weak effects on fitness, as the consequences of mutations with large effects are often simple to predict. Current systems biology has accumulated much data on mutations with large effects and can predict the properties of knockout mutants in some systems. However experimental methods are too insensitive to observe small effects. Results Here I propose a novel framework that brings together evolutionary theory and current systems biology approaches in order to quantify small effects of mutations and their epistatic interactions in silico. Central to this approach is the definition of fitness correlates that can be computed in some current systems biology models employing the rigorous algorithms that are at the core of much work in computational systems biology. The framework exploits synergies between the realism of such models and the need to understand real systems in evolutionary theory. This framework can address many longstanding topics in evolutionary biology by defining various 'levels' of the adaptive landscape. Addressed topics include the distribution of mutational effects on fitness, as well as the nature of advantageous mutations, epistasis and robustness. Combining corresponding parameter estimates with population genetics models raises the possibility of testing evolutionary hypotheses at a new level of realism. Conclusion EvoSysBio is expected to lead to a more detailed understanding of the fundamental principles of life by combining knowledge about well-known biological systems from several disciplines. This will benefit both evolutionary theory and current systems biology. Understanding robustness by analysing distributions of mutational effects and epistasis is pivotal for drug design, cancer research, responsible genetic engineering in synthetic biology and many other practical applications. PMID:19239699

  18. A Bright Future for Evolutionary Methods in Drug Design.

    PubMed

    Le, Tu C; Winkler, David A

    2015-08-01

    Most medicinal chemists understand that chemical space is extremely large, essentially infinite. Although high-throughput experimental methods allow exploration of drug-like space more rapidly, they are still insufficient to fully exploit the opportunities that such large chemical space offers. Evolutionary methods can synergistically blend automated synthesis and characterization methods with computational design to identify promising regions of chemical space more efficiently. We describe how evolutionary methods are implemented, and provide examples of published drug development research in which these methods have generated molecules with increased efficacy. We anticipate that evolutionary methods will play an important role in future drug discovery. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. An evolutionary algorithm that constructs recurrent neural networks.

    PubMed

    Angeline, P J; Saunders, G M; Pollack, J B

    1994-01-01

    Standard methods for simultaneously inducing the structure and weights of recurrent neural networks limit every task to an assumed class of architectures. Such a simplification is necessary since the interactions between network structure and function are not well understood. Evolutionary computations, which include genetic algorithms and evolutionary programming, are population-based search methods that have shown promise in many similarly complex tasks. This paper argues that genetic algorithms are inappropriate for network acquisition and describes an evolutionary program, called GNARL, that simultaneously acquires both the structure and weights for recurrent networks. GNARL's empirical acquisition method allows for the emergence of complex behaviors and topologies that are potentially excluded by the artificial architectural constraints imposed in standard network induction methods.
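
    GNARL's defining choice, mutation-only search over both weights and topology, can be loosely sketched as follows. The dense weight-matrix representation (zero meaning "no connection"), the mutation probabilities, and the temperature parameter are invented for illustration; they are not the authors' exact operators.

```python
import numpy as np

rng = np.random.default_rng(0)

class RNN:
    """Network fully specified by one weight matrix over all units
    (inputs + hidden + outputs); zero entries mean 'no connection',
    so structural mutation just creates or removes nonzero entries."""
    def __init__(self, n_in, n_hid, n_out):
        self.n_in, self.n_hid, self.n_out = n_in, n_hid, n_out
        n = n_in + n_hid + n_out
        self.W = np.zeros((n, n))

    def step(self, x, h):
        state = np.zeros(self.W.shape[0])
        state[:self.n_in] = x
        state[self.n_in:self.n_in + self.n_hid] = h
        act = np.tanh(self.W @ state)
        h_new = act[self.n_in:self.n_in + self.n_hid]
        return act[-self.n_out:], h_new

def mutate(net, temperature=0.1):
    """EP-style mutation: perturb existing weights (parametric) and
    occasionally add or delete a connection (structural); no crossover."""
    child = RNN(net.n_in, net.n_hid, net.n_out)
    child.W = net.W.copy()
    mask = child.W != 0
    child.W[mask] += rng.normal(0, temperature, mask.sum())
    if rng.random() < 0.3:                      # add a random connection
        i, j = rng.integers(0, child.W.shape[0], 2)
        child.W[i, j] = rng.normal(0, 1)
    if rng.random() < 0.3 and mask.any():       # delete a random connection
        ii, jj = np.nonzero(mask)
        k = rng.integers(len(ii))
        child.W[ii[k], jj[k]] = 0.0
    return child

net = RNN(n_in=2, n_hid=4, n_out=1)
for _ in range(20):                             # grow a network by mutation alone
    net = mutate(net)
y, h = net.step(np.array([0.5, -0.3]), np.zeros(4))
print("output:", y)
```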

  20. CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS

    EPA Science Inventory

    Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...

  1. EvoluZion: A Computer Simulator for Teaching Genetic and Evolutionary Concepts

    ERIC Educational Resources Information Center

    Zurita, Adolfo R.

    2017-01-01

    EvoluZion is a forward-in-time genetic simulator developed in Java and designed to perform real time simulations on the evolutionary history of virtual organisms. These model organisms harbour a set of 13 genes that codify an equal number of phenotypic features. These genes change randomly during replication, and mutant genes can have null,…

  2. Generative Representations for Computer-Automated Evolutionary Design

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2006-01-01

    With the increasing computational power of computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations for these computer-automated design systems is the representation with which they encode designs. If the representation cannot encode a certain design, then the design system cannot produce it. To be able to produce new types of designs, and not just optimize pre-defined parameterizations, evolutionary design systems must use generative representations. Generative representations are assembly procedures, or algorithms, for constructing a design thereby allowing for truly novel design solutions to be encoded. In addition, by enabling modularity, regularity and hierarchy, the level of sophistication that can be evolved is increased. We demonstrate the advantages of generative representations on two different design domains: the evolution of spacecraft antennas and the evolution of 3D objects.

  3. Support vector machine prediction of enzyme function with conjoint triad feature and hierarchical context.

    PubMed

    Wang, Yong-Cui; Wang, Yong; Yang, Zhi-Xia; Deng, Nai-Yang

    2011-06-20

    Enzymes are known as the largest class of proteins and their functions are usually annotated by the Enzyme Commission (EC), which uses a hierarchical structure, i.e., four numbers separated by periods, to classify the function of enzymes. Automatically categorizing an enzyme into the EC hierarchy is crucial to understanding its specific molecular mechanism. In this paper, we introduce two key improvements in predicting enzyme function within the machine learning framework. One is to introduce efficient sequence encoding methods for representing given proteins. The second is to develop a structure-based prediction method with low computational complexity. In particular, we propose to use the conjoint triad feature (CTF) to represent the given protein sequences by considering not only the composition of amino acids but also the neighbor relationships in the sequence. Then we develop a support vector machine (SVM)-based method, named SVMHL (SVM for hierarchy labels), to output enzyme function by fully considering the hierarchical structure of EC. The experimental results show that our SVMHL with the CTF outperforms SVMHL with the amino acid composition (AAC) feature both in predictive accuracy and Matthews correlation coefficient (MCC). In addition, SVMHL with the CTF obtains accuracy and MCC ranging from 81% to 98% and 0.82 to 0.98 when predicting the first three EC digits on a low-homology enzyme dataset. We further demonstrate that our method outperforms the methods which do not take account of hierarchical relationships among enzyme categories and alternative methods which incorporate prior knowledge about inter-class relationships. Our structure-based prediction model, SVMHL with the CTF, reduces the computational complexity and outperforms the alternative approaches in enzyme function prediction. Therefore our new method will be a useful tool for the enzyme function prediction community.
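
    For concreteness, the conjoint triad feature is easy to compute: amino acids are mapped to 7 classes (grouped by dipole and side-chain volume) and the protein becomes a 7^3 = 343-dimensional vector of class 3-gram frequencies. The grouping below is the one commonly used with CTF; the example sequence is arbitrary.

```python
from itertools import product

# Conjoint triad feature (CTF): amino acids collapse into 7 classes, and a
# protein is represented by normalized counts of every class triad.
GROUPS = {
    'AGV': 0, 'ILFP': 1, 'YMTS': 2, 'HNQW': 3,
    'RK': 4, 'DE': 5, 'C': 6,
}
AA2G = {aa: g for aas, g in GROUPS.items() for aa in aas}

def conjoint_triad(seq):
    counts = {t: 0 for t in product(range(7), repeat=3)}
    classes = [AA2G[a] for a in seq if a in AA2G]  # skip non-standard residues
    for i in range(len(classes) - 2):
        counts[tuple(classes[i:i + 3])] += 1
    total = max(sum(counts.values()), 1)
    return [c / total for c in counts.values()]    # 343 features

vec = conjoint_triad("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")
print(len(vec), max(vec))
```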

  4. Cancer evolution: mathematical models and computational inference.

    PubMed

    Beerenwinkel, Niko; Schwarz, Roland F; Gerstung, Moritz; Markowetz, Florian

    2015-01-01

    Cancer is a somatic evolutionary process characterized by the accumulation of mutations, which contribute to tumor growth, clinical progression, immune escape, and drug resistance development. Evolutionary theory can be used to analyze the dynamics of tumor cell populations and to make inference about the evolutionary history of a tumor from molecular data. We review recent approaches to modeling the evolution of cancer, including population dynamics models of tumor initiation and progression, phylogenetic methods to model the evolutionary relationship between tumor subclones, and probabilistic graphical models to describe dependencies among mutations. Evolutionary modeling helps to understand how tumors arise and will also play an increasingly important prognostic role in predicting disease progression and the outcome of medical interventions, such as targeted therapy. © The Author(s) 2014. Published by Oxford University Press on behalf of the Society of Systematic Biologists.
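
    As a minimal illustration of the population dynamics models mentioned above, the sketch below simulates a discrete two-type branching process in which a faster-dividing subclone arises by mutation from the wild-type population. All rates are invented for illustration and do not come from the review.

```python
import random

random.seed(7)
b_wt, b_mut, d, mu = 0.50, 0.53, 0.47, 0.002   # per-step probabilities

wt, mut = 50, 0
for step in range(150):
    new_wt = new_mut = 0
    for _ in range(wt):                  # wild-type cells
        r = random.random()
        if r < b_wt:                     # division
            if random.random() < mu:     # one daughter acquires the driver mutation
                new_wt += 1
                new_mut += 1
            else:
                new_wt += 2
        elif r < b_wt + d:               # death
            continue
        else:                            # quiescent this step
            new_wt += 1
    for _ in range(mut):                 # mutant subclone divides faster
        r = random.random()
        if r < b_mut:
            new_mut += 2
        elif r < b_mut + d:
            continue
        else:
            new_mut += 1
    wt, mut = new_wt, new_mut
    if (step + 1) % 50 == 0:
        print(f"step {step + 1:3d}: wild-type {wt:6d}, mutant {mut:6d}")
```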

  5. Developing eThread pipeline using SAGA-pilot abstraction for large-scale structural bioinformatics.

    PubMed

    Ragothaman, Anjani; Boddu, Sairam Chowdary; Kim, Nayong; Feinstein, Wei; Brylinski, Michal; Jha, Shantenu; Kim, Joohyun

    2014-01-01

    While most computational annotation approaches are sequence-based, threading methods are becoming increasingly attractive because of predicted structural information that could uncover the underlying function. However, threading tools are generally compute-intensive, and even small genomes such as those of prokaryotes contain many thousands of protein sequences, prohibiting their application as a genome-wide structural systems biology tool. To leverage the utility of threading, we have developed a pipeline for eThread, a meta-threading protein structure modeling tool, that can use computational resources efficiently and effectively. We employ a pilot-based approach that supports seamless data and task-level parallelism and manages large variation in workload and computational requirements. Our scalable pipeline is deployed on Amazon EC2 and can efficiently select resources based upon task requirements. We present runtime analysis to characterize the computational complexity of eThread and the EC2 infrastructure. Based on the results, we suggest a pathway to an optimized solution with respect to metrics such as time-to-solution or cost-to-solution. Our eThread pipeline can scale to support a large number of sequences and is expected to be a viable solution for genome-scale structural bioinformatics and structure-based annotation, particularly amenable to small genomes such as those of prokaryotes. The developed pipeline is easily extensible to other types of distributed cyberinfrastructure.

  6. Developing eThread Pipeline Using SAGA-Pilot Abstraction for Large-Scale Structural Bioinformatics

    PubMed Central

    Ragothaman, Anjani; Feinstein, Wei; Jha, Shantenu; Kim, Joohyun

    2014-01-01

    While most computational annotation approaches are sequence-based, threading methods are becoming increasingly attractive because of predicted structural information that could uncover the underlying function. However, threading tools are generally compute-intensive, and even small genomes such as those of prokaryotes contain many thousands of protein sequences, prohibiting their application as a genome-wide structural systems biology tool. To leverage the utility of threading, we have developed a pipeline for eThread, a meta-threading protein structure modeling tool, that can use computational resources efficiently and effectively. We employ a pilot-based approach that supports seamless data and task-level parallelism and manages large variation in workload and computational requirements. Our scalable pipeline is deployed on Amazon EC2 and can efficiently select resources based upon task requirements. We present runtime analysis to characterize the computational complexity of eThread and the EC2 infrastructure. Based on the results, we suggest a pathway to an optimized solution with respect to metrics such as time-to-solution or cost-to-solution. Our eThread pipeline can scale to support a large number of sequences and is expected to be a viable solution for genome-scale structural bioinformatics and structure-based annotation, particularly amenable to small genomes such as those of prokaryotes. The developed pipeline is easily extensible to other types of distributed cyberinfrastructure. PMID:24995285

  7. The externally corrected coupled cluster approach with four- and five-body clusters from the CASSCF wave function.

    PubMed

    Xu, Enhua; Li, Shuhua

    2015-03-07

    An externally corrected CCSDt (coupled cluster with singles, doubles, and active triples) approach employing four- and five-body clusters from the complete active space self-consistent field (CASSCF) wave function (denoted as ecCCSDt-CASSCF) is presented. The quadruple and quintuple excitation amplitudes within the active space are extracted from the CASSCF wave function and then fed into the CCSDt-like equations, which can be solved in an iterative way as the standard CCSDt equations. With a size-extensive CASSCF reference function, the ecCCSDt-CASSCF method is size-extensive. When the CASSCF wave function is readily available, the computational cost of the ecCCSDt-CASSCF method scales as the popular CCSD method (if the number of active orbitals is small compared to the total number of orbitals). The ecCCSDt-CASSCF approach has been applied to investigate the potential energy surface for the simultaneous dissociation of two O-H bonds in H2O, the equilibrium distances and spectroscopic constants of 4 diatomic molecules (F2(+), O2(+), Be2, and NiC), and the reaction barriers for the automerization reaction of cyclobutadiene and the Cl + O3 → ClO + O2 reaction. In most cases, the ecCCSDt-CASSCF approach can provide better results than the CASPT2 (second order perturbation theory with a CASSCF reference function) and CCSDT methods.

  8. Computational evolution: taking liberties.

    PubMed

    Correia, Luís

    2010-09-01

    Evolution has, for a long time, inspired computer scientists to produce computer models mimicking its behavior. Evolutionary algorithm (EA) is one of the areas where this approach has flourished. EAs have been used to model and study evolution, but they have been especially developed for their aptitude as optimization tools for engineering. Developed models are quite simple in comparison with their natural sources of inspiration. However, since EAs run on computers, we have the freedom, especially in optimization models, to test approaches both realistic and outright speculative, from the biological point of view. In this article, we discuss different common evolutionary algorithm models, and then present some alternatives of interest. These include biologically inspired models, such as co-evolution and, in particular, symbiogenetics and outright artificial operators and representations. In each case, the advantages of the modifications to the standard model are identified. The other area of computational evolution, which has allowed us to study basic principles of evolution and ecology dynamics, is the development of artificial life platforms for open-ended evolution of artificial organisms. With these platforms, biologists can test theories by directly manipulating individuals and operators, observing the resulting effects in a realistic way. An overview of the most prominent of such environments is also presented. If instead of artificial platforms we use the real world for evolving artificial life, then we are dealing with evolutionary robotics (ERs). A brief description of this area is presented, analyzing its relations to biology. Finally, we present the conclusions and identify future research avenues in the frontier of computation and biology. Hopefully, this will help to draw the attention of more biologists and computer scientists to the benefits of such interdisciplinary research.

  9. Understanding sequence similarity and framework analysis between centromere proteins using computational biology.

    PubMed

    Doss, C George Priya; Chakrabarty, Chiranjib; Debajyoti, C; Debottam, S

    2014-11-01

    Open questions about recruitment pathways, cell cycle regulation mechanisms, spindle checkpoint assembly, and the chromosome segregation process place centromere proteins at the centre of attention in cancer research. With the databases now established, a range of computational platforms makes it possible to examine almost all of the physiological and biochemical evidence in disease-associated phenotypes. Using existing computational methods, we have utilized amino acid residues to understand the similarity within the evolutionary variance of different associated centromere proteins. This study of sequence similarity, protein-protein networking, co-expression analysis, and the evolutionary trajectory of centromere proteins will speed up the understanding of centromere biology and will create a road map for researchers initiating clinical sequencing work with centromere proteins.

  10. Shell Games.

    ERIC Educational Resources Information Center

    Atkinson, Bill

    1982-01-01

    The author critiques the program design and educational aspects of the Shell Games, a program developed by Apple Computer, Inc., which can be used by the teacher to design objective tests for adaptation to specific assessment needs. (For related articles, see EC 142 959-962.) (Author)

  11. 14 CFR 25.531 - Hull and main float takeoff condition.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... wing lift is assumed to be zero; and (b) A downward inertia load, corresponding to a load factor computed from the following formula, must be applied: EC28SE91.038 where— n=inertia load factor; C TO...

  12. 14 CFR 25.531 - Hull and main float takeoff condition.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... wing lift is assumed to be zero; and (b) A downward inertia load, corresponding to a load factor computed from the following formula, must be applied: EC28SE91.038 where— n=inertia load factor; C TO...

  13. 14 CFR 25.531 - Hull and main float takeoff condition.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... wing lift is assumed to be zero; and (b) A downward inertia load, corresponding to a load factor computed from the following formula, must be applied: EC28SE91.038 where— n=inertia load factor; C TO...

  14. 14 CFR 25.531 - Hull and main float takeoff condition.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... wing lift is assumed to be zero; and (b) A downward inertia load, corresponding to a load factor computed from the following formula, must be applied: EC28SE91.038 where— n=inertia load factor; C TO...

  15. Bipartite Graphs as Models of Population Structures in Evolutionary Multiplayer Games

    PubMed Central

    Peña, Jorge; Rochat, Yannick

    2012-01-01

    By combining evolutionary game theory and graph theory, “games on graphs” study the evolutionary dynamics of frequency-dependent selection in population structures modeled as geographical or social networks. Networks are usually represented by means of unipartite graphs, and social interactions by two-person games such as the famous prisoner’s dilemma. Unipartite graphs have also been used for modeling interactions going beyond pairwise interactions. In this paper, we argue that bipartite graphs are a better alternative to unipartite graphs for describing population structures in evolutionary multiplayer games. To illustrate this point, we make use of bipartite graphs to investigate, by means of computer simulations, the evolution of cooperation under the conventional and the distributed N-person prisoner’s dilemma. We show that several implicit assumptions arising from the standard approach based on unipartite graphs (such as the definition of replacement neighborhoods, the intertwining of individual and group diversity, and the large overlap of interaction neighborhoods) can have a large impact on the resulting evolutionary dynamics. Our work provides a clear example of the importance of construction procedures in games on graphs, of the suitability of bigraphs and hypergraphs for computational modeling, and of the importance of concepts from social network analysis such as centrality, centralization and bipartite clustering for the understanding of dynamical processes occurring on networked population structures. PMID:22970237
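
    A bipartite population structure of this kind is straightforward to set up computationally: one node set for individuals, one for groups, with an edge meaning "this individual takes part in this group" and payoffs accumulated per group game. The sketch below uses a standard public-goods formulation of the N-person prisoner's dilemma; the enhancement factor, cost, and sizes are illustrative choices, not the paper's parameters.

```python
import random

random.seed(0)
N_IND, N_GROUPS, K = 12, 4, 6   # K individuals sampled into each group
r, c = 3.0, 1.0                 # enhancement factor and contribution cost

individuals = list(range(N_IND))
groups = [random.sample(individuals, K) for _ in range(N_GROUPS)]  # bipartite edges
strategy = {i: random.random() < 0.5 for i in individuals}         # True = cooperate

payoff = {i: 0.0 for i in individuals}
for members in groups:                      # each group node hosts one public-goods game
    pot = sum(c for i in members if strategy[i]) * r
    share = pot / len(members)
    for i in members:
        payoff[i] += share - (c if strategy[i] else 0.0)

for i in individuals:
    role = "C" if strategy[i] else "D"
    print(f"ind {i:2d} ({role}): payoff {payoff[i]:+.2f}")
```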

  16. Cloud Computing: An Overview

    NASA Astrophysics Data System (ADS)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with the minimum resources, Internet service providers invented cloud computing. Within a few years, emerging cloud computing has become the hottest technology. From the publication of core papers by Google starting in 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to a public service, from a cost-saving tool to a revenue generator, and from ISPs to telecoms. This paper introduces the concept, history, pros and cons of cloud computing, as well as the value chain and standardization efforts.

  17. The Handicap Principle for Trust in Computer Security, the Semantic Web and Social Networking

    NASA Astrophysics Data System (ADS)

    Ma, Zhanshan (Sam); Krings, Axel W.; Hung, Chih-Cheng

    Communication is a fundamental function of life, and it exists in almost all living things: from single-cell bacteria to human beings. Communication, together with competition and cooperation, are three fundamental processes in nature. Computer scientists are familiar with the study of competition, or the 'struggle for life', through Darwin's evolutionary theory, or even evolutionary computing. They may be equally familiar with the study of cooperation or altruism through the Prisoner's Dilemma (PD) game. However, they are likely to be less familiar with the theory of animal communication. The objective of this article is three-fold: (i) To suggest that the study of animal communication, especially the honesty (reliability) of animal communication, in which some significant advances in behavioral biology have been achieved in the last three decades, should be on the verge of spawning important cross-disciplinary research similar to that generated by the study of cooperation with the PD game. One of the far-reaching advances in the field is marked by the publication of "The Handicap Principle: a Missing Piece of Darwin's Puzzle" by Zahavi (1997). The Handicap principle [34][35], which states that communication signals must be costly in some proper way to be reliable (honest), is best elucidated with evolutionary games, e.g., the Sir Philip Sidney (SPS) game [23]. Accordingly, we suggest that the Handicap principle may serve as a fundamental paradigm for trust research in computer science. (ii) To suggest to computer scientists that their expertise in modeling computer networks may help behavioral biologists in their study of the reliability of animal communication networks. This is largely due to the historical fact that, until the last decade, animal communication was studied with the dyadic paradigm (sender-receiver) rather than with the network paradigm. (iii) To pose several open questions, the answers to which may bring fresh insights to trust research in computer science, especially secure and resilient computing, the semantic web, and social networking. One important thread unifying the three aspects is evolutionary game theory modeling, or its extensions with survival analysis and agreement algorithms [19][20], which offer powerful game models for describing time-, space-, and covariate-dependent frailty (uncertainty and vulnerability) and deception (honesty).

  18. δ-Conotoxin SuVIA suggests an evolutionary link between ancestral predator defence and the origin of fish-hunting behaviour in carnivorous cone snails

    PubMed Central

    Jin, Ai-Hua; Israel, Mathilde R.; Inserra, Marco C.; Smith, Jennifer J.; Lewis, Richard J.; Alewood, Paul F.; Vetter, Irina; Dutertre, Sébastien

    2015-01-01

    Some venomous cone snails feed on small fishes using an immobilizing combination of synergistic venom peptides that target Kv and Nav channels. As part of this envenomation strategy, δ-conotoxins are potent ichthyotoxins that enhance Nav channel function. δ-Conotoxins belong to an ancient and widely distributed gene superfamily, but any evolutionary link from ancestral worm-eating cone snails to modern piscivorous species has not been elucidated. Here, we report the discovery of SuVIA, a potent vertebrate-active δ-conotoxin characterized from a vermivorous cone snail (Conus suturatus). SuVIA is equipotent at hNaV1.3, hNaV1.4 and hNaV1.6 with EC50s in the low nanomolar range. SuVIA also increased peak hNaV1.7 current by approximately 75% and shifted the voltage-dependence of activation to more hyperpolarized potentials from –15 mV to –25 mV, with little effect on the voltage-dependence of inactivation. Interestingly, the proximal venom gland expression and pain-inducing effect of SuVIA in mammals suggest that δ-conotoxins in vermivorous cone snails play a defensive role against higher order vertebrates. We propose that δ-conotoxins, which originally evolved in ancestral vermivorous cones to defend against larger predators, including fishes, have been repurposed to facilitate a shift to piscivorous behaviour, suggesting an unexpected underlying mechanism for this remarkable evolutionary transition. PMID:26156767

  19. Computer-Automated Evolution of Spacecraft X-Band Antennas

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Homby, Gregory S.; Linden, Derek S.

    2010-01-01

    A document discusses the use of computer-aided evolution in arriving at a design for X-band communication antennas for NASA's three Space Technology 5 (ST5) satellites, which were launched on March 22, 2006. Two evolutionary algorithms, incorporating different representations of the antenna design and different fitness functions, were used to automatically design and optimize an X-band antenna design. A set of antenna designs satisfying initial ST5 mission requirements was evolved by use of these algorithms. The two best antennas - one from each evolutionary algorithm - were built. During flight-qualification testing of these antennas, the mission requirements were changed. After minimal changes in the evolutionary algorithms - mostly in the fitness functions - new antenna designs satisfying the changed mission requirements were evolved; within one month of this change, two new antennas were designed, and prototypes were built and tested. One of these newly evolved antennas was approved for deployment on the ST5 mission, and flight-qualified versions of this design were built and installed on the spacecraft. At the time of writing the document, these antennas were the first computer-evolved hardware in outer space.

  20. Optimizing a reconfigurable material via evolutionary computation

    NASA Astrophysics Data System (ADS)

    Wilken, Sam; Miskin, Marc Z.; Jaeger, Heinrich M.

    2015-08-01

    Rapid prototyping by combining evolutionary computation with simulations is becoming a powerful tool for solving complex design problems in materials science. This method of optimization operates in a virtual design space that simulates potential material behaviors and after completion needs to be validated by experiment. However, in principle an evolutionary optimizer can also operate on an actual physical structure or laboratory experiment directly, provided the relevant material parameters can be accessed by the optimizer and information about the material's performance can be updated by direct measurements. Here we provide a proof of concept of such direct, physical optimization by showing how a reconfigurable, highly nonlinear material can be tuned to respond to impact. We report on an entirely computer controlled laboratory experiment in which a 6 × 6 grid of electromagnets creates a magnetic field pattern that tunes the local rigidity of a concentrated suspension of ferrofluid and iron filings. A genetic algorithm is implemented and tasked to find field patterns that minimize the force transmitted through the suspension. Searching within a space of roughly 10^10 possible configurations, after testing only 1500 independent trials the algorithm identifies an optimized configuration of layered rigid and compliant regions.
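
    The experiment's search loop can be sketched as a conventional genetic algorithm whose fitness call is a physical measurement. In the sketch below the measurement function is a synthetic placeholder; in the real setup it would energize the 6 × 6 magnet pattern, run an impact test, and return the transmitted force. Population size, rates, and operators are illustrative assumptions.

```python
import random

random.seed(4)
N, POP, GENS = 36, 20, 50          # 36 magnets; 2^36 ~ 7e10 on/off patterns

def measure_transmitted_force(pattern):
    # Placeholder for the instrument callback: apply the field pattern,
    # run an impact test, and return the measured transmitted force.
    return sum(pattern[i] * ((i % 6) - 2.5) ** 2 for i in range(N)) + random.random()

def tournament(pop, scores):
    a, b = random.sample(range(len(pop)), 2)
    return pop[a] if scores[a] < scores[b] else pop[b]   # minimize force

pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(POP)]
for gen in range(GENS):
    scores = [measure_transmitted_force(p) for p in pop]  # one trial per individual
    nxt = []
    while len(nxt) < POP:
        p1, p2 = tournament(pop, scores), tournament(pop, scores)
        cut = random.randrange(1, N)
        child = p1[:cut] + p2[cut:]                            # one-point crossover
        child = [g ^ (random.random() < 0.02) for g in child]  # bit-flip mutation
        nxt.append(child)
    pop = nxt

best = min(pop, key=measure_transmitted_force)
print("best pattern:", best)
```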

  1. Investigation on the Inertance Tubes of Pulse Tube Cryocooler Without Reservoir

    NASA Astrophysics Data System (ADS)

    Liu, Y. J.; Yang, L. W.; Liang, J. T.; Hong, G. T.

    2010-04-01

    Phase angle is of vital importance for high-efficiency pulse tube cryocoolers (PTCs). The inertance tube, as the main phase shifter, is useful for PTCs to obtain an appropriate phase angle. Experiments on inertance tubes without a reservoir are investigated under variable frequency, variable inertance tube length and diameter, and variable pressure amplitude. In addition, the authors used DeltaEC, a computer program for predicting the performance of low-amplitude thermoacoustic engines, to simulate the effects of inertance tubes without a reservoir. The comparison of experiments and theoretical simulations shows that the DeltaEC method is feasible and effective for directing and improving the design of inertance tubes.

  2. Evaluation of Generation Alternation Models in Evolutionary Robotics

    NASA Astrophysics Data System (ADS)

    Oiso, Masashi; Matsumura, Yoshiyuki; Yasuda, Toshiyuki; Ohkura, Kazuhiro

    For efficient implementation of Evolutionary Algorithms (EA) in a desktop grid computing environment, we propose a new generation alternation model called Grid-Oriented-Deletion (GOD) and compare it with conventional techniques. In previous research, generation alternation models have generally been evaluated using test functions, but their exploration performance on real problems such as Evolutionary Robotics (ER) has not yet been made clear. We therefore investigate the relationship between the exploration performance of an EA on an ER problem and its generation alternation model. We applied four generation alternation models to Evolutionary Multi-Robotics (EMR), a package-pushing problem, to investigate their exploration performance. The results show that GOD is more effective than the other conventional models.

  3. Scheduling Earth Observing Fleets Using Evolutionary Algorithms: Problem Description and Approach

    NASA Technical Reports Server (NTRS)

    Globus, Al; Crawford, James; Lohn, Jason; Morris, Robert; Clancy, Daniel (Technical Monitor)

    2002-01-01

    We describe work in progress concerning multi-instrument, multi-satellite scheduling. Most, although not all, Earth observing instruments currently in orbit are unique. In the relatively near future, however, we expect to see fleets of Earth observing spacecraft, many carrying nearly identical instruments. This presents a substantially new scheduling challenge. Inspired by successful commercial applications of evolutionary algorithms in scheduling domains, this paper presents the use of evolutionary algorithms to solve a set of Earth observing related model problems. Both the model problems and the software are described. Since the larger problems will require substantial computation and evolutionary algorithms are embarrassingly parallel, we discuss our parallelization techniques using dedicated and cycle-scavenged workstations.
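
    One common way to encode such scheduling problems for an EA, not necessarily the authors' formulation, is a priority permutation decoded greedily into satellite time windows. The sketch below uses order crossover to keep permutations valid; the request and window data are invented.

```python
import random

random.seed(1)
requests = [{"id": i, "dur": random.randint(5, 20)} for i in range(15)]
windows = [{"sat": s, "free": 60} for s in range(3)]   # minutes per satellite

def decode(order):
    """Greedy decoder: walk the priority order, book each request into the
    first satellite window that still fits. Fitness = requests scheduled."""
    free = [w["free"] for w in windows]
    scheduled = 0
    for idx in order:
        dur = requests[idx]["dur"]
        for s in range(len(free)):
            if free[s] >= dur:
                free[s] -= dur
                scheduled += 1
                break
    return scheduled

def crossover(p1, p2):          # order crossover (OX) preserves permutations
    a, b = sorted(random.sample(range(len(p1)), 2))
    hole = set(p1[a:b])
    rest = [g for g in p2 if g not in hole]
    return rest[:a] + p1[a:b] + rest[a:]

pop = [random.sample(range(15), 15) for _ in range(30)]
for gen in range(100):
    pop.sort(key=decode, reverse=True)
    elite = pop[:10]
    pop = elite + [crossover(random.choice(elite), random.choice(elite))
                   for _ in range(20)]

pop.sort(key=decode, reverse=True)
print("best schedule fits", decode(pop[0]), "of 15 requests")
```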

  4. Cloud BioLinux: pre-configured and on-demand bioinformatics computing for the genomics community.

    PubMed

    Krampis, Konstantinos; Booth, Tim; Chapman, Brad; Tiwari, Bela; Bicak, Mesude; Field, Dawn; Nelson, Karen E

    2012-03-19

    A steep drop in the cost of next-generation sequencing during recent years has made the technology affordable to the majority of researchers, but downstream bioinformatic analysis still poses a resource bottleneck for smaller laboratories and institutes that do not have access to substantial computational resources. Sequencing instruments are typically bundled with only the minimal processing and storage capacity required for data capture during sequencing runs. Given the scale of sequence datasets, scientific value cannot be obtained from acquiring a sequencer unless it is accompanied by an equal investment in informatics infrastructure. Cloud BioLinux is a publicly accessible Virtual Machine (VM) that enables scientists to quickly provision on-demand infrastructures for high-performance bioinformatics computing using cloud platforms. Users have instant access to a range of pre-configured command line and graphical software applications, including a full-featured desktop interface, documentation and over 135 bioinformatics packages for applications including sequence alignment, clustering, assembly, display, editing, and phylogeny. Each tool's functionality is fully described in the documentation directly accessible from the graphical interface of the VM. Besides the Amazon EC2 cloud, we have started instances of Cloud BioLinux on a private Eucalyptus cloud installed at the J. Craig Venter Institute, and demonstrated access to the bioinformatic tools interface through a remote connection to EC2 instances from a local desktop computer. Documentation for using Cloud BioLinux on EC2 is available from our project website, while a Eucalyptus cloud image and VirtualBox Appliance is also publicly available for download and use by researchers with access to private clouds. Cloud BioLinux provides a platform for developing bioinformatics infrastructures on the cloud. An automated and configurable process builds Virtual Machines, allowing the development of highly customized versions from a shared code base. This shared community toolkit enables application specific analysis platforms on the cloud by minimizing the effort required to prepare and maintain them.
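
    Provisioning a Cloud BioLinux-style instance on EC2 can be scripted with boto3. The AMI ID, key pair, and security group below are placeholders, not real identifiers; the actual Cloud BioLinux image IDs are published on the project website.

```python
import boto3

# Illustrative only: launch one VM from a Cloud BioLinux-style image.
ec2 = boto3.resource("ec2", region_name="us-east-1")
instances = ec2.create_instances(
    ImageId="ami-xxxxxxxx",           # placeholder: Cloud BioLinux AMI for your region
    InstanceType="m5.xlarge",         # sized for bioinformatics workloads
    KeyName="my-keypair",             # existing EC2 key pair for SSH access
    SecurityGroupIds=["sg-xxxxxxxx"], # must allow SSH (and VNC/NX for the desktop)
    MinCount=1,
    MaxCount=1,
)
instance = instances[0]
instance.wait_until_running()         # block until the VM is up
instance.reload()                     # refresh attributes such as the DNS name
print("connect with: ssh -i my-keypair.pem ubuntu@" + instance.public_dns_name)
```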

  5. Cloud BioLinux: pre-configured and on-demand bioinformatics computing for the genomics community

    PubMed Central

    2012-01-01

    Background A steep drop in the cost of next-generation sequencing during recent years has made the technology affordable to the majority of researchers, but downstream bioinformatic analysis still poses a resource bottleneck for smaller laboratories and institutes that do not have access to substantial computational resources. Sequencing instruments are typically bundled with only the minimal processing and storage capacity required for data capture during sequencing runs. Given the scale of sequence datasets, scientific value cannot be obtained from acquiring a sequencer unless it is accompanied by an equal investment in informatics infrastructure. Results Cloud BioLinux is a publicly accessible Virtual Machine (VM) that enables scientists to quickly provision on-demand infrastructures for high-performance bioinformatics computing using cloud platforms. Users have instant access to a range of pre-configured command line and graphical software applications, including a full-featured desktop interface, documentation and over 135 bioinformatics packages for applications including sequence alignment, clustering, assembly, display, editing, and phylogeny. Each tool's functionality is fully described in the documentation directly accessible from the graphical interface of the VM. Besides the Amazon EC2 cloud, we have started instances of Cloud BioLinux on a private Eucalyptus cloud installed at the J. Craig Venter Institute, and demonstrated access to the bioinformatic tools interface through a remote connection to EC2 instances from a local desktop computer. Documentation for using Cloud BioLinux on EC2 is available from our project website, while a Eucalyptus cloud image and VirtualBox Appliance is also publicly available for download and use by researchers with access to private clouds. Conclusions Cloud BioLinux provides a platform for developing bioinformatics infrastructures on the cloud. An automated and configurable process builds Virtual Machines, allowing the development of highly customized versions from a shared code base. This shared community toolkit enables application specific analysis platforms on the cloud by minimizing the effort required to prepare and maintain them. PMID:22429538

  6. Nanotube Heterojunctions and Endo-Fullerenes for Nanoelectronics

    NASA Technical Reports Server (NTRS)

    Srivastava, Deepak; Menon, M.; Andriotis, Antonis; Cho, K.; Park, Jun; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    Topics discussed include: (1) Light-Weight Multi-Functional Materials: Nanomechanics; Nanotubes and Composites; Thermal/Chemical/Electrical Characterization; (2) Biomimetic/Revolutionary Concepts: Evolutionary Computing and Sensing; Self-Heating Materials; (3) Central Computing System: Molecular Electronics; Materials for Quantum Bits; and (4) Molecular Machines.

  7. FOURIER ANALYSIS OF BLAZAR VARIABILITY: KLEIN–NISHINA EFFECTS AND THE JET SCATTERING ENVIRONMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finke, Justin D.; Becker, Peter A., E-mail: justin.finke@nrl.navy.mil, E-mail: pbecker@gmu.edu

    The strong variability of blazars can be characterized by power spectral densities (PSDs) and Fourier frequency-dependent time lags. In previous work, we created a new theoretical formalism for describing the PSDs and time lags produced via a combination of stochastic particle injection and emission via the synchrotron, synchrotron self-Compton, and external Compton (EC) processes. This formalism used the Thomson cross section and simple δ-function approximations to model the synchrotron and Compton emissivities. Here we expand upon this work, using the full Compton cross section and detailed and accurate emissivities. Our results indicate good agreement between the PSDs computed using the δ-function approximations and those computed using the accurate expressions, provided the observed photons are produced primarily by electrons with energies exceeding the lower limit of the injected particle population. Breaks are found in the PSDs at frequencies corresponding to the cooling timescales of the electrons primarily responsible for the observed emission, and the associated time lags are related to the difference in electron cooling timescales between the two energy channels, as expected. If the electron cooling timescales can be determined from the observed time lags and/or the observed EC PSDs, then one could in principle use the method developed here to determine the energy of the external seed photon source for EC, which is an important unsolved problem in blazar physics.
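
    The two observables discussed, PSDs and frequency-dependent time lags between energy channels, can be computed from evenly sampled light curves with a Fourier transform. The sketch below injects a simple constant shift between two synthetic channels; that shift, and the red-noise stand-in signal, are assumptions for illustration rather than the paper's physical model.

```python
import numpy as np

rng = np.random.default_rng(3)
n, dt, lag_bins = 4096, 1.0, 5

# Red-noise-like base signal: smoothed white noise
base = np.convolve(rng.normal(size=n + lag_bins), np.ones(50) / 50, mode="same")
hard = base[lag_bins:]                 # leading channel
soft = base[:n]                        # same signal, delayed by lag_bins samples

f = np.fft.rfftfreq(n, dt)[1:]         # drop the zero frequency
H = np.fft.rfft(hard - hard.mean())[1:]
S = np.fft.rfft(soft - soft.mean())[1:]

psd_hard = (2 * dt / n) * np.abs(H) ** 2        # periodogram normalization
cross = np.conj(H) * S                          # cross spectrum
time_lag = -np.angle(cross) / (2 * np.pi * f)   # positive: soft lags hard

print("PSD, first bins:", psd_hard[:3].round(3))
print("recovered lag at low frequencies:", time_lag[:4].round(2))
```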

  8. Impaired neurogenesis of the dentate gyrus is associated with pattern separation deficits: A computational study.

    PubMed

    Faghihi, Faramarz; Moustafa, Ahmed A

    2016-09-01

    The separation of input patterns received from the entorhinal cortex (EC) by the dentate gyrus (DG) is a well-known critical step of information processing in the hippocampus. Although the role of interneurons in the pattern separation efficiency of the DG is known theoretically, the balance of neurogenesis of excitatory neurons and interneurons, as well as its potential role in information processing in the DG, is not fully understood. In this work, we study the separation efficiency of the DG for different rates of neurogenesis of interneurons and excitatory neurons using a novel computational model in which we assume an increase in the synaptic efficacy between excitatory neurons and interneurons followed by its decay over time. Information processing in the EC and DG was simulated as information flow in a two-layer feed-forward neural network. The neurogenesis rate was modeled as the percentage of newborn neurons added to the neuronal population in each time bin. The results show an important role of an optimal neurogenesis rate of interneurons and excitatory neurons in the DG in the efficient separation of inputs from the EC in pattern separation tasks. The model predicts that any deviation from the optimal neurogenesis rates leads to correspondingly greater separation deficits in the DG, which impair its contribution to memory encoding.
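
    The model class described, a two-layer feed-forward EC-to-DG network, can be caricatured in a few lines: sparse winner-take-all coding in the DG reduces the overlap between similar EC inputs. Layer sizes, sparsity, and the overlap measure below are illustrative guesses, not the authors' parameters, and the sketch omits interneurons and neurogenesis entirely.

```python
import numpy as np

rng = np.random.default_rng(0)
N_EC, N_DG, ACTIVE = 100, 500, 25          # ~5% of DG cells fire

W = rng.normal(0, 1, (N_DG, N_EC))          # random feed-forward weights

def dg_response(ec_pattern):
    drive = W @ ec_pattern
    out = np.zeros(N_DG)
    out[np.argsort(drive)[-ACTIVE:]] = 1.0  # k-winners-take-all sparsening
    return out

def overlap(a, b):                          # cosine similarity of binary codes
    return (a @ b) / np.sqrt((a @ a) * (b @ b))

# Two EC patterns sharing ~80% of their active units
base = (rng.random(N_EC) < 0.2).astype(float)
noisy = base.copy()
flip = rng.choice(np.flatnonzero(base), size=int(0.2 * base.sum()), replace=False)
noisy[flip] = 0
noisy[rng.choice(np.flatnonzero(base == 0), size=len(flip), replace=False)] = 1

print("EC overlap :", round(overlap(base, noisy), 3))
print("DG overlap :", round(overlap(dg_response(base), dg_response(noisy)), 3))
```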

  9. Vision 2010: The Future of Higher Education Business and Learning Applications

    ERIC Educational Resources Information Center

    Carey, Patrick; Gleason, Bernard

    2006-01-01

    The global software industry is in the midst of a major evolutionary shift--one based on open computing--and this trend, like many transformative trends in technology, is being led by the IT staffs and academic computing faculty of the higher education industry. The elements of this open computing approach are open source, open standards, open…

  10. Multi-Objective UAV Mission Planning Using Evolutionary Computation

    DTIC Science & Technology

    2008-03-01

    [Report body not recoverable from this record; the surviving fragments include a figure caption, "Crowding distance calculation. Dark points are non-dominated solutions," and a note that SPEA2 was developed by Zitzler [64] as an improvement to the original SPEA algorithm [65].]
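
    The surviving fragment references the crowding-distance measure used by NSGA-II-style multi-objective EAs; a standard implementation is sketched below.

```python
def crowding_distance(front):
    """front: list of objective vectors (tuples), all mutually non-dominated.
    Returns one distance per point; boundary points get infinity so they are
    always preferred, preserving the spread of the Pareto front."""
    n = len(front)
    if n == 0:
        return []
    n_obj = len(front[0])
    dist = [0.0] * n
    for m in range(n_obj):
        order = sorted(range(n), key=lambda i: front[i][m])
        lo, hi = front[order[0]][m], front[order[-1]][m]
        dist[order[0]] = dist[order[-1]] = float("inf")
        if hi == lo:
            continue
        for k in range(1, n - 1):
            i = order[k]
            dist[i] += (front[order[k + 1]][m] - front[order[k - 1]][m]) / (hi - lo)
    return dist

front = [(1.0, 9.0), (2.0, 6.0), (4.0, 4.0), (7.0, 2.0), (9.0, 1.0)]
print(crowding_distance(front))
```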

  11. An Evolutionary Algorithm to Generate Ellipsoid Detectors for Negative Selection

    DTIC Science & Technology

    2005-03-21

    [Report body not recoverable from this record; the surviving fragments note that the affinity between an antibody and an antigen is a function of several processes, including electrostatic interactions, hydrogen bonding, and van der Waals interactions [20].]

  12. Association between left atrial phasic conduit function and early atrial fibrillation recurrence in patients undergoing electrical cardioversion.

    PubMed

    Degiovanni, Anna; Boggio, Enrico; Prenna, Eleonora; Sartori, Chiara; De Vecchi, Federica; Marino, Paolo N

    2018-04-01

    Diastolic dysfunction promotes atrial fibrillation (AF) inducing left atrial (LA) remodeling, with chamber dilation and fibrosis. Predominance of LA phasic conduit (LAC) function should reflect not only chamber alterations but also underlying left ventricular (LV) filling impairment. Thus, LAC was tested as possible predictor of early AF relapse after electrical cardioversion (EC). 96 consecutive patients, who underwent EC for persistent non-valvular AF, were prospectively enrolled. Immediately after successful EC (3 h ± 15 min), an echocardiographic apical four-chamber view was acquired with transmitral velocities, annular tissue Doppler and simultaneous LV and LA three-dimensional full-volume datasets. Then, from LA-LV volumetric curves we computed LAC as: [(LV maximum - LV minimum) - (LA maximum - LA minimum) volume], expressed as % LV stroke volume. LA pump, immediately post-EC, was assumed and verified as being negligible. Sinus rhythm persistence at 1 month was checked with ECG-Holter monitoring. At 1 month 62 patients were in sinus rhythm and 34 in AF. AF patients presented pre-EC higher E/é values (p = 0.012), no major LA volume differences (p = NS), but a stiffer LV cavity (p = 0.012) for a comparable LV capacitance (p = 0.461). Conduit contributed more (p < 0.001) to LV stroke volume in AF subpopulation. Multiple regression revealed LAC as the most significant AF predictor (p = 0.013), even after correction for biometric characteristics and pharmacotherapy (p = 0.008). Our data suggest that LAC larger contribution to LV filling soon after EC reflects LA-LV stiffening, which skews atrioventricular interaction leading to AF perpetuation and makes conduit dominance a powerful predictor of early AF recurrence.
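
    The conduit formula quoted in the abstract translates directly into code: conduit volume = (LV max - LV min) - (LA max - LA min), expressed as a percentage of LV stroke volume. The example volumes below are illustrative values, not patient data.

```python
def lac_percent(lv_max, lv_min, la_max, la_min):
    stroke = lv_max - lv_min                 # LV stroke volume (mL)
    conduit = stroke - (la_max - la_min)     # blood passing straight through the LA
    return 100.0 * conduit / stroke

print(f"LAC = {lac_percent(lv_max=130, lv_min=60, la_max=70, la_min=45):.1f}%"
      " of LV stroke volume")
```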

  13. DISSPLA plotting routines for the G-189A EC/LS computer program

    NASA Technical Reports Server (NTRS)

    Simpson, C. D.

    1982-01-01

    Data from a G-189A execution are formatted and plotted; the plotting may be done at the time of program execution. The DISSPLA plot packages are used, and the user has the choice of FR80 or TEKTRONIX output.

  14. Measuring Evapotranspiration in Urban Irrigated Lawns in Two Kansas Cities

    NASA Astrophysics Data System (ADS)

    Shonkwiler, K. B.; Bremer, D.; Ham, J. M.

    2011-12-01

    Conservation of water is becoming increasingly critical in many metropolitan areas. The use of automated irrigation systems for the maintenance of lawns and landscapes is rising, and these systems are typically maladjusted to apply more water than necessary, resulting in water wastage. Providing accurate estimates of actual lawn water use may assist urbanites in conserving water through better adjustment of automatic irrigation systems. Micrometeorological methods may help determine actual lawn water use by measuring evapotranspiration (ET) from urban lawns. From April to August 2011, four small tripod-mounted weather stations ("tripods"; five were used in total) were deployed in twelve residential landscapes in the Kansas cities of Manhattan (MHK) and Wichita (ICT) in the USA (six properties in each city). Each tripod was instrumented to estimate reference crop evapotranspiration (ETo) via the FAO-56 method. During tripod deployment in residential lawns, actual evapotranspiration (ETactual) was measured nearby using a stationary, trailer-mounted eddy covariance (EC) station. The EC station sampled well-watered turf at the K-State Rocky Ford Turfgrass Center within 5 km of the study properties in MHK, and was also deployed at a commercial sod farm 15 - 40 km from the study residences in the greater ICT metro area. The fifth tripod was deployed in the source area of the EC station to estimate ETo in conjunction with the tripods in the lawns (i.e., to serve as a reference). Data from EC allowed for computation of a so-called lawn coefficient (Kc) by determining the ratio of ETo from the tripods in residential lawns to ETo from the EC station (ETo,EC); hence, Kc = ETo,tripod / ETo,EC. Using this method, ETactual can be estimated for individual tripods within a lawn. The data suggest that it may be more accurate to quantify ET within individual lawns by microclimate (i.e., to determine coefficients for "shaded" and "open/unshaded" portions of a lawn). By finding microclimate coefficients, estimates of ETactual for individual lawns can be tailored to the specific characteristics of each property.
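
    The lawn-coefficient arithmetic described above is simple enough to show directly. The daily ET totals below are made up, and scaling the EC-measured actual ET by Kc to estimate lawn ET is one plausible reading of the method, not a detail stated in the abstract.

```python
eto_tripod   = 4.2   # FAO-56 reference ET at a tripod in a residential lawn (mm/day)
eto_ec       = 5.0   # reference ET at the eddy-covariance (EC) site (mm/day)
et_actual_ec = 4.4   # actual ET measured by the EC station over watered turf (mm/day)

kc = eto_tripod / eto_ec          # lawn coefficient: Kc = ETo,tripod / ETo,EC
et_lawn = kc * et_actual_ec       # scaled estimate of actual lawn ET
print(f"Kc = {kc:.2f}, estimated lawn ETactual = {et_lawn:.2f} mm/day")
```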

  15. Cogeneration Technology Alternatives Study (CTAS). Volume 6: Computer data. Part 2: Residual-fired nocogeneration process boiler

    NASA Technical Reports Server (NTRS)

    Knightly, W. F.

    1980-01-01

    About fifty industrial processes from the largest energy consuming sectors were used as a basis for matching a similar number of energy conversion systems that are considered candidates that could be made available in the 1985 to 2000 time period. The sectors considered included food, textiles, lumber, paper, chemicals, petroleum, glass, and primary metals. The energy conversion systems included steam and gas turbines, diesels, thermionics, stirling, closed cycle and steam injected gas turbines, and fuel cells. Fuels considered were coal, both coal and petroleum based residual and distillate liquid fuels, and low Btu gas obtained through the on site gasification of coal. Computer generated reports of the fuel consumption and savings, capital costs, economics and emissions of the cogeneration energy conversion systems (ECS's) heat and power matched to the individual industrial processes are presented. National fuel and emissions savings are also reported for each ECS assuming it alone is implemented. Two nocogeneration base cases are included: coal fired and residual fired process boilers.

  16. Cogeneration Technology Alternatives Study (CTAS). Volume 6: Computer data. Part 1: Coal-fired nocogeneration process boiler, section A

    NASA Technical Reports Server (NTRS)

    Knightly, W. F.

    1980-01-01

    About fifty industrial processes from the largest energy consuming sectors were used as a basis for matching a similar number of energy conversion systems that are considered candidates that could be made available in the 1985 to 2000 time period. The sectors considered included food, textiles, lumber, paper, chemicals, petroleum, glass, and primary metals. The energy conversion systems included steam and gas turbines, diesels, thermionics, stirling, closed cycle and steam injected gas turbines, and fuel cells. Fuels considered were coal, both coal and petroleum based residual and distillate liquid fuels, and low Btu gas obtained through the on site gasification of coal. Computer generated reports of the fuel consumption and savings, capital costs, economics and emissions of the cogeneration energy conversion systems (ECS's) heat and power matched to the individual industrial processes are presented. National fuel and emissions savings are also reported for each ECS assuming it alone is implemented. Two nocogeneration base cases are included: coal fired and residual fired process boilers.

  17. Cogeneration Technology Alternatives Study (CTAS). Volume 6: Computer data. Part 1: Coal-fired nocogeneration process boiler, section B

    NASA Technical Reports Server (NTRS)

    Knightly, W. F.

    1980-01-01

    About fifty industrial processes from the largest energy consuming sectors were used as a basis for matching a similar number of energy conversion systems that are considered candidates that could be made available in the 1985 to 2000 time period. The sectors considered included food, textiles, lumber, paper, chemicals, petroleum, glass, and primary metals. The energy conversion systems included steam and gas turbines, diesels, thermionics, stirling, closed cycle and steam injected gas turbines, and fuel cells. Fuels considered were coal, both coal and petroleum based residual and distillate liquid fuels, and low Btu gas obtained through the on site gasification of coal. Computer generated reports of the fuel consumption and savings, capital costs, economics and emissions of the cogeneration energy conversion systems (ECS's) heat and power matched to the individual industrial processes are presented. National fuel and emissions savings are also reported for each ECS assuming it alone is implemented. Two nocogeneration base cases are included: coal fired and residual fired process boilers.

  18. Evolutionary Models for Simple Biosystems

    NASA Astrophysics Data System (ADS)

    Bagnoli, Franco

    The concept of evolutionary development of structures constituted a real revolution in biology: it made it possible to understand how the very complex structures of life can arise in an out-of-equilibrium system. The investigation of such systems has shown that, indeed, systems under a flux of energy or matter can self-organize into complex patterns: think, for instance, of Rayleigh-Bénard convection, Liesegang rings, or the patterns formed by granular systems under shear. Following this line, one could regard life as a state of matter characterized by the slow, continuous process that we call evolution. In this paper we try to identify the organizational level of life, which spans several orders of magnitude from the elementary constituents to whole ecosystems. Although similar structures can be found in other contexts, such as ideas (memes) in neural systems and self-replicating elements (computer viruses, worms, etc.) in computer systems, we concentrate on biological evolutionary structures and try to highlight the role and the emergence of network structure in such systems.

  19. Artificial intelligence in peer review: How can evolutionary computation support journal editors?

    PubMed Central

    Fronczak, Piotr; Fronczak, Agata; Ausloos, Marcel; Nedic, Olgica

    2017-01-01

    With the volume of manuscripts submitted for publication growing every year, the deficiencies of peer review (e.g. long review times) are becoming more apparent. Editorial strategies, sets of guidelines designed to speed up the process and reduce editors’ workloads, are treated as trade secrets by publishing houses and are not shared publicly. To improve the effectiveness of their strategies, editors in small publishing groups are faced with undertaking an iterative trial-and-error approach. We show that Cartesian Genetic Programming, a nature-inspired evolutionary algorithm, can dramatically improve editorial strategies. The artificially evolved strategy reduced the duration of the peer review process by 30%, without increasing the pool of reviewers (in comparison to a typical human-developed strategy). Evolutionary computation has typically been used in technological processes or biological ecosystems. Our results demonstrate that genetic programs can improve real-world social systems that are usually much harder to understand and control than physical systems. PMID:28931033
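
    As an illustration of the kind of algorithm involved, below is a minimal, generic Cartesian Genetic Programming (CGP) sketch in Python. The study's actual encoding of editorial strategies, its fitness function and its parameters are not reproduced here, so the toy symbolic-regression objective is purely illustrative:

      import random

      # Node primitives; each takes two arguments.
      PRIMS = [lambda a, b: a + b, lambda a, b: a - b,
               lambda a, b: a * b, lambda a, b: a if a > b else b]
      N_IN, N_NODES = 2, 10  # two program inputs, ten internal nodes

      def random_genome():
          nodes = []
          for i in range(N_NODES):
              max_src = N_IN + i  # a node may read the inputs or earlier nodes
              nodes.append((random.randrange(len(PRIMS)),
                            random.randrange(max_src), random.randrange(max_src)))
          return nodes, random.randrange(N_IN + N_NODES)  # plus an output gene

      def evaluate(genome, inputs):
          nodes, out = genome
          vals = list(inputs)
          for f, a, b in nodes:
              vals.append(PRIMS[f](vals[a], vals[b]))
          return vals[out]

      def mutate(genome, rate=0.1):
          nodes, out = genome
          new = []
          for i, (f, a, b) in enumerate(nodes):
              max_src = N_IN + i
              if random.random() < rate: f = random.randrange(len(PRIMS))
              if random.random() < rate: a = random.randrange(max_src)
              if random.random() < rate: b = random.randrange(max_src)
              new.append((f, a, b))
          if random.random() < rate:
              out = random.randrange(N_IN + N_NODES)
          return new, out

      # (1+4) evolution strategy, the loop conventionally paired with CGP.
      samples = [(x, y) for x in range(-3, 4) for y in range(-3, 4)]
      error = lambda g: sum(abs(evaluate(g, s) - (s[0] * s[1] + s[0]))
                            for s in samples)  # toy target: x*y + x
      parent = random_genome()
      for _ in range(300):
          pool = [parent] + [mutate(parent) for _ in range(4)]
          parent = min(pool, key=error)
      print("best error:", error(parent))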

  20. [A case of eosinophilic cystitis mimicking an invasive bladder cancer].

    PubMed

    Okazaki, Satoshi; Hori, Jun-Ichi; Kita, Masafumi; Yamaguchi, Satoshi; Kawakami, Norihiro; Kakizaki, Hidehiro

    2014-12-01

    A 60-year-old woman was referred to our hospital because of gross hematuria, right lumbar pain and lower abdominal pain. Computed tomography (CT) scan revealed hydronephrosis of the right kidney, irregular bladder wall thickening at the right lateral and posterior portion, and external iliac lymph node swelling on the right side. Laboratory data revealed disseminated intravascular coagulation syndrome (DIC) and eosinophilia. Because she developed a high fever caused by acute obstructive pyelonephritis of the right kidney, a percutaneous nephrostomy was placed and therapy for DIC was initiated. Pathological examination of transurethral resection of bladder tumor specimens, obtained twice, showed no malignancy but inflammatory infiltration by many eosinophils, leading to the diagnosis of eosinophilic cystitis (EC). We considered the possibility of an allergic reaction to the drugs she was taking as the etiology of EC and discontinued all drugs. Although the eosinophilia resolved afterward, she then developed brain infarction, followed by cerebral hemorrhage. She was transferred to a rehabilitation hospital for long-term care. A CT scan performed 4 months after the initial presentation showed resolution of the hydronephrosis of the right kidney and of the external iliac lymph node swelling, and improvement of the bladder wall thickness. Hydronephrosis of the right kidney has not recurred after removal of the nephrostomy catheter. EC is a rare condition that can mimic an invasive bladder cancer. EC should be considered if a bladder tumor is associated with eosinophilia. Therapeutic consideration for thromboembolic events should be made in patients with EC.

  1. How do operating conditions affect As(III) removal by iron electrocoagulation?

    PubMed

    Delaire, Caroline; Amrose, Susan; Zhang, Minghui; Hake, James; Gadgil, Ashok

    2017-04-01

    Iron electrocoagulation (Fe-EC) has been shown to effectively remove arsenic from contaminated groundwater at low cost and has the potential to improve access to safe drinking water for millions of people. Understanding how operating conditions, such as the Fe dosage rate and the O2 recharge rate, affect arsenic removal at different pH values is crucial to maximize the performance of Fe-EC under economic constraints. In this work, we improved upon an existing computational model to investigate the combined effects of pH, Fe dosage rate, and O2 recharge rate on arsenic removal in Fe-EC. We showed that the impact of the Fe dosage rate strongly depends on pH and on the O2 recharge rate, which has important practical implications. We identified the process limiting arsenic removal (As(III) oxidation versus As(V) adsorption) at different pH values, which allowed us to interpret the effect of operating conditions on Fe-EC performance. Finally, we assessed the robustness of the trends predicted by the model, which assumes a constant pH, against lab experiments reproducing more realistic conditions where pH is allowed to drift during treatment as a result of equilibration with atmospheric CO2. Our results provide a nuanced understanding of how operating conditions impact arsenic removal by Fe-EC and can inform decisions regarding the operation of this technology in a range of groundwaters. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. S1P1 inhibits sprouting angiogenesis during vascular development.

    PubMed

    Ben Shoham, Adi; Malkinson, Guy; Krief, Sharon; Shwartz, Yulia; Ely, Yona; Ferrara, Napoleone; Yaniv, Karina; Zelzer, Elazar

    2012-10-01

    Coordination between the vascular system and forming organs is essential for proper embryonic development. The vasculature expands by sprouting angiogenesis, during which tip cells form filopodia that incorporate into capillary loops. Although several molecules, such as vascular endothelial growth factor A (Vegfa), are known to induce sprouting, the mechanism that terminates this process to ensure neovessel stability is still unknown. Sphingosine-1-phosphate receptor 1 (S1P(1)) has been shown to mediate interaction between endothelial and mural cells during vascular maturation. In vitro studies have identified S1P(1) as a pro-angiogenic factor. Here, we show that S1P(1) acts as an endothelial cell (EC)-autonomous negative regulator of sprouting angiogenesis during vascular development. Severe aberrations in vessel size and excessive sprouting found in limbs of S1P(1)-null mouse embryos before vessel maturation imply a previously unknown, mural cell-independent role for S1P(1) as an anti-angiogenic factor. A similar phenotype observed when S1P(1) expression was blocked specifically in ECs indicates that the effect of S1P(1) on sprouting is EC-autonomous. Comparable vascular abnormalities in S1p(1) knockdown zebrafish embryos suggest cross-species evolutionary conservation of this mechanism. Finally, genetic interaction between S1P(1) and Vegfa suggests that these factors interplay to regulate vascular development, as Vegfa promotes sprouting whereas S1P(1) inhibits it to prevent excessive sprouting and fusion of neovessels. More broadly, because S1P, the ligand of S1P(1), is blood-borne, our findings suggest a new mode of regulation of angiogenesis, whereby blood flow closes a negative feedback loop that inhibits sprouting angiogenesis once the vascular bed is established and functional.

  3. Nonspinning numerical relativity waveform surrogates: assessing the model

    NASA Astrophysics Data System (ADS)

    Field, Scott; Blackman, Jonathan; Galley, Chad; Scheel, Mark; Szilagyi, Bela; Tiglio, Manuel

    2015-04-01

    Recently, multi-modal gravitational waveform surrogate models have been built directly from data numerically generated by the Spectral Einstein Code (SpEC). I will describe ways in which the surrogate model error can be quantified. This task, in turn, requires (i) characterizing differences between waveforms computed by SpEC with those predicted by the surrogate model and (ii) estimating errors associated with the SpEC waveforms from which the surrogate is built. Both pieces can have numerous sources of numerical and systematic errors. We make an attempt to study the most dominant error sources and, ultimately, the surrogate model's fidelity. These investigations yield information about the surrogate model's uncertainty as a function of time (or frequency) and parameter, and could be useful in parameter estimation studies which seek to incorporate model error. Finally, I will conclude by comparing the numerical relativity surrogate model to other inspiral-merger-ringdown models. A companion talk will cover the building of multi-modal surrogate models.
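
    As a sketch of one ingredient of such comparisons, the overlap and mismatch between two waveforms sampled on a common time grid can be computed as below; a real analysis would weight the inner product by a detector noise curve and maximize over time and phase shifts, which this toy numpy fragment omits:

      import numpy as np

      def overlap(h1, h2):
          # Flat-noise (white) inner product between two waveforms.
          inner = lambda a, b: np.real(np.vdot(a, b))
          return inner(h1, h2) / np.sqrt(inner(h1, h1) * inner(h2, h2))

      def mismatch(h1, h2):
          return 1.0 - overlap(h1, h2)

      t = np.linspace(0.0, 1.0, 4096)
      h_nr = np.sin(2 * np.pi * 30 * t**2)          # stand-in for a SpEC waveform
      h_sur = np.sin(2 * np.pi * 30 * t**2 + 0.01)  # stand-in for the surrogate
      print("mismatch = %.2e" % mismatch(h_nr, h_sur))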

  4. Embodied cognition for autonomous interactive robots.

    PubMed

    Hoffman, Guy

    2012-10-01

    In the past, notions of embodiment have been applied to robotics mainly in the realm of very simple robots, and supporting low-level mechanisms such as dynamics and navigation. In contrast, most human-like, interactive, and socially adept robotic systems turn away from embodiment and use amodal, symbolic, and modular approaches to cognition and interaction. At the same time, recent research in Embodied Cognition (EC) is spanning an increasing number of complex cognitive processes, including language, nonverbal communication, learning, and social behavior. This article suggests adopting a modern EC approach for autonomous robots interacting with humans. In particular, we present three core principles from EC that may be applicable to such robots: (a) modal perceptual representation, (b) action/perception and action/cognition integration, and (c) a simulation-based model of top-down perceptual biasing. We describe a computational framework based on these principles, and its implementation on two physical robots. This could provide a new paradigm for embodied human-robot interaction based on recent psychological and neurological findings. Copyright © 2012 Cognitive Science Society, Inc.

  5. Executive control systems in the engineering design environment. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Hurst, P. W.

    1985-01-01

    An executive control system (ECS) is a software structure for unifying various applications codes into a comprehensive system. It provides a library of applications, a uniform access method through a central user interface, and a data management facility. A survey of twenty-four executive control systems designed to unify various CAD/CAE applications for use in diverse engineering design environments within government and industry was conducted. The goals of this research were to establish system requirements, to survey state-of-the-art architectural design approaches, and to provide an overview of the historical evolution of these systems. Foundations for design are presented, including environmental settings, system requirements, major architectural components, and a system classification scheme based on knowledge of the supported engineering domain(s). An overview of the design approaches used in developing the major architectural components of an ECS is presented, with examples taken from the surveyed systems. Attention is drawn to four major areas of ECS development: interdisciplinary usage, standardization, knowledge utilization, and computer science technology transfer.

  6. Simulation studies on the standing and traveling wave thermoacoustic prime movers

    NASA Astrophysics Data System (ADS)

    Skaria, Mathew; Rasheed, K. K. Abdul; Shafi, K. A.; Kasthurirengan, S.; Behera, Upendra

    2014-01-01

    Thermoacoustic systems have been a focus of recent research due to their structural simplicity, their high reliability owing to the absence of moving parts, and their ability to be driven by low-grade energy sources such as fuel, gas, solar energy, waste heat, etc. There has been extensive research on both standing wave and traveling wave systems. Towards the development of such systems, simulations can be carried out by several methods, such as (a) solving the energy equation, (b) the enthalpy flow model, (c) DeltaEC, a free software package available from LANL, USA, and (d) Computational Fluid Dynamics (CFD). We present here simulation studies of standing wave and traveling wave thermoacoustic prime movers using CFD and DeltaEC. The CFD analysis is carried out using Fluent 6.3.26, incorporating the necessary boundary conditions with different working fluids at different operating pressures. The results obtained by CFD are compared with those obtained using DeltaEC. The CFD simulation of a thermoacoustically driven refrigerator is also presented.

  7. Simulation studies on the standing and traveling wave thermoacoustic prime movers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skaria, Mathew; Rasheed, K. K. Abdul; Shafi, K. A.

    Thermoacoustic systems have been a focus of recent research due to their structural simplicity, their high reliability owing to the absence of moving parts, and their ability to be driven by low-grade energy sources such as fuel, gas, solar energy, waste heat, etc. There has been extensive research on both standing wave and traveling wave systems. Towards the development of such systems, simulations can be carried out by several methods, such as (a) solving the energy equation, (b) the enthalpy flow model, (c) DeltaEC, a free software package available from LANL, USA, and (d) Computational Fluid Dynamics (CFD). We present here simulation studies of standing wave and traveling wave thermoacoustic prime movers using CFD and DeltaEC. The CFD analysis is carried out using Fluent 6.3.26, incorporating the necessary boundary conditions with different working fluids at different operating pressures. The results obtained by CFD are compared with those obtained using DeltaEC. The CFD simulation of a thermoacoustically driven refrigerator is also presented.

  8. Marr's levels and the minimalist program.

    PubMed

    Johnson, Mark

    2017-02-01

    A simple change to a cognitive system at Marr's computational level may entail complex changes at the other levels of description of the system. The implementational-level complexity of a change, rather than its computational-level complexity, may be more closely related to the plausibility of a discrete evolutionary event causing that change. Thus the formal complexity of a change at the computational level may not be a good guide to the plausibility of an evolutionary event introducing that change. For example, while the Minimalist Program's Merge is a simple formal operation (Berwick & Chomsky, 2016), the computational mechanisms required to implement the language it generates (e.g., to parse the language) may be considerably more complex. This has implications for the theory of grammar: theories of grammar which involve several kinds of syntactic operations may be no less evolutionarily plausible than a theory of grammar that involves only one. A deeper understanding of human language at the algorithmic and implementational levels could strengthen the Minimalist Program's account of the evolution of language.

  9. Soft computing approach to 3D lung nodule segmentation in CT.

    PubMed

    Badura, P; Pietka, E

    2014-10-01

    This paper presents a novel, multilevel approach to the segmentation of various types of pulmonary nodules in computed tomography studies. It is based on two branches of computational intelligence: fuzzy connectedness (FC) and evolutionary computation. First, the image and auxiliary data are prepared for the 3D FC analysis during the first stage of the algorithm - mask generation. Its main goal is to handle specific types of nodules connected to the pleura or vessels. It consists of basic image processing operations as well as dedicated routines for the specific cases of nodules. The evolutionary computation is performed on the image and seed points in order to shorten the FC analysis and improve its accuracy. After the FC application, the remaining vessels are removed during the postprocessing stage. The method has been validated using the first dataset of studies acquired and described by the Lung Image Database Consortium (LIDC) and by its latest release - the LIDC-IDRI (Image Database Resource Initiative) database. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. BioVLAB-MMIA: a cloud environment for microRNA and mRNA integrated analysis (MMIA) on Amazon EC2.

    PubMed

    Lee, Hyungro; Yang, Youngik; Chae, Heejoon; Nam, Seungyoon; Choi, Donghoon; Tangchaisin, Patanachai; Herath, Chathura; Marru, Suresh; Nephew, Kenneth P; Kim, Sun

    2012-09-01

    MicroRNAs, by regulating the expression of hundreds of target genes, play critical roles in developmental biology and the etiology of numerous diseases, including cancer. As a vast amount of microRNA expression profile data are now publicly available, the integration of microRNA expression data sets with gene expression profiles is a key problem in life science research. However, genome-wide microRNA-mRNA (gene) integration currently requires sophisticated, high-end informatics tools and significant expertise in bioinformatics and computer science to carry out the complex integration analysis. In addition, increased computing infrastructure capabilities are essential in order to accommodate large data sets. In this study, we have extended the BioVLAB cloud workbench to develop an environment for the integrated analysis of microRNA and mRNA expression data, named BioVLAB-MMIA. The workbench facilitates computations on the Amazon EC2 and S3 resources orchestrated by the XBaya Workflow Suite. The advantages of BioVLAB-MMIA over the web-based MMIA system include: 1) it is readily expanded as new computational tools become available; 2) it is easily modifiable by re-configuring graphic icons in the workflow; 3) on-demand cloud computing resources can be used on an "as needed" basis; and 4) distributed orchestration supports complex and long-running workflows asynchronously. We believe that BioVLAB-MMIA will be an easy-to-use computing environment for researchers who plan to perform genome-wide microRNA-mRNA (gene) integrated analysis tasks.
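
    As a minimal illustration of the kind of on-demand resource provisioning involved (not the actual BioVLAB/XBaya orchestration code), an EC2 instance can be launched programmatically with the boto3 library; the AMI ID and key name below are placeholders:

      import boto3

      # Launch one on-demand worker node in EC2.
      ec2 = boto3.resource("ec2", region_name="us-east-1")
      instances = ec2.create_instances(
          ImageId="ami-0123456789abcdef0",  # hypothetical machine image
          InstanceType="t3.medium",
          MinCount=1,
          MaxCount=1,
          KeyName="my-key-pair",            # hypothetical key pair
      )
      print("launched instance:", instances[0].id)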

  11. Applications of genetic programming in cancer research.

    PubMed

    Worzel, William P; Yu, Jianjun; Almal, Arpit A; Chinnaiyan, Arul M

    2009-02-01

    The theory of Darwinian evolution is a fundamental keystone of modern biology. Late in the last century, computer scientists began adapting its principles, in particular natural selection, to complex computational challenges, leading to the emergence of evolutionary algorithms. The conceptual model of selective pressure and recombination in evolutionary algorithms allows scientists to efficiently search high-dimensional spaces for solutions to complex problems. In the last decade, genetic programming has been developed and extensively applied to the analysis of molecular data to classify cancer subtypes and characterize the mechanisms of cancer pathogenesis and development. This article reviews current successes using genetic programming and discusses its potential impact on cancer research and treatment in the near future.
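
    To make the notions of selective pressure and recombination concrete, here is a self-contained toy genetic algorithm that evolves a binary gene-selection mask over synthetic expression data; it is a deliberately simplified stand-in for the genetic programming systems reviewed in the article:

      import random

      random.seed(1)
      # Toy "expression matrix": 40 samples x 12 genes; labels depend on genes 2 and 7.
      DATA = [[random.gauss(0, 1) for _ in range(12)] for _ in range(40)]
      LABELS = [1 if row[2] + row[7] > 0 else 0 for row in DATA]

      def accuracy(mask):
          # Classify by the sign of the sum of the selected genes (a crude
          # stand-in for an evolved classifier).
          hits = sum(int((sum(v for v, m in zip(row, mask) if m) > 0) == bool(y))
                     for row, y in zip(DATA, LABELS))
          return hits / len(DATA)

      def fitness(mask):
          return accuracy(mask) - 0.01 * sum(mask)  # penalize large signatures

      pop = [[random.randint(0, 1) for _ in range(12)] for _ in range(30)]
      for _ in range(60):
          pop.sort(key=fitness, reverse=True)
          survivors = pop[:10]                   # selective pressure
          children = []
          while len(children) < 20:
              p1, p2 = random.sample(survivors, 2)
              cut = random.randrange(1, 12)      # one-point recombination
              child = p1[:cut] + p2[cut:]
              child[random.randrange(12)] ^= 1   # point mutation
              children.append(child)
          pop = survivors + children

      best = max(pop, key=fitness)
      print("selected genes:", [i for i, m in enumerate(best) if m],
            "accuracy:", accuracy(best))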

  12. cDNA identification, comparison and phylogenetic aspects of lombricine kinase from two oligochaete species.

    PubMed

    Doumen, Chris

    2010-06-01

    Creatine kinase and arginine kinase are the typical representatives of an eight-member phosphagen kinase family, whose members play important roles in the cellular energy metabolism of animals. The phylum Annelida underwent a series of evolutionary processes that resulted in rapid divergence and radiation of these enzymes, producing the greatest diversity of the phosphagen kinases within this phylum. Lombricine kinase (EC 2.7.3.5) is one such enzyme, and sequence information for it is rather limited compared to other phosphagen kinases. This study presents data on the cDNA sequences of lombricine kinase from two oligochaete species, the California blackworm (Lumbriculus variegatus) and the sludge worm (Tubifex tubifex). The deduced amino acid sequences are analyzed and compared with other selected phosphagen kinases, including two additional lombricine kinase sequences extracted from DNA databases, and provide further insights into the evolution and position of these enzymes within the phosphagen kinase family. The data confirm the presence of a deleted region within the flexible loop (the GS region) of all six examined lombricine kinases. A phylogenetic analysis of these six lombricine kinases clearly positions the enzymes together in a small subcluster within the larger creatine kinase (EC 2.7.3.2) clade. Copyright © 2010. Published by Elsevier Inc.

  13. Robust electromagnetically guided endoscopic procedure using enhanced particle swarm optimization for multimodal information fusion.

    PubMed

    Luo, Xiongbiao; Wan, Ying; He, Xiangjian

    2015-04-01

    Electromagnetically guided endoscopic procedures, which aim at accurately and robustly localizing the endoscope, involve multimodal sensory information during interventions. However, it remains challenging to integrate this information for precise and stable endoscopic guidance. To tackle this challenge, this paper proposes a new framework, based on an enhanced particle swarm optimization method, to effectively fuse this information for accurate and continuous endoscope localization. The authors use the particle swarm optimization method, one of the stochastic evolutionary computation algorithms, to fuse the multimodal information, including preoperative information (i.e., computed tomography images) as a frame of reference, endoscopic camera videos, and positional sensor measurements (i.e., electromagnetic sensor outputs). Since evolutionary computation methods are usually limited by premature convergence and the choice of evolutionary factors, the authors introduce the current (endoscopic camera and electromagnetic sensor) observations to boost the particle swarm optimization, and they adaptively update the evolutionary parameters in accordance with spatial constraints and the current observations, resulting in advantageous performance of the enhanced algorithm. The experimental results demonstrate that the proposed method provides a more accurate and robust endoscopic guidance framework than state-of-the-art methods. The average guidance accuracy of the framework was about 3.0 mm and 5.6°, whereas the previous methods showed at least 3.9 mm and 7.0°. The average position and orientation smoothness of the method was 1.0 mm and 1.6°, significantly better than that of the other methods (at least 2.0 mm and 2.6°). Additionally, the average visual quality of the endoscopic guidance was improved to 0.29. A robust electromagnetically guided endoscopy framework was proposed on the basis of an enhanced particle swarm optimization method using the current observation information and adaptive evolutionary factors. The proposed framework greatly reduced the guidance errors from (4.3 mm, 7.8°) to (3.0 mm, 5.6°) compared to state-of-the-art methods.
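
    For readers unfamiliar with the underlying optimizer, the following is a minimal canonical particle swarm optimization loop; the quadratic objective stands in for the paper's multimodal fusion cost, and the inertia and acceleration coefficients are generic textbook values, not those of the enhanced algorithm:

      import random

      def objective(x):  # stand-in for the multimodal information-fusion cost
          return sum(v * v for v in x)

      DIM, N, ITERS = 6, 20, 100
      W, C1, C2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients

      pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(N)]
      vel = [[0.0] * DIM for _ in range(N)]
      pbest = [p[:] for p in pos]           # personal best positions
      gbest = min(pbest, key=objective)[:]  # global best position

      for _ in range(ITERS):
          for i in range(N):
              for d in range(DIM):
                  r1, r2 = random.random(), random.random()
                  vel[i][d] = (W * vel[i][d]
                               + C1 * r1 * (pbest[i][d] - pos[i][d])
                               + C2 * r2 * (gbest[d] - pos[i][d]))
                  pos[i][d] += vel[i][d]
              if objective(pos[i]) < objective(pbest[i]):
                  pbest[i] = pos[i][:]
          gbest = min(pbest, key=objective)[:]

      print("best value found:", objective(gbest))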

  14. Computability, Gödel's incompleteness theorem, and an inherent limit on the predictability of evolution

    PubMed Central

    Day, Troy

    2012-01-01

    The process of evolutionary diversification unfolds in a vast genotypic space of potential outcomes. During the past century, there have been remarkable advances in the development of theory for this diversification, and the theory's success rests, in part, on the scope of its applicability. A great deal of this theory focuses on a relatively small subset of the space of potential genotypes, chosen largely based on historical or contemporary patterns, and then predicts the evolutionary dynamics within this pre-defined set. To what extent can such an approach be pushed to a broader perspective that accounts for the potential open-endedness of evolutionary diversification? There have been a number of significant theoretical developments along these lines but the question of how far such theory can be pushed has not been addressed. Here a theorem is proven demonstrating that, because of the digital nature of inheritance, there are inherent limits on the kinds of questions that can be answered using such an approach. In particular, even in extremely simple evolutionary systems, a complete theory accounting for the potential open-endedness of evolution is unattainable unless evolution is progressive. The theorem is closely related to Gödel's incompleteness theorem, and to the halting problem from computability theory. PMID:21849390

  15. Method of recording bioelectrical signals using a capacitive coupling

    NASA Astrophysics Data System (ADS)

    Simon, V. A.; Gerasimov, V. A.; Kostrin, D. K.; Selivanov, L. M.; Uhov, A. A.

    2017-11-01

    In this article a technique for the acquisition of bioelectrical signals by means of capacitive sensors is described. A feedback loop for the ultra-high-impedance biasing of the input instrumentation amplifier, which enables reception of the electrical cardiac signal (ECS) through a capacitive coupling, is proposed. The 50/60 Hz mains noise is suppressed by a narrow-band stop filter with independent tuning of the notch frequency and quality factor. The filter output is attached to a ΣΔ analog-to-digital converter (ADC), which acquires the filtered signal with 24-bit resolution. The signal processing board is connected through a universal serial bus interface to a personal computer, where the ECS is recorded and processed in digital form.
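
    A minimal digital analogue of the mains-rejection step, assuming a 1 kHz sampling rate and synthetic data (the instrument itself implements the notch with analog circuitry), can be written with SciPy's iirnotch design routine:

      import numpy as np
      from scipy.signal import iirnotch, filtfilt

      fs = 1000.0         # sampling rate in Hz (assumed)
      f0, Q = 50.0, 30.0  # notch frequency and quality factor

      t = np.arange(0, 2, 1 / fs)
      ecg = np.sin(2 * np.pi * 1.2 * t)               # crude cardiac-like component
      noisy = ecg + 0.5 * np.sin(2 * np.pi * f0 * t)  # added mains interference

      b, a = iirnotch(f0, Q, fs=fs)  # design the narrow-band stop filter
      clean = filtfilt(b, a, noisy)  # zero-phase filtering
      print("mean squared error after notch:", np.mean((clean - ecg) ** 2))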

  16. Framework for computationally efficient optimal irrigation scheduling using ant colony optimization

    USDA-ARS?s Scientific Manuscript database

    A general optimization framework is introduced with the overall goal of reducing search space size and increasing the computational efficiency of evolutionary algorithm application for optimal irrigation scheduling. The framework achieves this goal by representing the problem in the form of a decisi...

  17. Pervasive Computing and Communication Technologies for U-Learning

    ERIC Educational Resources Information Center

    Park, Young C.

    2014-01-01

    The development of digital information transfer, storage and communication methods influences a significant effect on education. The assimilation of pervasive computing and communication technologies marks another great step forward, with Ubiquitous Learning (U-learning) emerging for next generation learners. In the evolutionary view the 5G (or…

  18. A comparative study of electric load curve changes in an urban low-voltage substation in Spain during the economic crisis (2008-2013).

    PubMed

    Lara-Santillán, Pedro M; Mendoza-Villena, Montserrat; Fernández-Jiménez, L Alfredo; Mañana-Canteli, Mario

    2014-01-01

    This paper presents a comparative study of the electricity consumption (EC) in an urban low-voltage substation before and during the economic crisis (2008-2013). This low-voltage substation supplies electric power to nearly 400 users. The EC was measured over an 11-year period (2002-2012) with a sampling time of 1 minute. The study described in the paper consists of detecting the changes produced in the load curves of this substation over time due to changes in the behaviour of consumers. The EC was compared using representative curves per time period (pre-crisis and crisis). These representative curves were obtained through a computational process based on a search for days with curves similar to the curve of a given (base) date. This similarity was assessed by proximity on the calendar, day of the week, daylight time, and outdoor temperature. The final selection parameter was the error between the nearest-neighbour curves and the base-date curve. The obtained representative curves were linearized to determine changes in their structure (maximum and minimum consumption values, duration of the daily time slots, etc.). The results primarily indicate an increase in the EC in the night slot during the summer months in the crisis period.
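
    The final nearest-neighbour step of such a curve-selection process might look as follows in numpy; the synthetic archive, the candidate pre-filter (which in the study uses calendar proximity, weekday, daylight time and temperature) and the distance measure are illustrative assumptions, not the authors' exact procedure:

      import numpy as np

      rng = np.random.default_rng(0)
      # Toy archive: 365 daily load curves sampled every 15 minutes (96 points).
      archive = rng.normal(100.0, 10.0, size=(365, 96))
      base = 200  # index of the base date

      # Suppose days 150-249 survived the calendar/weekday/temperature filter.
      candidates = np.arange(150, 250)
      errors = np.linalg.norm(archive[candidates] - archive[base], axis=1)
      nearest = candidates[np.argsort(errors)[1:6]]  # skip the base day itself

      representative = archive[nearest].mean(axis=0)  # representative curve
      print("nearest days:", nearest)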

  19. Statistical mixture design and multivariate analysis of inkjet printed a-WO3/TiO2/WOX electrochromic films.

    PubMed

    Wojcik, Pawel Jerzy; Pereira, Luís; Martins, Rodrigo; Fortunato, Elvira

    2014-01-13

    An efficient mathematical strategy in the field of solution-processed electrochromic (EC) films is outlined as a combination of experimental work, modeling, and information extraction from massive computational data via statistical software. Design of Experiments (DOE) was used for statistical multivariate analysis and prediction of mixtures through a multiple regression model, as well as for the optimization of a five-component sol-gel precursor subjected to complex constraints. This approach significantly reduces the number of experiments to be realized, from 162 in the full factorial (L=3) and 72 in the extreme vertices (D=2) approach down to only 30 runs, while still maintaining a high accuracy of the analysis. By carrying out a finite number of experiments, the empirical modeling in this study shows reasonably good prediction ability in terms of the overall EC performance. An optimized ink formulation was employed in a prototype of a passive EC matrix fabricated in order to test and trial this optically active material system together with a solid-state electrolyte for prospective application in EC displays. Coupling DOE with chromogenic material formulation shows the potential to maximize the capabilities of these systems and ensures increased productivity in many potential solution-processed electrochemical applications.
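
    As a sketch of how constrained mixture candidates can be enumerated before a design is selected, the following fragment builds a five-component simplex-lattice and filters it by single-component bounds; the bounds shown are hypothetical, not the study's sol-gel constraints:

      from itertools import product

      # Simplex-lattice {q=5, m=4}: all mixtures whose proportions are
      # multiples of 1/4 and sum to 1.
      q, m = 5, 4
      lattice = [tuple(k / m for k in ks)
                 for ks in product(range(m + 1), repeat=q) if sum(ks) == m]

      # Hypothetical single-component constraints.
      lo = [0.00, 0.00, 0.00, 0.00, 0.25]
      hi = [0.50, 0.50, 0.75, 0.50, 0.75]
      feasible = [c for c in lattice
                  if all(l <= x <= h for x, l, h in zip(c, lo, hi))]
      print(len(lattice), "lattice points,", len(feasible), "feasible")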

  20. Langley's CSI evolutionary model: Phase O

    NASA Technical Reports Server (NTRS)

    Belvin, W. Keith; Elliott, Kenny B.; Horta, Lucas G.; Bailey, Jim P.; Bruner, Anne M.; Sulla, Jeffrey L.; Won, John; Ugoletti, Roberto M.

    1991-01-01

    A testbed for the development of Controls Structures Interaction (CSI) technology to improve space science platform pointing is described. The evolutionary nature of the testbed will permit the study of global line-of-sight pointing in phases 0 and 1, whereas multipayload pointing systems will be studied beginning with phase 2. The design, capabilities, and typical dynamic behavior of the phase 0 version of the CSI evolutionary model (CEM) are documented for investigators both internal and external to NASA. The model description includes line-of-sight pointing measurement, testbed structure, actuators, sensors, and real-time computers, as well as finite element and state space models of major components.

  1. Efficiency of the neighbor-joining method in reconstructing deep and shallow evolutionary relationships in large phylogenies.

    PubMed

    Kumar, S; Gadagkar, S R

    2000-12-01

    The neighbor-joining (NJ) method is widely used in reconstructing large phylogenies because of its computational speed and the high accuracy in phylogenetic inference as revealed in computer simulation studies. However, most computer simulation studies have quantified the overall performance of the NJ method in terms of the percentage of branches inferred correctly or the percentage of replications in which the correct tree is recovered. We have examined other aspects of its performance, such as the relative efficiency in correctly reconstructing shallow (close to the external branches of the tree) and deep branches in large phylogenies; the contribution of zero-length branches to topological errors in the inferred trees; and the influence of increasing the tree size (number of sequences), evolutionary rate, and sequence length on the efficiency of the NJ method. Results show that the correct reconstruction of deep branches is no more difficult than that of shallower branches. The presence of zero-length branches in realized trees contributes significantly to the overall error observed in the NJ tree, especially in large phylogenies or slowly evolving genes. Furthermore, the tree size does not influence the efficiency of NJ in reconstructing shallow and deep branches in our simulation study, in which the evolutionary process is assumed to be homogeneous in all lineages.
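
    For reference, an NJ tree can be reconstructed from a small distance matrix with Biopython's tree-construction utilities; the distances below are invented for illustration:

      from Bio import Phylo
      from Bio.Phylo.TreeConstruction import DistanceMatrix, DistanceTreeConstructor

      # Lower-triangular distance matrix (diagonal included) for four taxa.
      dm = DistanceMatrix(
          names=["A", "B", "C", "D"],
          matrix=[[0], [0.10, 0], [0.25, 0.30, 0], [0.40, 0.45, 0.20, 0]],
      )
      tree = DistanceTreeConstructor().nj(dm)  # neighbor-joining
      Phylo.draw_ascii(tree)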

  2. Spore: Spawning Evolutionary Misconceptions?

    NASA Astrophysics Data System (ADS)

    Bean, Thomas E.; Sinatra, Gale M.; Schrader, P. G.

    2010-10-01

    The use of computer simulations as educational tools may afford the means to develop understanding of evolution as a natural, emergent, and decentralized process. However, special consideration of developmental constraints on learning may be necessary when using these technologies. Specifically, the essentialist (biological forms possess an immutable essence), teleological (assignment of purpose to living things and/or parts of living things that may not be purposeful), and intentionality (assumption that events are caused by an intelligent agent) biases may be reinforced through the use of computer simulations, rather than addressed with instruction. We examine the video game Spore for its depiction of evolutionary content and its potential to reinforce these cognitive biases. In particular, we discuss three pedagogical strategies to mitigate weaknesses of Spore and other computer simulations: directly targeting misconceptions through refutational approaches, targeting specific principles of scientific inquiry, and directly addressing issues related to models as cognitive tools.

  3. Parallel Evolutionary Optimization for Neuromorphic Network Training

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuman, Catherine D; Disney, Adam; Singh, Susheela

    One of the key impediments to the success of current neuromorphic computing architectures is the issue of how best to program them. Evolutionary optimization (EO) is one promising programming technique; in particular, its wide applicability makes it especially attractive for neuromorphic architectures, which can have many different characteristics. In this paper, we explore different facets of EO on a spiking neuromorphic computing model called DANNA. We focus on the performance of EO in the design of our DANNA simulator, and on how to structure EO on both multicore and massively parallel computing systems. We evaluate how our parallel methods impact the performance of EO on Titan, the U.S.'s largest open science supercomputer, and BOB, a Beowulf-style cluster of Raspberry Pi's. We also focus on how to improve the EO by evaluating commonality in higher performing neural networks, and present the result of a study that evaluates the EO performed by Titan.

  4. A computer lab exploring evolutionary aspects of chromatin structure and dynamics for an undergraduate chromatin course*.

    PubMed

    Eirín-López, José M

    2013-01-01

    The study of chromatin constitutes one of the most active research fields in life sciences, being subject to constant revisions that continuously redefine the state of the art in its knowledge. As every other rapidly changing field, chromatin biology requires clear and straightforward educational strategies able to efficiently translate such a vast body of knowledge to the classroom. With this aim, the present work describes a multidisciplinary computer lab designed to introduce undergraduate students to the dynamic nature of chromatin, within the context of the one semester course "Chromatin: Structure, Function and Evolution." This exercise is organized in three parts including (a) molecular evolutionary biology of histone families (using the H1 family as example), (b) histone structure and variation across different animal groups, and (c) effect of histone diversity on nucleosome structure and chromatin dynamics. By using freely available bioinformatic tools that can be run on common computers, the concept of chromatin dynamics is interactively illustrated from a comparative/evolutionary perspective. At the end of this computer lab, students are able to translate the bioinformatic information into a biochemical context in which the relevance of histone primary structure on chromatin dynamics is exposed. During the last 8 years this exercise has proven to be a powerful approach for teaching chromatin structure and dynamics, allowing students a higher degree of independence during the processes of learning and self-assessment. Copyright © 2013 International Union of Biochemistry and Molecular Biology, Inc.

  5. The Endocannabinoid System in the Baboon (Papio SPP.) as a Complex Framework for Developmental Pharmacology

    PubMed Central

    Rodriguez-Sanchez, Iram P.; Guindon, Josee; Ruiz, Marco; Tejero, Maria E.; Hubbard, Gene; Martinez-De-Villarreal, Laura E.; Barrera-Saldaña, Hugo A.; Dick, Edward J.; Comuzzie, Anthony G; Schlabritz-Loutsevitch, Natalia E

    2017-01-01

    Introduction: The consumption of marijuana (exogenous cannabinoid) almost doubled in adults during the last decade. Consumption of exogenous cannabinoids interferes with the endogenous cannabinoid (or “endocannabinoid” (eCB)) system (ECS), which comprises N-arachidonylethanolamide (anandamide, AEA), 2-arachidonoyl glycerol (2-AG), endocannabinoid receptors (cannabinoid receptors 1 and 2 (CB1R and CB2R), encoded by CNR1 and CNR2, respectively), and synthesizing/degrading enzymes (FAAH, fatty-acid amide hydrolase; MAGL, monoacylglycerol lipase; DAGL-α, diacylglycerol lipase-alpha). Reports regarding the toxic and therapeutic effects of pharmacological compounds targeting the ECS are sometimes contradictory. This may be caused by the fact that the structure of the eCBs varies among the species studied. Objectives: First, to clone and characterize the cDNAs of selected members of the ECS in a non-human primate (baboon, Papio spp.), and second, to compare those cDNA sequences to known human structural variants (single nucleotide polymorphisms and haplotypes). Materials and methods: Polymerase chain reaction-amplified gene products from baboon tissues were transformed into Escherichia coli. Amplicon-positive clones were sequenced, and the obtained sequences were conceptually translated into amino-acid sequences using the genetic code. Results: Among the ECS members, CNR1 was the best conserved gene between humans and baboons. The phenotypes associated with mutations in the untranslated regions of this gene in humans have not been described in baboons. One difference in the structure of CNR2 between humans and baboons was detected in the region with the only known clinically relevant polymorphism in a human receptor. All of the differences in the amino-acid structure of DAGL-α between humans and baboons were located in the hydroxylase domain, close to phosphorylation sites. None of the differences in the amino-acid structure of MAGL observed between baboons and humans were located in the area critical for enzyme function. Conclusion: The evaluation of data obtained in a non-human primate model of cannabis-related developmental exposure should take into consideration possible evolutionarily determined species-specific differences in CB1R expression, the CB2R transduction pathway, and FAAH and DAGL-α substrate-enzyme interactions. PMID:27327781

  6. The endocannabinoid system in the baboon (Papio spp.) as a complex framework for developmental pharmacology.

    PubMed

    Rodriguez-Sanchez, Iram P; Guindon, Josee; Ruiz, Marco; Tejero, M Elizabeth; Hubbard, Gene; Martinez-de-Villarreal, Laura E; Barrera-Saldaña, Hugo A; Dick, Edward J; Comuzzie, Anthony G; Schlabritz-Loutsevitch, Natalia E

    The consumption of marijuana (exogenous cannabinoid) almost doubled in adults during the last decade. Consumption of exogenous cannabinoids interferes with the endogenous cannabinoid (or "endocannabinoid" (eCB)) system (ECS), which comprises N-arachidonylethanolamide (anandamide, AEA), 2-arachidonoyl glycerol (2-AG), endocannabinoid receptors (cannabinoid receptors 1 and 2 (CB1R and CB2R), encoded by CNR1 and CNR2, respectively), and synthesizing/degrading enzymes (FAAH, fatty-acid amide hydrolase; MAGL, monoacylglycerol lipase; DAGL-α, diacylglycerol lipase-alpha). Reports regarding the toxic and therapeutic effects of pharmacological compounds targeting the ECS are sometimes contradictory. This may be caused by the fact that the structure of the eCBs varies among the species studied. The objectives were, first, to clone and characterize the cDNAs of selected members of the ECS in a non-human primate (baboon, Papio spp.), and second, to compare those cDNA sequences to known human structural variants (single nucleotide polymorphisms and haplotypes). Polymerase chain reaction-amplified gene products from baboon tissues were transformed into Escherichia coli. Amplicon-positive clones were sequenced, and the obtained sequences were conceptually translated into amino-acid sequences using the genetic code. Among the ECS members, CNR1 was the best conserved gene between humans and baboons. The phenotypes associated with mutations in the untranslated regions of this gene in humans have not been described in baboons. One difference in the structure of CNR2 between humans and baboons was detected in the region with the only known clinically relevant polymorphism in a human receptor. All of the differences in the amino-acid structure of DAGL-α between humans and baboons were located in the hydroxylase domain, close to phosphorylation sites. None of the differences in the amino-acid structure of MAGL observed between baboons and humans were located in the area critical for enzyme function. The evaluation of data obtained in a non-human primate model of cannabis-related developmental exposure should take into consideration possible evolutionarily determined species-specific differences in CB1R expression, the CB2R transduction pathway, and FAAH and DAGL-α substrate-enzyme interactions. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. JPRS Report, Science & Technology, Europe, EC Commissioners Evaluate Electronics, Computer Industries

    DTIC Science & Technology

    1991-06-14

    American firms of foreign origin as regards R&TD is being practised by the Department of Defense, and Sematech is one example here. As negotiations... taxation measures. Training schemes for staff in the banking sector encompassing both the financial side and computerized systems applications

  8. Scaling Atomic Partial Charges of Carbonate Solvents for Lithium Ion Solvation and Diffusion

    DOE PAGES

    Chaudhari, Mangesh I.; Nair, Jijeesh R.; Pratt, Lawrence R.; ...

    2016-10-21

    Lithium-ion solvation and diffusion properties in ethylene carbonate (EC) and propylene carbonate (PC) were studied by molecular simulation, experiments, and electronic structure calculations. Studies carried out in water provide a reference for interpretation. Classical molecular dynamics simulation results are compared to ab initio molecular dynamics to assess nonpolarizable force field parameters for solvation structure of the carbonate solvents. Quasi-chemical theory (QCT) was adapted to take advantage of fourfold occupancy of the near-neighbor solvation structure observed in simulations and used to calculate solvation free energies. The computed free energy for transfer of Li+ to PC from water, based on electronic structure calculations with cluster-QCT, agrees with the experimental value. The simulation-based direct-QCT results with scaled partial charges agree with the electronic structure-based QCT values. The computed Li+/PF6- transference numbers of 0.35/0.65 (EC) and 0.31/0.69 (PC) agree well with NMR experimental values of 0.31/0.69 (EC) and 0.34/0.66 (PC) and similar values obtained here with impedance spectroscopy. These combined results demonstrate that solvent partial charges can be scaled in systems dominated by strong electrostatic interactions to achieve trends in ion solvation and transport properties that are comparable to ab initio and experimental results. Thus, the results support the use of scaled partial charges in simple, nonpolarizable force fields in future studies of these electrolyte solutions.

  9. A Connectionist Approach to Embodied Conceptual Metaphor

    PubMed Central

    Flusberg, Stephen J.; Thibodeau, Paul H.; Sternberg, Daniel A.; Glick, Jeremy J.

    2010-01-01

    A growing body of data has been gathered in support of the view that the mind is embodied and that cognition is grounded in sensory-motor processes. Some researchers have gone so far as to claim that this paradigm poses a serious challenge to central tenets of cognitive science, including the widely held view that the mind can be analyzed in terms of abstract computational principles. On the other hand, computational approaches to the study of mind have led to the development of specific models that help researchers understand complex cognitive processes at a level of detail that theories of embodied cognition (EC) have sometimes lacked. Here we make the case that connectionist architectures in particular can illuminate many surprising results from the EC literature. These models can learn the statistical structure in their environments, providing an ideal framework for understanding how simple sensory-motor mechanisms could give rise to higher-level cognitive behavior over the course of learning. Crucially, they form overlapping, distributed representations, which have exactly the properties required by many embodied accounts of cognition. We illustrate this idea by extending an existing connectionist model of semantic cognition in order to simulate findings from the embodied conceptual metaphor literature. Specifically, we explore how the abstract domain of time may be structured by concrete experience with space (including experience with culturally specific spatial and linguistic cues). We suggest that both EC researchers and connectionist modelers can benefit from an integrated approach to understanding these models and the empirical findings they seek to explain. PMID:21833256

  10. Where Tori Fear to Tread: Hypermassive Neutron Star Remnants and Absolute Event Horizons or Topics in Computational General Relativity

    NASA Astrophysics Data System (ADS)

    Kaplan, Jeffrey Daniel

    2014-01-01

    Computational general relativity is a field of study which has reached maturity only within the last decade. This thesis details several studies that elucidate phenomena related to the coalescence of compact object binaries. Chapters 2 and 3 recount work towards developing new analytical tools for visualizing and reasoning about dynamics in strongly curved spacetimes. In both studies, the results employ analogies with the classical theory of electricity and magnetism, first (Ch. 2) in the post-Newtonian approximation to general relativity and then (Ch. 3) in full general relativity, though in the absence of matter sources. In Chapter 4, we examine the topological structure of absolute event horizons during binary black hole merger simulations conducted with the SpEC code. Chapter 6 reports on the progress of the SpEC code in simulating the coalescence of neutron star-neutron star binaries, while Chapter 7 tests the effects of various numerical gauge conditions on the robustness of black hole formation from stellar collapse in SpEC. In Chapter 5, we examine the nature of pseudospectral expansions of non-smooth functions, motivated by the need to simulate the stellar surface in Chapters 6 and 7. In Chapter 8, we study how thermal effects in the nuclear equation of state affect the equilibria and stability of hypermassive neutron stars. Chapter 9 presents supplements to the work in Chapter 8, including an examination of the stability question raised in Chapter 8 in greater mathematical detail.

  11. Inquiry-Based Learning of Molecular Phylogenetics

    ERIC Educational Resources Information Center

    Campo, Daniel; Garcia-Vazquez, Eva

    2008-01-01

    Reconstructing phylogenies from nucleotide sequences is a challenge for students because it strongly depends on evolutionary models and computer tools that are frequently updated. We present here an inquiry-based course aimed at learning how to trace a phylogeny based on sequences existing in public databases. Computer tools are freely available…

  12. δ-Conotoxin SuVIA suggests an evolutionary link between ancestral predator defence and the origin of fish-hunting behaviour in carnivorous cone snails.

    PubMed

    Jin, Ai-Hua; Israel, Mathilde R; Inserra, Marco C; Smith, Jennifer J; Lewis, Richard J; Alewood, Paul F; Vetter, Irina; Dutertre, Sébastien

    2015-07-22

    Some venomous cone snails feed on small fishes using an immobilizing combination of synergistic venom peptides that target Kv and Nav channels. As part of this envenomation strategy, δ-conotoxins are potent ichthyotoxins that enhance Nav channel function. δ-Conotoxins belong to an ancient and widely distributed gene superfamily, but an evolutionary link from ancestral worm-eating cone snails to modern piscivorous species has not been elucidated. Here, we report the discovery of SuVIA, a potent vertebrate-active δ-conotoxin characterized from a vermivorous cone snail (Conus suturatus). SuVIA is equipotent at hNaV1.3, hNaV1.4 and hNaV1.6, with EC50s in the low nanomolar range. SuVIA also increased peak hNaV1.7 current by approximately 75% and shifted the voltage-dependence of activation to more hyperpolarized potentials, from -15 mV to -25 mV, with little effect on the voltage-dependence of inactivation. Interestingly, the proximal venom gland expression and pain-inducing effect of SuVIA in mammals suggest that δ-conotoxins in vermivorous cone snails play a defensive role against higher-order vertebrates. We propose that δ-conotoxins that originally evolved in ancestral vermivorous cones to defend against larger predators, including fishes, have been repurposed to facilitate a shift to piscivorous behaviour, suggesting an unexpected underlying mechanism for this remarkable evolutionary transition. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  13. Suitability of energy cone for probabilistic volcanic hazard assessment: validation tests at Somma-Vesuvius and Campi Flegrei (Italy)

    NASA Astrophysics Data System (ADS)

    Tierz, Pablo; Sandri, Laura; Costa, Antonio; Zaccarelli, Lucia; Di Vito, Mauro Antonio; Sulpizio, Roberto; Marzocchi, Warner

    2016-11-01

    Pyroclastic density currents (PDCs) are gravity-driven hot mixtures of gas and volcanic particles which can propagate at high speed and cover distances up to several tens of kilometers around a given volcano. Therefore, they pose a severe hazard to the surroundings of explosive volcanoes able to produce such phenomena. Despite this threat, probabilistic volcanic hazard assessment (PVHA) of PDCs is still in an early stage of development. PVHA is rooted in the quantification of the large uncertainties (aleatory and epistemic) which characterize volcanic hazard analyses. This quantification typically requires a big dataset of hazard footprints obtained from numerical simulations of the physical process. For PDCs, numerical models range from very sophisticated (not useful for PVHA because of their very long runtimes) to very simple models (criticized because of their highly simplified physics). We present here a systematic and robust validation testing of a simple PDC model, the energy cone (EC), to unravel whether it can be applied to PVHA of PDCs. Using past PDC deposits at Somma-Vesuvius and Campi Flegrei (Italy), we assess the ability of EC to capture the values and variability in some relevant variables for hazard assessment, i.e., area of PDC invasion and maximum runout. In terms of area of invasion, the highest Jaccard coefficients range from 0.33 to 0.86, which indicates an equal or better performance compared to other volcanic mass-flow models. The p values for the observed maximum runouts vary from 0.003 to 0.44. Finally, the frequencies of PDC arrival computed from the EC are similar to those determined from the spatial distribution of past PDC deposits, with high PDC-arrival frequencies over a ~8 km radius from the crater area at Somma-Vesuvius and around the Astroni crater at Campi Flegrei. The insights derived from our validation tests seem to indicate that the EC is a suitable candidate to compute PVHA of PDCs.
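
    The Jaccard coefficient used for the area-of-invasion comparison is simple to compute on boolean invasion rasters, as in this illustrative numpy fragment (random rasters stand in for the real deposit and simulation maps):

      import numpy as np

      rng = np.random.default_rng(1)
      # Boolean rasters marking PDC invasion: observed deposit vs. model run.
      observed = rng.random((200, 200)) < 0.30
      simulated = rng.random((200, 200)) < 0.35

      intersection = np.logical_and(observed, simulated).sum()
      union = np.logical_or(observed, simulated).sum()
      print("Jaccard coefficient:", intersection / union)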

  14. International Conference on Artificial Immune Systems (1st) ICARIS 2002, held on 9, 10, and 11 September 2002

    DTIC Science & Technology

    2002-03-07

    Michalewicz, Eds., Evolutionary Computation 1: Basic Algorithms and Operators, Institute of Physics, Bristol (UK), 2000. [3] David A. Van Veldhuizen ...2000. [4] Carlos A. Coello Coello, David A. Van Veldhuizen, and Gary B. Lamont, Evolutionary Algorithms for Solving Multi-Objective Problems, Kluwer Academic Publishers, 233 Spring St., New York, NY 10013, 2002. [5] David A. Van Veldhuizen, Multiobjective Evolutionary Algorithms: Classifications

  15. Using parallel evolutionary development for a biologically-inspired computer vision system for mobile robots.

    PubMed

    Wright, Cameron H G; Barrett, Steven F; Pack, Daniel J

    2005-01-01

    We describe a new approach to attacking the problem of robust computer vision for mobile robots. The overall strategy is to mimic the biological evolution of animal vision systems. Our basic imaging sensor is based upon the eye of the common house fly, Musca domestica. The computational algorithms are a mix of traditional image processing, subspace techniques, and multilayer neural networks.

  16. Dayton Aircraft Cabin Fire Model, Version 3, Volume I. Physical Description.

    DTIC Science & Technology

    1982-06-01

    contact to any surface directly above a burning element, provided that the current flame length makes contact possible. For fires originating on the...no extension of the flames horizontally beneath the surface is considered. The equation for computing the flame length is presented in Section 5. For...high as 0.3. The values chosen for DACFIR3 are 0.15 for Ec and 0.10 for Ep. The Steward model is also used to compute flame length, hf, for the fire

  17. Automated Antenna Design with Evolutionary Algorithms

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.; Globus, Al; Linden, Derek S.; Lohn, Jason D.

    2006-01-01

    Current methods of designing and optimizing antennas by hand are time and labor intensive, and limit complexity. Evolutionary design techniques can overcome these limitations by searching the design space and automatically finding effective solutions. In recent years, evolutionary algorithms have shown great promise in finding practical solutions in large, poorly understood design spaces. In particular, spacecraft antenna design has proven tractable to evolutionary design techniques. Researchers have been investigating evolutionary antenna design and optimization since the early 1990s, and the field has grown in recent years as computer speed has increased and electromagnetic simulators have improved. Two requirements-compliant antennas, one for ST5 and another for TDRS-C, have been automatically designed by evolutionary algorithms. The ST5 antenna is slated to fly this year, and a TDRS-C phased array element has been fabricated and tested. Such automated evolutionary design is enabled by medium-to-high quality simulators and fast modern computers to evaluate computer-generated designs. Evolutionary algorithms automate cut-and-try engineering, substituting automated search though millions of potential designs for intelligent search by engineers through a much smaller number of designs. For evolutionary design, the engineer chooses the evolutionary technique, parameters and the basic form of the antenna, e.g., single wire for ST5 and crossed-element Yagi for TDRS-C. Evolutionary algorithms then search for optimal configurations in the space defined by the engineer. NASA's Space Technology 5 (ST5) mission will launch three small spacecraft to test innovative concepts and technologies. Advanced evolutionary algorithms were used to automatically design antennas for ST5. The combination of wide beamwidth for a circularly-polarized wave and wide impedance bandwidth made for a challenging antenna design problem. From past experience in designing wire antennas, we chose to constrain the evolutionary design to a monopole wire antenna. The results of the runs produced requirements-compliant antennas that were subsequently fabricated and tested. The evolved antenna has a number of advantages with regard to power consumption, fabrication time and complexity, and performance. Lower power requirements result from achieving high gain across a wider range of elevation angles, thus allowing a broader range of angles over which maximum data throughput can be achieved. Since the evolved antenna does not require a phasing circuit, less design and fabrication work is required. In terms of overall work, the evolved antenna required approximately three person-months to design and fabricate whereas the conventional antenna required about five. Furthermore, when the mission was modified and new orbital parameters selected, a redesign of the antenna to new requirements was required. The evolutionary system was rapidly modified and a new antenna evolved in a few weeks. The evolved antenna was shown to be compliant to the ST5 mission requirements. It has an unusual organic looking structure, one that expert antenna designers would not likely produce. This antenna has been tested, baselined and is scheduled to fly this year. In addition to the ST5 antenna, our laboratory has evolved an S-band phased array antenna element design that meets the requirements for NASA's TDRS-C communications satellite scheduled for launch early next decade. 
A combination of fairly broad bandwidth, high efficiency and circular polarization at high gain made for another challenging design problem. We chose to constrain the evolutionary design to a crossed-element Yagi antenna. The specification called for two types of elements, one for receive only and one for transmit/receive. We were able to evolve a single element design that meets both specifications, thereby simplifying the antenna and reducing testing and integration costs. The highest performance antenna found using a genetic algorithm and stochastic hill-climbing has been fabricated and tested. Laboratory results correspond well with simulation. Aerospace component design is an expensive and important step in space development. Evolutionary design can make a significant contribution wherever sufficiently fast, accurate and capable software simulators are available. We have demonstrated successful real-world design in the spacecraft antenna domain, and there is good reason to believe that these results could be replicated in other design spaces.
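
    As an illustration of the evolve-evaluate-select loop described above, here is a minimal sketch in Python. The genome encoding (bend-angle/segment-length pairs) and the surrogate fitness function are hypothetical stand-ins; a real run would score each candidate with an electromagnetic simulator such as NEC rather than the toy surrogate used here.

      import random

      # Hypothetical stand-in for an electromagnetic simulator: scores a
      # candidate wire antenna, encoded as (bend angle, segment length)
      # pairs, against gain/bandwidth requirements. A toy surrogate only.
      def simulate_fitness(genome):
          total_len = sum(seg for _, seg in genome)
          bend_var = sum(abs(a) for a, _ in genome) / len(genome)
          return -abs(total_len - 0.5) + 0.1 * bend_var

      def random_genome(n_segments=6):
          return [(random.uniform(-90, 90), random.uniform(0.02, 0.12))
                  for _ in range(n_segments)]

      def mutate(genome, sigma=5.0):
          return [(a + random.gauss(0, sigma), max(0.01, s + random.gauss(0, 0.01)))
                  for a, s in genome]

      def evolve(pop_size=50, generations=100):
          pop = [random_genome() for _ in range(pop_size)]
          for _ in range(generations):
              pop.sort(key=simulate_fitness, reverse=True)
              parents = pop[: pop_size // 5]        # truncation selection
              pop = parents + [mutate(random.choice(parents))
                               for _ in range(pop_size - len(parents))]
          return max(pop, key=simulate_fitness)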

  18. Cooking "shrimp à la créole": a pilot study of an ecological rehabilitation in semantic dementia.

    PubMed

    Bier, Nathalie; Macoir, Joël; Joubert, Sven; Bottari, Carolina; Chayer, Céline; Pigot, Hélène; Giroux, Sylvain

    2011-08-01

    New learning in semantic dementia (SD) seems to be tied to a specific temporal and spatial context. Thus, cognitive rehabilitation could capitalise upon preserved episodic memory and focus on everyday activities which, once learned, will have an impact on everyday life. This pilot study thus explores the effectiveness of an ecological approach in one patient suffering from SD. EC, a 68-year-old woman with SD, stopped cooking complex meals due to a substantial loss of knowledge related to all food types. The therapy consisted of preparing a target recipe. She was asked to generate semantic attributes of ingredients found in one target, one control and two no-therapy recipes. The number of recipes cooked by EC between therapy sessions was computed. She was also asked to prepare a generalisation recipe combining ingredients from the target and control recipes. EC's generated semantic attributes (GSA) of ingredients pertaining to the target and control recipes increased significantly (p < .001), compared to the no-therapy recipes (ps > .79). The proportion of meals cooked also increased significantly (p = .021). For the generalisation recipe, she could not succeed without assistance. Frequent food preparation may have provided EC with new memories about the context, usage and appearance of some concepts. These memories seem very context-bound, but EC nonetheless re-introduced some recipes into her day-to-day life. The impact of these results on the relationship between semantic, episodic and procedural memory is discussed, as well as the relevance of an ecological approach in SD.

  19. Trade-offs with stability modulate innate and mutationally acquired drug-resistance in bacterial dihydrofolate reductase enzymes.

    PubMed

    Matange, Nishad; Bodkhe, Swapnil; Patel, Maitri; Shah, Pooja

    2018-06-05

    Structural stability is a major constraint on the evolution of protein sequences. However, under strong directional selection, mutations that confer novel phenotypes but compromise structural stability of proteins may be permissible. During the evolution of antibiotic resistance, mutations that confer drug resistance often have pleiotropic effects on the structure and function of antibiotic-target proteins, usually essential metabolic enzymes. In this study, we show that trimethoprim (TMP)-resistant alleles of dihydrofolate reductase from Escherichia coli (EcDHFR) harbouring the Trp30Gly, Trp30Arg or Trp30Cys mutations are significantly less stable than the wild type, making them prone to aggregation and proteolysis. This destabilization is associated with a lower expression level, resulting in a fitness cost and negative epistasis with other TMP-resistant mutations in EcDHFR. Using structure-based mutational analysis, we show that perturbation of critical stabilizing hydrophobic interactions in the wild type EcDHFR enzyme explains the phenotypes of the Trp30 mutants. Surprisingly, though crucial for the stability of EcDHFR, significant sequence variation is found at this site among bacterial DHFRs. Mutational and computational analyses in EcDHFR, as well as in DHFR enzymes from Staphylococcus aureus and Mycobacterium tuberculosis, demonstrate that natural variation at this site and its interacting hydrophobic residues modulates TMP resistance in other bacterial DHFRs as well, and may explain the different susceptibilities of bacterial pathogens to trimethoprim. Our study demonstrates that trade-offs between structural stability and function can influence innate drug resistance as well as the potential for mutationally acquired drug resistance of an enzyme. ©2018 The Author(s).

  20. Hybrid Genetic Algorithms and Line Search Method for Industrial Production Planning with Non-Linear Fitness Function

    NASA Astrophysics Data System (ADS)

    Vasant, Pandian; Barsoum, Nader

    2008-10-01

    Many engineering, science, information technology and management optimization problems can be considered as nonlinear programming real-world problems in which all or some of the parameters and variables involved are uncertain in nature. These can only be quantified using intelligent computational techniques such as evolutionary computation and fuzzy logic. The main objective of this research paper is to solve a nonlinear fuzzy optimization problem in which the technological coefficients in the constraints are fuzzy numbers, represented by logistic membership functions, using a hybrid evolutionary optimization approach. To explore the applicability of the present study, a numerical example is considered to determine the production planning for the decision variables and the profit of the company.
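
    A minimal sketch of the kind of S-shaped logistic membership function the abstract refers to, assuming a coefficient judged fully acceptable at an optimistic bound a and unacceptable at a pessimistic bound b; the parameter names and the vagueness factor gamma are illustrative, not taken from the paper.

      import math

      def logistic_membership(x, a, b, gamma=13.8):
          # Degree to which a fuzzy technological coefficient value x,
          # lying between optimistic bound a and pessimistic bound b, is
          # considered acceptable; gamma controls the steepness (vagueness).
          # A common S-shaped form used in fuzzy optimization; parameter
          # names here are illustrative, not those of the paper.
          if x <= a:
              return 1.0
          if x >= b:
              return 0.0
          t = (x - a) / (b - a)
          return 1.0 / (1.0 + math.exp(gamma * (t - 0.5)))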

  1. Hardware platforms for MEMS gyroscope tuning based on evolutionary computation using open-loop and closed-loop frequency response

    NASA Technical Reports Server (NTRS)

    Keymeulen, Didier; Ferguson, Michael I.; Fink, Wolfgang; Oks, Boris; Peay, Chris; Terrile, Richard; Cheng, Yen; Kim, Dennis; MacDonald, Eric; Foor, David

    2005-01-01

    We propose a tuning method for MEMS gyroscopes based on evolutionary computation that efficiently increases the sensitivity of MEMS gyroscopes. The tuning method was tested for the second-generation JPL/Boeing Post-resonator MEMS gyroscope using the measurement of the frequency response of the MEMS device in open-loop operation. We also report on the development of a hardware platform for integrated tuning and closed-loop operation of MEMS gyroscopes. The control of this device is implemented through a digital design on a Field Programmable Gate Array (FPGA). The hardware platform easily transitions to an embedded solution that allows for the miniaturization of the system to a single chip.
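
    The tuning loop can be pictured as a simple evolutionary search over electrode bias voltages that minimizes the measured split between the device's two resonant modes. The sketch below is schematic: measure_frequency_split is a placeholder for the actual open-loop frequency-response measurement on the hardware, replaced here by a toy quadratic with a known optimum.

      import random

      def measure_frequency_split(voltages):
          # Placeholder for the open-loop frequency-response measurement:
          # in the real setup this drives the MEMS device and returns the
          # separation (Hz) between its two resonant modes. A toy stand-in.
          target = [2.1, -1.3, 0.7, 3.4]
          return sum((v - t) ** 2 for v, t in zip(voltages, target))

      def tune(pop_size=30, generations=200, n_electrodes=4):
          pop = [[random.uniform(-5, 5) for _ in range(n_electrodes)]
                 for _ in range(pop_size)]
          for _ in range(generations):
              pop.sort(key=measure_frequency_split)   # smaller split is better
              elite = pop[: pop_size // 4]
              pop = elite + [[v + random.gauss(0, 0.2)
                              for v in random.choice(elite)]
                             for _ in range(pop_size - len(elite))]
          return pop[0]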

  2. EU Funded Research Activities on NPPS Operational Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manolatos, P.; Van Goethem, G.

    2002-07-01

    The 5th Framework Programme (FP-5), the pluri-annual research programme of the European Union (EU), covers the period 1998-2002. Research on nuclear energy, fusion and fission, is covered by the EURATOM part of the FP-5. An overview of Euratom's research on Nuclear Reactor Safety, managed by the DG-RTD of the European Commission (EC), is presented. This concerns 70 multi-partner projects of approximately euro 82.5 million total contract value that have been selected and co-financed during the period 1999-2001. They form three clusters of projects dealing with the 'Operational Safety of Existing Installations': 'Plant Life Extension and Management' (PLEM), 'Severe Accident Management' (SAM) and 'Evolutionary Concepts' (EVOL). Emphasis is given here to the projects of the PLEM cluster. (authors)

  3. The Dynamo package for tomography and subtomogram averaging: components for MATLAB, GPU computing and EC2 Amazon Web Services

    PubMed Central

    Castaño-Díez, Daniel

    2017-01-01

    Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistic-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphical processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and long-term software maintenance. PMID:28580909

  4. The Dynamo package for tomography and subtomogram averaging: components for MATLAB, GPU computing and EC2 Amazon Web Services.

    PubMed

    Castaño-Díez, Daniel

    2017-06-01

    Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistic-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphical processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and long-term software maintenance.

  5. Evolution of Collective Behaviour in an Artificial World Using Linguistic Fuzzy Rule-Based Systems

    PubMed Central

    Lebar Bajec, Iztok

    2017-01-01

    Collective behaviour is a fascinating and easily observable phenomenon, attractive to a wide range of researchers. In biology, computational models have been extensively used to investigate various properties of collective behaviour, such as: transfer of information across the group, benefits of grouping (defence against predation, foraging), group decision-making process, and group behaviour types. The question ‘why’, however, remains largely unanswered. Here the interest is in which pressures led to the evolution of such behaviour, and evolutionary computational models have already been used to test various biological hypotheses. Most of these models use genetic algorithms to tune the parameters of previously presented non-evolutionary models, but very few attempt to evolve collective behaviour from scratch. Of these, the successful attempts display clumping or swarming behaviour. Empirical evidence suggests that in fish schools there exist three classes of behaviour: swarming, milling and polarized. In this paper we present a novel, artificial life-like evolutionary model, where individual agents are governed by linguistic fuzzy rule-based systems, which is capable of evolving all three classes of behaviour. PMID:28045964

  6. An improved shuffled frog leaping algorithm based evolutionary framework for currency exchange rate prediction

    NASA Astrophysics Data System (ADS)

    Dash, Rajashree

    2017-11-01

    Forecasting the purchasing power of one currency with respect to another is always an interesting topic in the field of financial time series prediction. Despite the existence of several traditional and computational models for currency exchange rate forecasting, there is always a need for a simpler and more efficient model with better prediction capability. In this paper, an evolutionary framework is proposed by using an improved shuffled frog leaping (ISFL) algorithm with a computationally efficient functional link artificial neural network (CEFLANN) for prediction of currency exchange rates. The model is validated by observing the monthly prediction measures obtained for three currency exchange data sets, USD/CAD, USD/CHF, and USD/JPY, accumulated within the same period of time. The model performance is also compared with two other evolutionary learning techniques, the shuffled frog leaping algorithm and the particle swarm optimization algorithm. Practical analysis of the results suggests that the proposed model, developed using the ISFL algorithm with the CEFLANN network, is a promising predictor for currency exchange rate prediction compared to the other models included in the study.
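
    For readers unfamiliar with the base algorithm, one shuffled-frog-leaping generation follows the outline below (the standard SFLA, not the paper's improved variant; the memeplex count and the random-restart range are illustrative).

      import random

      def sfla_step(frogs, fitness, n_memeplexes=5):
          # One SFLA generation (minimization): sort the frogs, deal them
          # into memeplexes round-robin, then move each memeplex's worst
          # frog toward its best frog; if that fails to improve it, the
          # worst frog is replaced by a random restart.
          frogs.sort(key=fitness)
          memeplexes = [frogs[i::n_memeplexes] for i in range(n_memeplexes)]
          for mem in memeplexes:
              best, worst = mem[0], mem[-1]
              candidate = [w + random.random() * (b - w)
                           for b, w in zip(best, worst)]
              if fitness(candidate) < fitness(worst):
                  mem[-1] = candidate
              else:
                  mem[-1] = [random.uniform(-1, 1) for _ in worst]  # restart
          return [frog for mem in memeplexes for frog in mem]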

  7. Evolution of Collective Behaviour in an Artificial World Using Linguistic Fuzzy Rule-Based Systems.

    PubMed

    Demšar, Jure; Lebar Bajec, Iztok

    2017-01-01

    Collective behaviour is a fascinating and easily observable phenomenon, attractive to a wide range of researchers. In biology, computational models have been extensively used to investigate various properties of collective behaviour, such as: transfer of information across the group, benefits of grouping (defence against predation, foraging), group decision-making process, and group behaviour types. The question 'why', however, remains largely unanswered. Here the interest is in which pressures led to the evolution of such behaviour, and evolutionary computational models have already been used to test various biological hypotheses. Most of these models use genetic algorithms to tune the parameters of previously presented non-evolutionary models, but very few attempt to evolve collective behaviour from scratch. Of these, the successful attempts display clumping or swarming behaviour. Empirical evidence suggests that in fish schools there exist three classes of behaviour: swarming, milling and polarized. In this paper we present a novel, artificial life-like evolutionary model, where individual agents are governed by linguistic fuzzy rule-based systems, which is capable of evolving all three classes of behaviour.

  8. Life's attractors : understanding developmental systems through reverse engineering and in silico evolution.

    PubMed

    Jaeger, Johannes; Crombach, Anton

    2012-01-01

    We propose an approach to evolutionary systems biology which is based on reverse engineering of gene regulatory networks and in silico evolutionary simulations. We infer regulatory parameters for gene networks by fitting computational models to quantitative expression data. This allows us to characterize the regulatory structure and dynamical repertoire of evolving gene regulatory networks with a reasonable amount of experimental and computational effort. We use the resulting network models to identify those regulatory interactions that are conserved, and those that have diverged between different species. Moreover, we use the models obtained by data fitting as starting points for simulations of evolutionary transitions between species. These simulations enable us to investigate whether such transitions are random, or whether they show stereotypical series of regulatory changes which depend on the structure and dynamical repertoire of an evolving network. Finally, we present a case study, the gap gene network in dipterans (flies, midges, and mosquitoes), to illustrate the practical application of the proposed methodology, and to highlight the kind of biological insights that can be gained by this approach.

  9. An Improved Co-evolutionary Particle Swarm Optimization for Wireless Sensor Networks with Dynamic Deployment

    PubMed Central

    Wang, Xue; Wang, Sheng; Ma, Jun-Jie

    2007-01-01

    The effectiveness of wireless sensor networks (WSNs) depends on the coverage and target detection probability provided by dynamic deployment, which is usually supported by the virtual force (VF) algorithm. However, in the VF algorithm, the virtual force exerted by stationary sensor nodes will hinder the movement of mobile sensor nodes. Particle swarm optimization (PSO) is introduced as another dynamic deployment algorithm, but in this case the computation time required is the main bottleneck. This paper proposes a dynamic deployment algorithm which is named “virtual force directed co-evolutionary particle swarm optimization” (VFCPSO), since this algorithm combines the co-evolutionary particle swarm optimization (CPSO) with the VF algorithm, whereby the CPSO uses multiple swarms to optimize different components of the solution vectors for dynamic deployment cooperatively and the velocity of each particle is updated according to not only the historical local and global optimal solutions, but also the virtual forces of sensor nodes. Simulation results demonstrate that the proposed VFCPSO is competent for dynamic deployment in WSNs and has better performance with respect to computation time and effectiveness than the VF, PSO and VFPSO algorithms.
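
    The key modification is visible in the velocity update: alongside the usual pulls toward the personal and global bests, a third term is driven by the virtual force acting on the sensor node. A sketch in that spirit, with illustrative coefficient values rather than those of the paper:

      import random

      def vfcpso_velocity(v, x, pbest, gbest, vforce,
                          w=0.7, c1=1.5, c2=1.5, c3=0.5):
          # Per-dimension update: inertia, cognitive and social terms as in
          # standard PSO, plus a virtual-force term (vforce) on the node.
          return [w * vi
                  + c1 * random.random() * (pb - xi)
                  + c2 * random.random() * (gb - xi)
                  + c3 * random.random() * f
                  for vi, xi, pb, gb, f in zip(v, x, pbest, gbest, vforce)]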

  10. The tangled bank of amino acids

    PubMed Central

    Pollock, David D.

    2016-01-01

    The use of amino acid substitution matrices to model protein evolution has yielded important insights into both the evolutionary process and the properties of specific protein families. In order to make these models tractable, standard substitution matrices represent the average results of the evolutionary process rather than the underlying molecular biophysics and population genetics, treating proteins as a set of independently evolving sites rather than as an integrated biomolecular entity. With advances in computing and the increasing availability of sequence data, we now have an opportunity to move beyond current substitution matrices to more interpretable mechanistic models with greater fidelity to the evolutionary process of mutation and selection and the holistic nature of the selective constraints. As part of this endeavour, we consider how epistatic interactions induce spatial and temporal rate heterogeneity, and demonstrate how these generally ignored factors can reconcile standard substitution rate matrices and the underlying biology, allowing us to better understand the meaning of these substitution rates. Using computational simulations of protein evolution, we can demonstrate the importance of both spatial and temporal heterogeneity in modelling protein evolution. PMID:27028523

  11. Evolutionary Optimization of a Geometrically Refined Truss

    NASA Technical Reports Server (NTRS)

    Hull, P. V.; Tinker, M. L.; Dozier, G. V.

    2007-01-01

    Structural optimization is a field of research that has experienced noteworthy growth for many years. Researchers in this area have developed optimization tools to successfully design and model structures, typically minimizing mass while maintaining certain deflection and stress constraints. Numerous optimization studies have been performed to minimize mass, deflection, and stress on a benchmark cantilever truss problem. Predominantly, traditional optimization theory has been applied to this problem. The cross-sectional area of each member is optimized to minimize the aforementioned objectives. This Technical Publication (TP) presents a structural optimization technique that has been previously applied to compliant mechanism design. This technique demonstrates a method that combines topology optimization, geometric refinement, finite element analysis, and two forms of evolutionary computation (genetic algorithms and differential evolution) to successfully optimize a benchmark structural optimization problem. A nontraditional solution to the benchmark problem is presented in this TP, specifically a geometrically refined topological solution. The design process begins with an alternate control mesh formulation, multilevel geometric smoothing operation, and an elastostatic structural analysis. The design process is wrapped in an evolutionary computing optimization toolset.

  12. On the numerical treatment of selected oscillatory evolutionary problems

    NASA Astrophysics Data System (ADS)

    Cardone, Angelamaria; Conte, Dajana; D'Ambrosio, Raffaele; Paternoster, Beatrice

    2017-07-01

    We focus on evolutionary problems whose qualitative behaviour is known a priori and exploited in order to provide efficient and accurate numerical schemes. For classical numerical methods depending on constant coefficients, the required computational effort can be quite heavy, due to the very small stepsizes needed to accurately reproduce the qualitative behaviour of the solution. In these situations, it may be convenient to use special purpose formulae, i.e. formulae non-polynomially fitted on basis functions adapted to the problem (see [16, 17] and references therein). We show examples of special purpose strategies to solve two families of evolutionary problems exhibiting periodic solutions, i.e. partial differential equations and Volterra integral equations.

  13. Human evolutionary genomics: ethical and interpretive issues.

    PubMed

    Vitti, Joseph J; Cho, Mildred K; Tishkoff, Sarah A; Sabeti, Pardis C

    2012-03-01

    Genome-wide computational studies can now identify targets of natural selection. The unique information about humans these studies reveal, and the media attention they attract, indicate the need for caution and precision in communicating results. This need is exacerbated by ways in which evolutionary and genetic considerations have been misapplied to support discriminatory policies, by persistent misconceptions of these fields and by the social sensitivity surrounding discussions of racial ancestry. We discuss the foundations, accomplishments and future directions of human evolutionary genomics, attending to ways in which the interpretation of good science can go awry, and offer suggestions for researchers to prevent misapplication of their work. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. Inferring population history with DIY ABC: a user-friendly approach to approximate Bayesian computation.

    PubMed

    Cornuet, Jean-Marie; Santos, Filipe; Beaumont, Mark A; Robert, Christian P; Marin, Jean-Michel; Balding, David J; Guillemaud, Thomas; Estoup, Arnaud

    2008-12-01

    Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in which scenarios can be customized by the user to fit many complex situations involving any number of populations and samples. Such scenarios involve any combination of population divergences, admixtures and population size changes. DIY ABC can be used to compare competing scenarios, estimate parameters for one or more scenarios and compute bias and precision measures for a given scenario and known values of parameters (the current version applies to unlinked microsatellite data). This article describes key methods used in the program and provides its main features. The analysis of one simulated and one real dataset, both with complex evolutionary scenarios, illustrates the main possibilities of DIY ABC. The software DIY ABC is freely available at http://www.montpellier.inra.fr/CBGP/diyabc.
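
    The rejection-sampling core that ABC methods of this kind build on can be stated in a few lines. The callable interface below (prior_sample, simulate, distance) is an illustrative sketch, not DIY ABC's actual API.

      def abc_rejection(observed_stats, prior_sample, simulate, distance,
                        n_draws=100_000, epsilon=0.05):
          # Draw parameters from the prior, simulate data under each draw,
          # and keep the draws whose summary statistics fall within epsilon
          # of the observed statistics; the kept draws approximate the
          # posterior. prior_sample, simulate and distance are user-supplied.
          accepted = []
          for _ in range(n_draws):
              theta = prior_sample()
              stats = simulate(theta)
              if distance(stats, observed_stats) < epsilon:
                  accepted.append(theta)
          return accepted  # approximate posterior sample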

  15. Image coding using entropy-constrained residual vector quantization

    NASA Technical Reports Server (NTRS)

    Kossentini, Faouzi; Smith, Mark J. T.; Barnes, Christopher F.

    1993-01-01

    The residual vector quantization (RVQ) structure is exploited to produce a variable-length codeword RVQ. Necessary conditions for the optimality of this RVQ are presented, and a new entropy-constrained RVQ (EC-RVQ) design algorithm is shown to be very effective in designing RVQ codebooks over a wide range of bit rates and vector sizes. The new EC-RVQ has several important advantages. It can outperform entropy-constrained VQ (ECVQ) in terms of peak signal-to-noise ratio (PSNR), memory, and computation requirements. It can also be used to design high-rate codebooks and codebooks with relatively large vector sizes. Experimental results indicate that when the new EC-RVQ is applied to image coding, very high quality is achieved at relatively low bit rates.
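
    The Lagrangian trade-off at the heart of entropy-constrained designs is easy to show: each encoding choice is charged its distortion plus lambda times its rate, and the cheapest codeword wins. A simplified single-stage sketch; the RVQ case applies this stage by stage to the residuals.

      def ec_encode(x, codebook, rates, lam):
          # Entropy-constrained codeword choice: pick the code vector
          # minimizing distortion + lam * rate, where rates[i] is the bit
          # length of codeword i's variable-length code. This is the core
          # Lagrangian trade-off behind EC-VQ/EC-RVQ designs (sketch only).
          def cost(i):
              d = sum((xi - ci) ** 2 for xi, ci in zip(x, codebook[i]))
              return d + lam * rates[i]
          return min(range(len(codebook)), key=cost)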

  16. Evolving Non-Dominated Parameter Sets for Computational Models from Multiple Experiments

    NASA Astrophysics Data System (ADS)

    Lane, Peter C. R.; Gobet, Fernand

    2013-03-01

    Creating robust, reproducible and optimal computational models is a key challenge for theorists in many sciences. Psychology and cognitive science face particular challenges as large amounts of data are collected and many models are not amenable to analytical techniques for calculating parameter sets. Particular problems are to locate the full range of acceptable model parameters for a given dataset, and to confirm the consistency of model parameters across different datasets. Resolving these problems will provide a better understanding of the behaviour of computational models, and so support the development of general and robust models. In this article, we address these problems using evolutionary algorithms to develop parameters for computational models against multiple sets of experimental data; in particular, we propose the 'speciated non-dominated sorting genetic algorithm' for evolving models in several theories. We discuss the problem of developing a model of categorisation using twenty-nine sets of data and models drawn from four different theories. We find that the evolutionary algorithms generate high quality models, adapted to provide a good fit to all available data.
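
    The building block of NSGA-style algorithms, extraction of the first non-dominated front, is compact enough to show directly (minimization convention; the speciation mechanism the article adds is omitted):

      def dominates(a, b):
          # True if objective vector a Pareto-dominates b (minimization):
          # a is no worse on every objective and strictly better on one.
          return (all(x <= y for x, y in zip(a, b))
                  and any(x < y for x, y in zip(a, b)))

      def nondominated_front(points):
          # First non-dominated front: the points no other point dominates.
          return [p for p in points
                  if not any(dominates(q, p) for q in points if q is not p)]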

  17. A program to compute the soft Robinson-Foulds distance between phylogenetic networks.

    PubMed

    Lu, Bingxin; Zhang, Louxin; Leong, Hon Wai

    2017-03-14

    Over the past two decades, phylogenetic networks have been studied to model reticulate evolutionary events. The relationships among phylogenetic networks, phylogenetic trees and clusters serve as the basis for the reconstruction and comparison of phylogenetic networks. To understand these relationships, two problems are raised: the tree containment problem, which asks whether a phylogenetic tree is displayed in a phylogenetic network, and the cluster containment problem, which asks whether a cluster is represented at a node in a phylogenetic network. Both problems are NP-complete. A fast exponential-time algorithm for the cluster containment problem on arbitrary networks is developed and implemented in C. The resulting program is further extended into a computer program for fast computation of the soft Robinson-Foulds distance between phylogenetic networks. Two computer programs are developed for facilitating the reconstruction and validation of phylogenetic network models in evolutionary and comparative genomics. Our simulation tests indicated that they are fast enough for use in practice. Additionally, our simulation data show that the distribution of the soft Robinson-Foulds distance between phylogenetic networks is unlikely to be normal.
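
    For orientation, the classical Robinson-Foulds distance on trees is simply the size of the symmetric difference between the two cluster (bipartition) sets; the 'soft' network variant instead asks whether each cluster is represented somewhere in the other network, which is where the NP-completeness enters. A sketch of the tree case only:

      def rf_distance(clusters_a, clusters_b):
          # Robinson-Foulds distance between two trees given as collections
          # of clusters, each cluster a frozenset of taxon labels: the count
          # of clusters present in one tree but not the other.
          a, b = set(clusters_a), set(clusters_b)
          return len(a ^ b)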

  18. 77 FR 46749 - Tests Determined To Be Suitable for Use in the National Reporting System for Adult Education

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-06

    ... Student Assessment Systems (CASAS) Life Skills Math Assessments--Application of Mathematics (Secondary... Proficiency Test (MAPT) for Math. This test is approved for use through a computer-adaptive delivery format...) Employability Competency System (ECS) Math Assessments--Workforce Learning Systems (WLS). Forms 11, 12, 13, 14...

  19. 76 FR 56188 - Tests Determined To Be Suitable for Use in the National Reporting System for Adult Education

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-12

    ... (CASAS) Life Skills Math Assessments--Application of Mathematics (Secondary Level). We are clarifying... Proficiency Test (MAPT) for Math. We are clarifying that the computer-adaptive test (CAT) is an approved...): (1) Comprehensive Adult Student Assessment Systems (CASAS) Employability Competency System (ECS) Math...

  20. Operational Test and Evaluation Handbook for Aircrew Training Devices. Volume I. Planning and Management.

    DTIC Science & Technology

    1982-02-01

    ... and to develop an awareness of the T&E roles and responsibilities of the various Air Force organizations involved in the T&E process... mathematical models to determine controller messages and issue controller messages using computer-generated speech. AUTOMATED PERFORMANCE ALERTS: Signals

  1. Multiobjective Multifactorial Optimization in Evolutionary Multitasking.

    PubMed

    Gupta, Abhishek; Ong, Yew-Soon; Feng, Liang; Tan, Kay Chen

    2016-05-03

    In recent decades, the field of multiobjective optimization has attracted considerable interest among evolutionary computation researchers. One of the main features that makes evolutionary methods particularly appealing for multiobjective problems is the implicit parallelism offered by a population, which enables simultaneous convergence toward the entire Pareto front. While a plethora of related algorithms have been proposed to date, a common attribute among them is that they focus on efficiently solving only a single optimization problem at a time. Despite the known power of implicit parallelism, seldom has an attempt been made to multitask, i.e., to solve multiple optimization problems simultaneously. It is contended that the notion of evolutionary multitasking leads to the possibility of automated transfer of information across different optimization exercises that may share underlying similarities, thereby facilitating improved convergence characteristics. In particular, the potential for automated transfer is deemed invaluable from the standpoint of engineering design exercises where manual knowledge adaptation and reuse are routine. Accordingly, in this paper, we present a realization of the evolutionary multitasking paradigm within the domain of multiobjective optimization. The efficacy of the associated evolutionary algorithm is demonstrated on some benchmark test functions as well as on a real-world manufacturing process design problem from the composites industry.
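
    In multifactorial evolutionary algorithms of this kind, each individual is typically ranked on every task; its skill factor is the task on which it ranks best, and its scalar fitness is the reciprocal of that best rank. A bookkeeping sketch with illustrative field names (not the paper's notation):

      def scalar_fitness(population, n_tasks):
          # population: list of dicts with per-task objective values under
          # key 'f' (minimization). Assigns each individual a per-task rank,
          # a skill factor (task of best rank) and a scalar fitness.
          for t in range(n_tasks):
              ranked = sorted(population, key=lambda ind: ind['f'][t])
              for r, ind in enumerate(ranked, start=1):
                  ind.setdefault('rank', {})[t] = r
          for ind in population:
              ind['skill'] = min(ind['rank'], key=ind['rank'].get)
              ind['scalar_fitness'] = 1.0 / ind['rank'][ind['skill']]
          return population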

  2. A multi-platform evaluation of the randomized CX low-rank matrix factorization in Spark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gittens, Alex; Kottalam, Jey; Yang, Jiyan

    We investigate the performance and scalability of the randomized CX low-rank matrix factorization and demonstrate its applicability through the analysis of a 1TB mass spectrometry imaging (MSI) dataset, using Apache Spark on an Amazon EC2 cluster, a Cray XC40 system, and an experimental Cray cluster. We implemented this factorization both as a parallelized C implementation with hand-tuned optimizations and in Scala using the Apache Spark high-level cluster computing framework. We obtained consistent performance across the three platforms: using Spark we were able to process the 1TB size dataset in under 30 minutes with 960 cores on all systems, with the fastest times obtained on the experimental Cray cluster. In comparison, the C implementation was 21X faster on the Amazon EC2 system, due to careful cache optimizations, bandwidth-friendly access of matrices and vector computation using SIMD units. We report these results and their implications on the hardware and software issues arising in supporting data-centric workloads in parallel and distributed environments.
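
    The randomized CX factorization itself is short to state: sample columns of A with probability proportional to their (approximate) leverage scores, then solve for X so that A is well approximated by C X. A serial NumPy sketch; the paper's implementations add randomized SVD and distributed execution in Spark and C.

      import numpy as np

      def randomized_cx(A, k, c, rng=np.random.default_rng(0)):
          # Column leverage scores from the top-k right singular vectors,
          # column sampling proportional to them, then a least-squares
          # solve for X. Simplified; exact SVD stands in for the paper's
          # randomized approximation.
          _, _, Vt = np.linalg.svd(A, full_matrices=False)
          lev = (Vt[:k] ** 2).sum(axis=0) / k      # sums to 1
          cols = rng.choice(A.shape[1], size=c, replace=False,
                            p=lev / lev.sum())
          C = A[:, cols]
          X = np.linalg.lstsq(C, A, rcond=None)[0]
          return C, X, cols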

  3. A high performance scientific cloud computing environment for materials simulations

    NASA Astrophysics Data System (ADS)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.

  4. Replaying evolutionary transitions from the dental fossil record

    PubMed Central

    Harjunmaa, Enni; Seidel, Kerstin; Häkkinen, Teemu; Renvoisé, Elodie; Corfe, Ian J.; Kallonen, Aki; Zhang, Zhao-Qun; Evans, Alistair R.; Mikkola, Marja L.; Salazar-Ciudad, Isaac; Klein, Ophir D.; Jernvall, Jukka

    2014-01-01

    The evolutionary relationships of extinct species are ascertained primarily through the analysis of morphological characters. Character inter-dependencies can have a substantial effect on evolutionary interpretations, but the developmental underpinnings of character inter-dependence remain obscure because experiments frequently do not provide detailed resolution of morphological characters. Here we show experimentally and computationally how gradual modification of development differentially affects characters in the mouse dentition. We found that intermediate phenotypes could be produced by gradually adding ectodysplasin A (EDA) protein in culture to tooth explants carrying a null mutation in the tooth-patterning gene Eda. By identifying development-based character interdependencies, we show how to predict morphological patterns of teeth among mammalian species. Finally, in vivo inhibition of sonic hedgehog signalling in Eda null teeth enabled us to reproduce characters deep in the rodent ancestry. Taken together, evolutionarily informative transitions can be experimentally reproduced, thereby providing development-based expectations for character state transitions used in evolutionary studies. PMID:25079326

  5. Selection on Network Dynamics Drives Differential Rates of Protein Domain Evolution

    PubMed Central

    Mannakee, Brian K.; Gutenkunst, Ryan N.

    2016-01-01

    The long-held principle that functionally important proteins evolve slowly has recently been challenged by studies in mice and yeast showing that the severity of a protein knockout only weakly predicts that protein’s rate of evolution. However, the relevance of these studies to evolutionary changes within proteins is unknown, because amino acid substitutions, unlike knockouts, often only slightly perturb protein activity. To quantify the phenotypic effect of small biochemical perturbations, we developed an approach to use computational systems biology models to measure the influence of individual reaction rate constants on network dynamics. We show that this dynamical influence is predictive of protein domain evolutionary rate within networks in vertebrates and yeast, even after controlling for expression level and breadth, network topology, and knockout effect. Thus, our results not only demonstrate the importance of protein domain function in determining evolutionary rate, but also the power of systems biology modeling to uncover unanticipated evolutionary forces. PMID:27380265

  6. Incorporating evolutionary processes into population viability models.

    PubMed

    Pierson, Jennifer C; Beissinger, Steven R; Bragg, Jason G; Coates, David J; Oostermeijer, J Gerard B; Sunnucks, Paul; Schumaker, Nathan H; Trotter, Meredith V; Young, Andrew G

    2015-06-01

    We examined how ecological and evolutionary (eco-evo) processes in population dynamics could be better integrated into population viability analysis (PVA). Complementary advances in computation and population genomics can be combined into an eco-evo PVA to offer powerful new approaches to understand the influence of evolutionary processes on population persistence. We developed the mechanistic basis of an eco-evo PVA using individual-based models with individual-level genotype tracking and dynamic genotype-phenotype mapping to model emergent population-level effects, such as local adaptation and genetic rescue. We then outline how genomics can allow or improve parameter estimation for PVA models by providing genotypic information at large numbers of loci for neutral and functional genome regions. As climate change and other threatening processes increase in rate and scale, eco-evo PVAs will become essential research tools to evaluate the effects of adaptive potential, evolutionary rescue, and locally adapted traits on persistence. © 2014 Society for Conservation Biology.

  7. Historical Contingency in Controlled Evolution

    NASA Astrophysics Data System (ADS)

    Schuster, Peter

    2014-12-01

    A basic question in evolution concerns the nature of evolutionary memory. At thermodynamic equilibrium, at stable stationary states or other stable attractors, the memory of the path leading to the long-time solution is erased, at least in part. Similar arguments hold for unique optima. Optimality in biology is discussed on the basis of microbial metabolism. Biology, on the other hand, is characterized by historical contingency, which has recently become accessible to experimental test in bacterial populations evolving under controlled conditions. Computer simulations give additional insight into the nature of the evolutionary memory, which is ultimately caused by the enormous space of possibilities, so large that it escapes all attempts at visualization. In essence, this contribution is dealing with two questions of current evolutionary theory: (i) Are organisms operating at optimal performance? and (ii) How is the evolutionary memory built up in populations?

  8. Modeling drain current of indium zinc oxide thin film transistors prepared by solution deposition technique

    NASA Astrophysics Data System (ADS)

    Qiang, Lei; Liang, Xiaoci; Cai, Guangshuo; Pei, Yanli; Yao, Ruohe; Wang, Gang

    2018-06-01

    Indium zinc oxide (IZO) thin film transistors (TFTs) deposited by a solution method are of considerable technological interest, as they are key components for the fabrication of flexible and cheap transparent electronic devices. To obtain a principal understanding of the physical properties of solution-processed IZO TFTs, a new drain current model that accounts for the charge transport is proposed. The formulation is developed by incorporating the effect of gate voltage on mobility and threshold voltage with the carrier charges. It is demonstrated that in IZO TFTs the below-threshold regime should be divided into two sections: EC - EF > 3kT and EC - EF ≤ 3kT, where kT is the thermal energy and EF and EC represent the Fermi level and the conduction band edge, respectively. Additionally, in order to describe the conduction mechanisms more accurately, the extended mobility edge model is incorporated, which also avoids complicated and lengthy computations. The good agreement between measured and calculated results confirms the efficiency of this model for the design of integrated large-area thin film circuits.
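
    The regime split can be made concrete with a small helper that classifies the operating point by the position of the Fermi level relative to the conduction band edge; this is a schematic reading of the abstract, not the paper's full formulation.

      def below_threshold_regime(E_C, E_F, kT=0.0259):
          # Energies in eV; kT defaults to the room-temperature thermal
          # energy. The 3kT split roughly marks where the Boltzmann
          # approximation for the free-carrier density stops being safe.
          gap = E_C - E_F
          if gap > 3 * kT:
              return 'nondegenerate (EC - EF > 3kT)'
          return 'near-degenerate (EC - EF <= 3kT)'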

  9. Human vs. Computer Diagnosis of Students' Natural Selection Knowledge: Testing the Efficacy of Text Analytic Software

    NASA Astrophysics Data System (ADS)

    Nehm, Ross H.; Haertig, Hendrik

    2012-02-01

    Our study examines the efficacy of Computer Assisted Scoring (CAS) of open-response text relative to expert human scoring within the complex domain of evolutionary biology. Specifically, we explored whether CAS can diagnose the explanatory elements (or Key Concepts) that comprise undergraduate students' explanatory models of natural selection with the same fidelity as expert human scorers in a sample of >1,000 essays. We used SPSS Text Analysis 3.0 to perform our CAS and measure Kappa values (inter-rater reliability) of KC detection (i.e., computer-human rating correspondence). Our first analysis indicated that the text analysis functions (or extraction rules) developed and deployed in SPSS Text Analysis to extract individual Key Concepts (KCs) from three different items differing in several surface features (e.g., taxon, trait, type of evolutionary change) produced "substantial" (Kappa 0.61-0.80) or "almost perfect" (0.81-1.00) agreement. The second analysis explored the measurement of human-computer correspondence for KC diversity (the number of different accurate knowledge elements) in the combined sample of all 827 essays. Here we found outstanding correspondence; extraction rules generated using one prompt type are broadly applicable to other evolutionary scenarios (e.g., bacterial resistance, cheetah running speed, etc.). This result is encouraging, as it suggests that the development of new item sets may not necessitate the development of new text analysis rules. Overall, our findings suggest that CAS tools such as SPSS Text Analysis may compensate for some of the intrinsic limitations of currently used multiple-choice Concept Inventories designed to measure student knowledge of natural selection.
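
    The agreement statistic used throughout the study, Cohen's kappa, is easy to compute from paired labels; a minimal sketch for categorical concept-detection codes:

      from collections import Counter

      def cohens_kappa(human, computer):
          # Observed agreement minus chance agreement, normalized by the
          # maximum possible non-chance agreement; human and computer are
          # equal-length sequences of categorical labels.
          n = len(human)
          po = sum(h == c for h, c in zip(human, computer)) / n
          ph, pc = Counter(human), Counter(computer)
          pe = sum(ph[k] * pc[k] for k in set(human) | set(computer)) / n ** 2
          return (po - pe) / (1 - pe)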

  10. Computing the origin and evolution of the ribosome from its structure — Uncovering processes of macromolecular accretion benefiting synthetic biology

    PubMed Central

    Caetano-Anollés, Gustavo; Caetano-Anollés, Derek

    2015-01-01

    Accretion occurs pervasively in nature at widely different timeframes. The process also manifests in the evolution of macromolecules. Here we review recent computational and structural biology studies of evolutionary accretion that make use of the ideographic (historical, retrodictive) and nomothetic (universal, predictive) scientific frameworks. Computational studies uncover explicit timelines of accretion of structural parts in molecular repertoires and molecules. Phylogenetic trees of protein structural domains and proteomes and their molecular functions were built from a genomic census of millions of encoded proteins and associated terminal Gene Ontology terms. Trees reveal a ‘metabolic-first’ origin of proteins, the late development of translation, and a patchwork distribution of proteins in biological networks mediated by molecular recruitment. Similarly, the natural history of ancient RNA molecules inferred from trees of molecular substructures built from a census of molecular features shows patchwork-like accretion patterns. Ideographic analyses of ribosomal history uncover the early appearance of structures supporting mRNA decoding and tRNA translocation, the coevolution of ribosomal proteins and RNA, and a first evolutionary transition that brings ribosomal subunits together into a processive protein biosynthetic complex. Nomothetic structural biology studies of tertiary interactions and ancient insertions in rRNA complement these findings, once concentric layering assumptions are removed. Patterns of coaxial helical stacking reveal a frustrated dynamics of outward and inward ribosomal growth possibly mediated by structural grafting. The early rise of the ribosomal ‘turnstile’ suggests an evolutionary transition in natural biological computation. Results make explicit the need to understand processes of molecular growth and information transfer of macromolecules. PMID:27096056

  11. Evolutionary history of the enolase gene family.

    PubMed

    Tracy, M R; Hedges, S B

    2000-12-23

    The enzyme enolase [EC 4.2.1.11] is found in all organisms, with vertebrates exhibiting tissue-specific isozymes encoded by three genes: alpha (α), beta (β), and gamma (γ) enolase. Limited taxonomic sampling of enolase has obscured the timing of gene duplication events. To help clarify the evolutionary history of the gene family, cDNAs were sequenced from six taxa representing major lineages of vertebrates: Chiloscyllium punctatum (shark), Amia calva (bowfin), Salmo trutta (trout), Latimeria chalumnae (coelacanth), Lepidosiren paradoxa (South American lungfish), and Neoceratodus forsteri (Australian lungfish). Phylogenetic analysis of all enolase and related gene sequences revealed an early gene duplication event prior to the last common ancestor of living organisms. Several distantly related archaebacterial sequences were designated as 'enolase-2', whereas all other enolase sequences were designated 'enolase-1'. Two of the three isozymes of enolase-1, alpha- and beta-enolase, were discovered in actinopterygian, sarcopterygian, and chondrichthian fishes. Phylogenetic analysis of vertebrate enolases revealed that the two gene duplications leading to the three isozymes of enolase-1 occurred subsequent to the divergence of living agnathans, near the Proterozoic/Phanerozoic boundary (approximately 550 Mya). Two copies of enolase, designated alpha(1) and alpha(2), were found in the trout and are presumed to be the result of a genome duplication event.

  12. De novo assembly of the transcriptome of Aegiceras corniculatum, a mangrove species in the Indo-West Pacific region.

    PubMed

    Fang, Lu; Yang, Yuchen; Guo, Wuxia; Li, Jianfang; Zhong, Cairong; Huang, Yelin; Zhou, Renchao; Shi, Suhua

    2016-08-01

    Aegiceras corniculatum (L.) Blanco is one of the most salt-tolerant mangrove species and can thrive in 3% salinity at the seaward edge of mangrove forests. Here we sequenced the transcriptome of A. corniculatum using the Illumina GA platform to develop genomic resources for ecological and evolutionary studies. We obtained about 50 million high-quality paired-end reads of 75 bp in length. Using the short read assembler Velvet, we yielded 49,437 contigs with an average length of 625 bp. A total of 32,744 (66.23%) contigs showed significant similarity to the GenBank non-redundant (NR) protein database. 30,911 and 18,004 of these sequences were assigned to Gene Ontology terms and eukaryotic orthologous groups of proteins (KOG), respectively. A total of 4942 transcripts from our assemblies had significant similarity with KEGG Orthologs and were involved in 144 KEGG pathways, while 9899 unigenes had enzyme commission (EC) numbers. In addition, 9792 transcriptome-derived SSRs were identified from 7342 sequences. With our strict criteria, 4165 candidate SNPs were also identified from 2058 contigs. Some of these SNPs were further validated by Sanger sequencing. The genomic resources generated in this study should be valuable in ecological, evolutionary, and functional genomics studies of this mangrove species. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Computer-automated evolution of an X-band antenna for NASA's Space Technology 5 mission.

    PubMed

    Hornby, Gregory S; Lohn, Jason D; Linden, Derek S

    2011-01-01

    Whereas the current practice of designing antennas by hand is severely limited because it is both time and labor intensive and requires a significant amount of domain knowledge, evolutionary algorithms can be used to search the design space and automatically find novel antenna designs that are more effective than would otherwise be developed. Here we present our work in using evolutionary algorithms to automatically design an X-band antenna for NASA's Space Technology 5 (ST5) spacecraft. Two evolutionary algorithms were used: the first uses a vector of real-valued parameters and the second uses a tree-structured generative representation for constructing the antenna. The highest-performance antennas from both algorithms were fabricated and tested and both outperformed a hand-designed antenna produced by the antenna contractor for the mission. Subsequent changes to the spacecraft orbit resulted in a change in requirements for the spacecraft antenna. By adjusting our fitness function we were able to rapidly evolve a new set of antennas for this mission in less than a month. One of these new antenna designs was built, tested, and approved for deployment on the three ST5 spacecraft, which were successfully launched into space on March 22, 2006. This evolved antenna design is the first computer-evolved antenna to be deployed for any application and is the first computer-evolved hardware in space.

  14. Unified model of brain tissue microstructure dynamically binds diffusion and osmosis with extracellular space geometry

    NASA Astrophysics Data System (ADS)

    Yousefnezhad, Mohsen; Fotouhi, Morteza; Vejdani, Kaveh; Kamali-Zare, Padideh

    2016-09-01

    We present a universal model of brain tissue microstructure that dynamically links osmosis and diffusion with geometrical parameters of brain extracellular space (ECS). Our model robustly describes and predicts the nonlinear time dependency of tortuosity (λ = √(D/D*)) changes with very high precision in various media with uniform and nonuniform osmolarity distribution, as demonstrated by previously published experimental data (D = free diffusion coefficient, D* = effective diffusion coefficient). To construct this model, we first developed a multiscale technique for computationally effective modeling of osmolarity in the brain tissue. Osmolarity differences across cell membranes lead to changes in the ECS dynamics. The evolution of the underlying dynamics is then captured by a level set method. Subsequently, using a homogenization technique, we derived a coarse-grained model with parameters that are explicitly related to the geometry of cells and their associated ECS. Our modeling results in a very accurate analytical approximation of tortuosity based on time, space, osmolarity differences across cell membranes, and water permeability of cell membranes. Our model provides a unique platform for studying ECS dynamics not only in physiologic conditions such as sleep-wake cycles and aging but also in pathologic conditions such as stroke, seizure, and neoplasia, as well as in predictive pharmacokinetic modeling such as predicting medication biodistribution and efficacy and novel biomolecule development and testing.
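
    A worked example of the tortuosity definition, with illustrative coefficient values:

      import math

      # lambda = sqrt(D / D*): a free diffusion coefficient D and a smaller
      # effective coefficient D* measured in tissue (values illustrative).
      D = 2.0e-9        # m^2/s, free medium
      D_star = 8.0e-10  # m^2/s, in tissue
      tortuosity = math.sqrt(D / D_star)   # ~= 1.58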

  15. A Comparative Study of Electric Load Curve Changes in an Urban Low-Voltage Substation in Spain during the Economic Crisis (2008–2013)

    PubMed Central

    Lara-Santillán, Pedro M.; Mendoza-Villena, Montserrat; Fernández-Jiménez, L. Alfredo; Mañana-Canteli, Mario

    2014-01-01

    This paper presents a comparative study of the electricity consumption (EC) in an urban low-voltage substation before and during the economic crisis (2008–2013). This low-voltage substation supplies electric power to nearly 400 users. The EC was measured over an 11-year period (2002–2012) with a sampling time of 1 minute. The study described in the paper consists of detecting the changes produced in the load curves of this substation over time due to changes in the behaviour of consumers. The EC was compared using representative curves per time period (precrisis and crisis). These representative curves were obtained after a computational process based on a search for days with curves similar to that of a determined (base) date. This similarity was assessed by proximity on the calendar, day of the week, daylight time, and outdoor temperature. The last selection parameter was the error between the nearest-neighbour curves and the base date curve. The obtained representative curves were linearized to determine changes in their structure (maximum and minimum consumption values, duration of the daily time slot, etc.). The results primarily indicate an increase in the EC in the night slot during the summer months in the crisis period. PMID:24895677
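
    The day-matching step can be sketched as a weighted nearest-neighbour search over historical days; the field names and weights below are illustrative, not the paper's exact selection criteria.

      def similar_days(base, days, k=10):
          # Rank historical days by a weighted distance combining calendar
          # proximity (wrapped day-of-year), day-of-week match, daylight
          # time and outdoor temperature; keep the k nearest. Each day is a
          # dict with keys 'doy', 'weekday', 'daylight_h', 'temp_c'.
          def distance(d):
              doy_gap = abs(d['doy'] - base['doy'])
              return (0.02 * min(doy_gap, 365 - doy_gap)
                      + (0.0 if d['weekday'] == base['weekday'] else 1.0)
                      + 0.1 * abs(d['daylight_h'] - base['daylight_h'])
                      + 0.05 * abs(d['temp_c'] - base['temp_c']))
          return sorted(days, key=distance)[:k]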

  16. Determining the effective coverage of maternal and child health services in Kenya, using demographic and health survey data sets: tracking progress towards universal health coverage.

    PubMed

    Nguhiu, Peter K; Barasa, Edwine W; Chuma, Jane

    2017-04-01

    Effective coverage (EC) is a measure of health systems' performance that combines need, use and quality indicators. This study aimed to assess the extent to which the Kenyan health system provides effective and equitable maternal and child health services, as a means of tracking the country's progress towards universal health coverage. The Demographic and Health Surveys (2003, 2008-2009 and 2014) and Service Provision Assessment surveys (2004, 2010) were the main sources of data. Indicators of need, use and quality for eight maternal and child health interventions were aggregated across interventions and economic quintiles to compute EC. EC has increased from 26.7% in 2003 to 50.9% in 2014, but remains low for the majority of interventions. There is a reduction in economic inequalities in EC, with the highest-to-lowest wealth quintile ratio decreasing from 2.41 in 2003 to 1.65 in 2014, but maternal health services remain highly inequitable. Effective coverage of key maternal and child health services remains low, indicating that individuals are not receiving the maximum possible health gain from existing health services. There is an urgent need to focus on the quality and reach of maternal and child health services in Kenya to achieve the goals of universal health coverage. © 2017 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.
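
    Schematically, effective coverage combines need, use and quality multiplicatively per intervention and then aggregates; the sketch below mirrors that framework only loosely and is not the paper's exact estimator.

      def effective_coverage(records):
          # records: iterable of (need, use, quality) per intervention,
          # each in [0, 1]; need weights the population in need, use is
          # crude coverage among those in need, quality scales the health
          # gain actually delivered.
          total_need = sum(need for need, _, _ in records)
          return sum(need * use * quality
                     for need, use, quality in records) / total_need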

  17. Improving Search Properties in Genetic Programming

    NASA Technical Reports Server (NTRS)

    Janikow, Cezary Z.; DeWeese, Scott

    1997-01-01

    With advancing computer processing capabilities, practical computer applications are mostly limited by the amount of human programming required to accomplish a specific task. This necessary human participation creates many problems, such as dramatically increased cost. To alleviate the problem, computers must become more autonomous. In other words, computers must be capable of programming/reprogramming themselves to adapt to changing environments/tasks/demands/domains. Evolutionary computation offers potential means, but it must be advanced beyond its current practical limitations. Evolutionary algorithms model nature. They maintain a population of structures representing potential solutions to the problem at hand. These structures undergo a simulated evolution by means of mutation, crossover, and a Darwinian selective pressure. Genetic programming (GP) is the most promising example of an evolutionary algorithm. In GP, the structures that evolve are trees, which is a dramatic departure from previously used representations such as strings in genetic algorithms. The space of potential trees is defined by means of their elements: functions, which label internal nodes, and terminals, which label leaves. By attaching semantic interpretation to those elements, trees can be interpreted as computer programs (given an interpreter), evolved architectures, etc. JSC has begun exploring GP as a potential tool for its long-term project on evolving dextrous robotic capabilities. Last year we identified representation redundancies as the primary source of inefficiency in GP. Subsequently, we proposed a method to use problem constraints to reduce those redundancies, effectively reducing GP complexity. This method was implemented afterwards at the University of Missouri. This summer, we evaluated the payoff from using problem constraints to reduce search complexity on two classes of problems: learning Boolean functions and solving the forward kinematics problem. We have also developed and implemented methods to use additional problem heuristics to fine-tune the searchable space, and to use typing information to further reduce the search space. Additional improvements have been proposed, but they are yet to be explored and implemented.
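
    The tree representation described above is compact to set up: internal nodes carry functions, leaves carry terminals, and evaluation is a recursive walk. A minimal sketch with a two-function, two-terminal primitive set (illustrative, far smaller than practical GP systems):

      import operator, random

      # Internal nodes are functions, leaves are terminals; a tree is a
      # nested tuple such as ('add', 'x', ('mul', 1.0, 'x')).
      FUNCS = {'add': (operator.add, 2), 'mul': (operator.mul, 2)}
      TERMS = ['x', 1.0]

      def random_tree(depth=3):
          if depth == 0 or random.random() < 0.3:
              return random.choice(TERMS)
          name = random.choice(list(FUNCS))
          _, arity = FUNCS[name]
          return (name,) + tuple(random_tree(depth - 1) for _ in range(arity))

      def evaluate(tree, x):
          # The 'interpreter' that gives the tree its semantics.
          if tree == 'x':
              return x
          if isinstance(tree, float):
              return tree
          fn, _ = FUNCS[tree[0]]
          return fn(*(evaluate(child, x) for child in tree[1:]))

      t = random_tree()
      print(t, evaluate(t, 2.0))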

  18. A Computational Model of the Ionic Currents, Ca2+ Dynamics and Action Potentials Underlying Contraction of Isolated Uterine Smooth Muscle

    PubMed Central

    Tong, Wing-Chiu; Choi, Cecilia Y.; Karche, Sanjay; Holden, Arun V.; Zhang, Henggui; Taggart, Michael J.

    2011-01-01

    Uterine contractions during labor are discretely regulated by rhythmic action potentials (AP) of varying duration and form that serve to determine calcium-dependent force production. We have employed a computational biology approach to develop a fuller understanding of the complexity of excitation-contraction (E-C) coupling of uterine smooth muscle cells (USMC). Our overall aim is to establish a mathematical platform of sufficient biophysical detail to quantitatively describe known uterine E-C coupling parameters and thereby inform future empirical investigations of physiological and pathophysiological mechanisms governing normal and dysfunctional labors. From published and unpublished data we construct mathematical models for fourteen ionic currents of USMCs: Ca2+ currents (L- and T-type), Na+ current, a hyperpolarization-activated current, three voltage-gated K+ currents, two Ca2+-activated K+ currents, a Ca2+-activated Cl- current, a non-specific cation current, the Na+-Ca2+ exchanger, the Na+-K+ pump and a background current. The magnitudes and kinetics of each current system in a spindle-shaped single cell with a specified surface area:volume ratio are described by differential equations in terms of maximal conductances, electrochemical gradients, voltage-dependent activation/inactivation gating variables and temporal changes in intracellular Ca2+ computed from known fluxes. These quantifications are validated by the reconstruction of the individual experimental ionic currents obtained under voltage-clamp. Phasic contraction is modeled in relation to the time constant of the changing intracellular Ca2+ concentration. This integrated model is validated by its reconstruction of the different USMC AP configurations (spikes, plateau and bursts of spikes), of the change from bursting to plateau-type AP produced by estradiol, and of simultaneous experimental recordings of spontaneous AP, intracellular Ca2+ and phasic force. In summary, our advanced mathematical model provides a powerful tool to investigate the physiological ionic mechanisms underlying the genesis of uterine electrical E-C coupling of labor and parturition. This will furnish the evolution of descriptive and predictive quantitative models of myometrial electrogenesis at the whole-cell and tissue levels. PMID:21559514
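
    The kind of differential equation used for each current can be sketched briefly. The following Python fragment uses hypothetical parameter values (not the paper's fitted ones) to show a single voltage-clamped current with one activation gate relaxing to its voltage-dependent steady state:

```python
import numpy as np

# One gated current: maximal conductance, driving force (V - E_rev) and a
# gating variable m obeying dm/dt = (m_inf(V) - m) / tau_m. All parameter
# values are illustrative, not the model's fitted constants.

def m_inf(V, V_half=-30.0, k=8.0):
    return 1.0 / (1.0 + np.exp(-(V - V_half) / k))

g_max, E_rev, tau_m = 0.05, 60.0, 5.0   # nS/pF, mV, ms (illustrative)
dt, m, V = 0.01, 0.0, -50.0             # ms step, gate, clamp potential (mV)

for step in range(int(100 / dt)):       # 100 ms under voltage clamp
    m += dt * (m_inf(V) - m) / tau_m    # gate relaxes toward steady state
    I = g_max * m * (V - E_rev)         # instantaneous current density

print(f"steady-state current at {V} mV: {I:.4f} pA/pF")
```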

  19. The role of dedicated data computing centers in the age of cloud computing

    NASA Astrophysics Data System (ADS)

    Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr

    2017-10-01

    Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.

  20. Robust electromagnetically guided endoscopic procedure using enhanced particle swarm optimization for multimodal information fusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Xiongbiao, E-mail: xluo@robarts.ca; Wan, Ying, E-mail: Ying.Wan@student.uts.edu.au; He, Xiangjian

    Purpose: Electromagnetically guided endoscopic procedures, which aim to localize the endoscope accurately and robustly, involve multimodal sensory information during interventions. However, it remains challenging to integrate these information sources for precise and stable endoscopic guidance. To tackle this challenge, this paper proposes a new framework on the basis of an enhanced particle swarm optimization method to effectively fuse these information sources for accurate and continuous endoscope localization. Methods: The authors use the particle swarm optimization method, which is a stochastic evolutionary computation algorithm, to effectively fuse the multimodal information, including preoperative information (i.e., computed tomography images) as a frame of reference, endoscopic camera videos, and positional sensor measurements (i.e., electromagnetic sensor outputs). Since the evolutionary computation method is limited by possible premature convergence and fixed evolutionary factors, the authors introduce the current (endoscopic camera and electromagnetic sensor) observation to boost the particle swarm optimization and also adaptively update the evolutionary parameters in accordance with spatial constraints and the current observation, resulting in advantageous performance of the enhanced algorithm. Results: The experimental results demonstrate that the proposed method provides a more accurate and robust endoscopic guidance framework than state-of-the-art methods. The average guidance accuracy of the authors' framework was about 3.0 mm and 5.6°, while the previous methods show at least 3.9 mm and 7.0°. The average position and orientation smoothness of their method was 1.0 mm and 1.6°, significantly better than the other methods (at least 2.0 mm and 2.6°). Additionally, the average visual quality of the endoscopic guidance was improved to 0.29. Conclusions: A robust electromagnetically guided endoscopy framework was proposed on the basis of an enhanced particle swarm optimization method using the current observation information and adaptive evolutionary factors. The proposed framework greatly reduced the guidance errors from (4.3 mm, 7.8°) to (3.0 mm, 5.6°) compared to state-of-the-art methods.
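
    For readers unfamiliar with the underlying optimizer, here is a minimal sketch of standard particle swarm optimization; a toy objective stands in for the multimodal fusion cost, and the paper's observation-boosted updates and adaptive factors are not shown:

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):                       # toy objective standing in for the
    return np.sum(x ** 2, axis=-1)   # fusion cost to be minimized

n, dim, w, c1, c2 = 30, 6, 0.7, 1.5, 1.5
x = rng.uniform(-5, 5, (n, dim))     # particle positions
v = np.zeros_like(x)                 # particle velocities
pbest, pbest_f = x.copy(), sphere(x) # personal bests

for it in range(200):
    g = pbest[np.argmin(pbest_f)]                       # global best
    r1, r2 = rng.random((2, n, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
    x = x + v
    f = sphere(x)
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]

print("best cost:", pbest_f.min())
```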

  1. New Genetics

    MedlinePlus

    ... The New Genetics is a science education ... the basics of DNA and its molecular cousin RNA, and new directions in genetic research. ...

  2. An Evolutionary Examination of Telemedicine: A Health and Computer-Mediated Communication Perspective

    PubMed Central

    Breen, Gerald-Mark; Matusitz, Jonathan

    2009-01-01

    Telemedicine, the use of advanced communication technologies in the healthcare context, has a rich history and a clear evolutionary course. In this paper, the authors identify telemedicine as operationally defined, the services and technologies it comprises, and the direction telemedicine has taken, along with its increased acceptance in the healthcare communities. The authors also describe some of the key pitfalls researchers and activists have contended with in advancing telemedicine to its full potential and in clearing the way for technicians to identify telemedicine's diverse utilities. A discussion and future directions section is included to provide fresh ideas to health communication and computer-mediated scholars wishing to delve into this area and make a difference in enhancing public understanding of this field. PMID:20300559

  3. The Chomsky—Place correspondence 1993–1994

    PubMed Central

    Chomsky, Noam; Place, Ullin T.

    2000-01-01

    Edited correspondence between Ullin T. Place and Noam Chomsky, which occurred in 1993–1994, is presented. The principal topics are (a) deep versus surface structure; (b) computer modeling of the brain; (c) the evolutionary origins of language; (d) behaviorism; and (e) a dispositional account of language. This correspondence includes Chomsky's denial that he ever characterized deep structure as innate; Chomsky's critique of computer modeling (both traditional and connectionist) of the brain; Place's critique of Chomsky's alleged failure to provide an adequate account of the evolutionary origins of language, and Chomsky's response that such accounts are “pop-Darwinian fairy tales”; and Place's arguments for, and Chomsky's against, the relevance of behaviorism to linguistic theory, especially the relevance of a behavioral approach to language that is buttressed by a dispositional account of sentence construction. PMID:22477211

  5. The development of the red giant branch. I - Theoretical evolutionary sequences

    NASA Technical Reports Server (NTRS)

    Sweigart, Allen V.; Greggio, Laura; Renzini, Alvio

    1989-01-01

    A grid of 100 evolutionary sequences extending from the zero-age main sequence to the onset of helium burning has been computed for stellar masses between 1.4 and 3.4 solar masses, helium abundances of 0.20 and 0.30, and heavy-element abundances of 0.004, 0.01, and 0.04. Using these computations, the transition in the morphology of the red giant branch (RGB) between low-mass stars, which have an extended and luminous first RGB phase prior to helium ignition, and intermediate-mass stars, which do not, is investigated. Extensive tabulations of the numerical results are provided to aid in applying these sequences. The effects of the first dredge-up on the surface helium and CNO abundances of the sequences are discussed.

  6. Toward a method for tracking virus evolutionary trajectory applied to the pandemic H1N1 2009 influenza virus.

    PubMed

    Squires, R Burke; Pickett, Brett E; Das, Sajal; Scheuermann, Richard H

    2014-12-01

    In 2009 a novel pandemic H1N1 influenza virus (H1N1pdm09) emerged as the first official influenza pandemic of the 21st century. Early genomic sequence analysis pointed to the swine origin of the virus. Here we report a novel computational approach to determine the evolutionary trajectory of viral sequences that uses data-driven estimations of nucleotide substitution rates to track the gradual accumulation of observed sequence alterations over time. Phylogenetic analysis and multiple sequence alignments show that sequences belonging to the resulting evolutionary trajectory of the H1N1pdm09 lineage exhibit a gradual accumulation of sequence variations and tight temporal correlations in the topological structure of the phylogenetic trees. These results suggest that our evolutionary trajectory analysis (ETA) can more effectively pinpoint the evolutionary history of viruses, including the host and geographical location traversed by each segment, when compared against either BLAST or traditional phylogenetic analysis alone.

  7. An Orthogonal Evolutionary Algorithm With Learning Automata for Multiobjective Optimization.

    PubMed

    Dai, Cai; Wang, Yuping; Ye, Miao; Xue, Xingsi; Liu, Hailin

    2016-12-01

    Research on multiobjective optimization problems has become one of the hottest topics in intelligent computation. In order to improve the search efficiency of an evolutionary algorithm and maintain the diversity of solutions, in this paper learning automata (LA) are first used for quantization orthogonal crossover (QOX), and a new fitness function based on decomposition is proposed to achieve these two purposes. Based on these, an orthogonal evolutionary algorithm with LA for complex multiobjective optimization problems with continuous variables is proposed. The experimental results show that in continuous states the proposed algorithm is able to achieve accurate Pareto-optimal sets and wide Pareto-optimal fronts efficiently. Moreover, a comparison on 15 multiobjective benchmark problems with several existing well-known algorithms (nondominated sorting genetic algorithm II, decomposition-based multiobjective evolutionary algorithm, decomposition-based multiobjective evolutionary algorithm with an ensemble of neighborhood sizes, multiobjective optimization by LA, and multiobjective immune algorithm with nondominated neighbor-based selection) shows that the proposed algorithm finds more accurate and more evenly distributed Pareto-optimal fronts than the compared ones.
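
    A decomposition-based fitness of the general kind mentioned above can be sketched in a few lines. The Tchebycheff scalarization below is a standard choice in decomposition-based multiobjective algorithms and is not claimed to be the paper's exact fitness function:

```python
import numpy as np

# A weight vector turns an objective vector into a scalar, so different
# weights steer the search toward different parts of the Pareto front.

def tchebycheff(f, weights, z_star):
    """Scalarize objective vector f for one weight vector (minimize)."""
    return np.max(weights * np.abs(f - z_star))

f = np.array([0.4, 0.7])        # objectives of one candidate solution
z_star = np.array([0.0, 0.0])   # ideal point seen so far
for w in [np.array([0.9, 0.1]), np.array([0.5, 0.5]), np.array([0.1, 0.9])]:
    print(w, tchebycheff(f, w, z_star))
```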

  8. Development of an Evolutionary Algorithm for the ab Initio Discovery of Two-Dimensional Materials

    NASA Astrophysics Data System (ADS)

    Revard, Benjamin Charles

    Crystal structure prediction is an important first step on the path toward computational materials design. Increasingly robust methods have become available in recent years for computing many materials properties, but because properties are largely a function of crystal structure, the structure must be known before these methods can be brought to bear. In addition, structure prediction is particularly useful for identifying low-energy structures of subperiodic materials, such as two-dimensional (2D) materials, which may adopt unexpected structures that differ from those of the corresponding bulk phases. Evolutionary algorithms, which are heuristics for global optimization inspired by biological evolution, have proven to be a fruitful approach for tackling the problem of crystal structure prediction. This thesis describes the development of an improved evolutionary algorithm for structure prediction and several applications of the algorithm to predict the structures of novel low-energy 2D materials. The first part of this thesis contains an overview of evolutionary algorithms for crystal structure prediction and presents our implementation, including details of extending the algorithm to search for clusters, wires, and 2D materials, improvements to efficiency when running in parallel, improved composition space sampling, and the ability to search for partial phase diagrams. We then present several applications of the evolutionary algorithm to 2D systems, including InP, the C-Si and Sn-S phase diagrams, and several group-IV dioxides. This thesis makes use of the Cornell graduate school's "papers" option. Chapters 1 and 3 correspond to the first-author publications of Refs. [131] and [132], respectively, and chapter 2 will soon be submitted as a first-author publication. The material in chapter 4 is taken from Ref. [144], in which I share joint first-authorship. In this case I have included only my own contributions.
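
    The evolutionary-algorithm core of such a structure search can be sketched generically. The toy below substitutes an analytic function for the ab initio energy evaluation and a real crystal encoding, so it shows only the (mu + lambda) selection loop, not the thesis's algorithm:

```python
import random

random.seed(0)
MU, LAM, DIM = 10, 20, 5

def energy(s):                        # toy objective to minimize; a real
    return sum((x - 0.5) ** 2 for x in s)  # search calls an energy code

def mutate(s):                        # perturb one degree of freedom
    i = random.randrange(DIM)
    child = list(s)
    child[i] += random.gauss(0, 0.1)
    return child

pop = [[random.random() for _ in range(DIM)] for _ in range(MU)]
for gen in range(100):
    offspring = [mutate(random.choice(pop)) for _ in range(LAM)]
    pop = sorted(pop + offspring, key=energy)[:MU]   # survivor selection

print("best energy:", round(energy(pop[0]), 6))
```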

  9. Upon Accounting for the Impact of Isoenzyme Loss, Gene Deletion Costs Anticorrelate with Their Evolutionary Rates.

    PubMed

    Jacobs, Christopher; Lambourne, Luke; Xia, Yu; Segrè, Daniel

    2017-01-01

    System-level metabolic network models enable the computation of growth and metabolic phenotypes from an organism's genome. In particular, flux balance approaches have been used to estimate the contribution of individual metabolic genes to organismal fitness, offering the opportunity to test whether such contributions carry information about the evolutionary pressure on the corresponding genes. Previous failure to identify the expected negative correlation between such computed gene-loss cost and sequence-derived evolutionary rates in Saccharomyces cerevisiae has been ascribed to a real biological gap between a gene's fitness contribution to an organism "here and now" and the same gene's historical importance as evidenced by its accumulated mutations over millions of years of evolution. Here we show that this negative correlation does exist, and can be exposed by revisiting a broadly employed assumption of flux balance models. In particular, we introduce a new metric that we call "function-loss cost", which estimates the cost of a gene loss event as the total potential functional impairment caused by that loss. This new metric displays significant negative correlation with evolutionary rate, across several thousand minimal environments. We demonstrate that the improvement gained using function-loss cost over gene-loss cost is explained by replacing the base assumption that isoenzymes provide unlimited capacity for backup with the assumption that isoenzymes are completely non-redundant. We further show that this change of the assumption regarding isoenzymes increases the recall of epistatic interactions predicted by the flux balance model at the cost of a reduction in the precision of the predictions. In addition to suggesting that the gene-to-reaction mapping in genome-scale flux balance models should be used with caution, our analysis provides new evidence that evolutionary gene importance captures much more than strict essentiality.
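
    The flux balance computation behind a gene-loss cost can be illustrated on a toy network (hypothetical stoichiometry; genome-scale models and the paper's function-loss metric are not reproduced): maximize the "biomass" flux subject to steady state S v = 0 and flux bounds, then clamp one reaction to zero to simulate a deletion.

```python
import numpy as np
from scipy.optimize import linprog

S = np.array([[ 1, -1,  0],    # metabolite A: uptake -> conversion
              [ 0,  1, -1]])   # metabolite B: conversion -> biomass
bounds = [(0, 10), (0, 10), (0, 10)]
c = np.array([0, 0, -1.0])     # minimize -v_biomass, i.e. maximize biomass

wild = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")

ko_bounds = list(bounds)
ko_bounds[1] = (0, 0)          # "delete" the gene catalyzing reaction 2
ko = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=ko_bounds, method="highs")

print("gene-loss cost:", -wild.fun - (-ko.fun))  # drop in optimal biomass
```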

  10. A Penalized Likelihood Framework For High-Dimensional Phylogenetic Comparative Methods And An Application To New-World Monkeys Brain Evolution.

    PubMed

    Clavel, Julien; Aristide, Leandro; Morlon, Hélène

    2018-06-19

    Working with high-dimensional phylogenetic comparative datasets is challenging because likelihood-based multivariate methods suffer from low statistical performance as the number of traits p approaches the number of species n, and because computational complications occur when p exceeds n. Alternative phylogenetic comparative methods have recently been proposed to deal with the large-p, small-n scenario, but their use and performance are limited. Here we develop a penalized likelihood framework to deal with high-dimensional comparative datasets. We propose various penalizations and methods for selecting the intensity of the penalties. We apply this general framework to the estimation of parameters (the evolutionary trait covariance matrix and parameters of the evolutionary model) and to model comparison for the high-dimensional multivariate Brownian (BM), Early-burst (EB), Ornstein-Uhlenbeck (OU) and Pagel's lambda models. We show using simulations that our penalized likelihood approach dramatically improves the estimation of evolutionary trait covariance matrices and model parameters when p approaches n, and allows for their accurate estimation when p equals or exceeds n. In addition, we show that penalized likelihood models can be efficiently compared using the Generalized Information Criterion (GIC). We implement these methods, as well as the related estimation of ancestral states and the computation of phylogenetic PCA, in the R packages RPANDA and mvMORPH. Finally, we illustrate the utility of the proposed framework by evaluating evolutionary model fit, analyzing integration patterns, and reconstructing evolutionary trajectories for a high-dimensional 3-D dataset of brain shape in New World monkeys. We find clear support for an Early-burst model, suggesting an early diversification of brain morphology during the ecological radiation of the clade. Penalized likelihood offers an efficient way to deal with high-dimensional multivariate comparative data.
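
    The benefit of penalizing a covariance estimate when p exceeds n can be sketched with a generic linear shrinkage toward a diagonal target. This illustrates the idea only; it is not the paper's specific penalties or its GIC-based tuning:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 20, 50                       # fewer species than traits: p > n
X = rng.normal(size=(n, p))

S = np.cov(X, rowvar=False)         # sample covariance; singular when p > n
gamma = 0.3                         # penalty intensity (would be tuned)
S_pen = (1 - gamma) * S + gamma * np.diag(np.diag(S))  # shrink to diagonal

print("rank of S:", np.linalg.matrix_rank(S),
      "-> penalized estimate positive definite:",
      bool(np.all(np.linalg.eigvalsh(S_pen) > 0)))
```

    Because the diagonal target is positive definite, any nonzero penalty makes the blended estimate invertible, which is what restores a usable likelihood in the p >= n regime.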

  11. Energy and time determine scaling in biological and computer designs

    PubMed Central

    Bezerra, George; Edwards, Benjamin; Brown, James; Forrest, Stephanie

    2016-01-01

    Metabolic rate in animals and power consumption in computers are analogous quantities that scale similarly with size. We analyse vascular systems of mammals and on-chip networks of microprocessors, where natural selection and human engineering, respectively, have produced systems that minimize both energy dissipation and delivery times. Using a simple network model that simultaneously minimizes energy and time, our analysis explains empirically observed trends in the scaling of metabolic rate in mammals and power consumption and performance in microprocessors across several orders of magnitude in size. Just as the evolutionary transitions from unicellular to multicellular animals in biology are associated with shifts in metabolic scaling, our model suggests that the scaling of power and performance will change as computer designs transition to decentralized multi-core and distributed cyber-physical systems. More generally, a single energy–time minimization principle may govern the design of many complex systems that process energy, materials and information. This article is part of the themed issue ‘The major synthetic evolutionary transitions’. PMID:27431524

  12. Nemo: an evolutionary and population genetics programming framework.

    PubMed

    Guillaume, Frédéric; Rougemont, Jacques

    2006-10-15

    Nemo is an individual-based, genetically explicit and stochastic simulation program for population genetics and for the study of life-history trait evolution in a metapopulation context. It comes as both a C++ programming framework and an executable program file. Its object-oriented design gives it the flexibility and extensibility needed to implement a large variety of forward-time evolutionary models. It provides developers with abstract models allowing them to implement their own life-history traits and life-cycle events. Nemo offers a large panel of population models, from the Island model to lattice models with demographic or environmental stochasticity, and a variety of already implemented traits (deleterious mutations, neutral markers and more), life-cycle events (mating, dispersal, aging, selection, etc.) and output operators for saving data and statistics. It runs on all major computer platforms, including parallel computing environments. The source code, binaries and documentation are available under the GNU General Public License at http://nemo2.sourceforge.net.
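
    As a flavor of the forward-time, stochastic style of simulation Nemo implements (this sketch is not Nemo itself), a single biallelic locus drifting in a finite Wright-Fisher population can be simulated in a few lines:

```python
import numpy as np

rng = np.random.default_rng(42)
N, p, generations = 200, 0.5, 500   # population size, allele frequency

trajectory = [p]
for _ in range(generations):
    k = rng.binomial(2 * N, p)      # offspring alleles drawn from parents
    p = k / (2 * N)
    trajectory.append(p)
    if p in (0.0, 1.0):             # fixation or loss ends the dynamics
        break

print(f"final frequency after {len(trajectory) - 1} generations: {p}")
```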

  13. Energy and time determine scaling in biological and computer designs.

    PubMed

    Moses, Melanie; Bezerra, George; Edwards, Benjamin; Brown, James; Forrest, Stephanie

    2016-08-19

    Metabolic rate in animals and power consumption in computers are analogous quantities that scale similarly with size. We analyse vascular systems of mammals and on-chip networks of microprocessors, where natural selection and human engineering, respectively, have produced systems that minimize both energy dissipation and delivery times. Using a simple network model that simultaneously minimizes energy and time, our analysis explains empirically observed trends in the scaling of metabolic rate in mammals and power consumption and performance in microprocessors across several orders of magnitude in size. Just as the evolutionary transitions from unicellular to multicellular animals in biology are associated with shifts in metabolic scaling, our model suggests that the scaling of power and performance will change as computer designs transition to decentralized multi-core and distributed cyber-physical systems. More generally, a single energy-time minimization principle may govern the design of many complex systems that process energy, materials and information. This article is part of the themed issue 'The major synthetic evolutionary transitions'.

  14. On the interconnection of stable protein complexes: inter-complex hubs and their conservation in Saccharomyces cerevisiae and Homo sapiens networks.

    PubMed

    Guerra, Concettina

    2015-01-01

    Protein complexes are key molecular entities that perform a variety of essential cellular functions. The connectivity of proteins within a complex has been widely investigated with both experimental and computational techniques. We developed a computational approach to identify and characterise proteins that play a role in interconnecting complexes. We computed a measure of inter-complex centrality, the crossroad index, based on disjoint paths connecting proteins in distinct complexes, and identified inter-complex hubs as proteins with a high value of the crossroad index. We applied the approach to a set of stable complexes in Saccharomyces cerevisiae and in Homo sapiens. Just as has been done for hubs, we evaluated the topological and biological properties of inter-complex hubs, addressing the following questions: Do inter-complex hubs tend to be evolutionarily conserved? What is the relation between crossroad index and essentiality? We found a good correlation between inter-complex hubs and both evolutionary conservation and essentiality.
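
    The core graph computation can be illustrated with NetworkX on a toy graph; the simplified count below is not the paper's exact crossroad index definition, only the node-disjoint-path primitive it builds on:

```python
import networkx as nx

# Toy interaction graph: two small "complexes" joined through a hub.
G = nx.Graph()
G.add_edges_from([("a1", "a2"), ("a2", "hub"), ("a1", "hub"),   # complex A
                  ("hub", "b1"), ("b1", "b2")])                 # complex B

# Node-disjoint paths between proteins in distinct complexes; proteins
# appearing on many such paths would score a high inter-complex centrality.
paths = list(nx.node_disjoint_paths(G, "a2", "b2"))
print(f"{len(paths)} node-disjoint path(s) from a2 to b2:", paths)
```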

  15. Comparative modeling of coevolution in communities of unicellular organisms: adaptability and biodiversity.

    PubMed

    Lashin, Sergey A; Suslov, Valentin V; Matushkin, Yuri G

    2010-06-01

    We propose an original program, "Evolutionary constructor", that is capable of computationally efficient modeling of both population-genetic and ecological problems, combining these directions in one model of the required level of detail. We also present results of comparative modeling of stability, adaptability and biodiversity dynamics in populations of unicellular haploid organisms which form symbiotic ecosystems. The advantages and disadvantages of two evolutionary strategies of biota formation (formation based on a few generalist taxa versus biodiversity-based formation) are discussed.

  16. Rapid, Value-based, Evolutionary Acquisition and Its Application to a USMC Tactical Service Oriented Architecture

    DTIC Science & Technology

    2009-06-01

    [Excerpt from report front matter and text.] Acronyms defined include C2PC (Command and Control Personal Computer), CAS (Close Air Support), CCA (Clinger-Cohen Act), CDR (Critical Design Review) and CJCSI (Chairman of the Joint Chiefs of Staff Instruction). The central research question is: can the USMC apply the essential principles of rapid, value-based, evolutionary acquisition to the development and procurement of a TSOA?

  17. Laboratory evolution of protein conformational dynamics.

    PubMed

    Campbell, Eleanor C; Correy, Galen J; Mabbitt, Peter D; Buckle, Ashley M; Tokuriki, Nobuhiko; Jackson, Colin J

    2017-11-08

    This review focuses on recent work that has begun to establish specific functional roles for protein conformational dynamics, specifically how the conformational landscapes that proteins can sample can evolve under laboratory-based evolutionary selection. We discuss recent technical advances in computational and biophysical chemistry, which have provided us with new ways to dissect evolutionary processes. Finally, we offer some perspectives on the emerging view of conformational dynamics and evolution, and the challenges that we face in rationally engineering conformational dynamics.

  18. Reducing intrusive traumatic memories after emergency caesarean section: A proof-of-principle randomized controlled study.

    PubMed

    Horsch, Antje; Vial, Yvan; Favrod, Céline; Harari, Mathilde Morisod; Blackwell, Simon E; Watson, Peter; Iyadurai, Lalitha; Bonsall, Michael B; Holmes, Emily A

    2017-07-01

    Preventative psychological interventions to aid women after traumatic childbirth are needed. This proof-of-principle randomized controlled study evaluated whether the number of intrusive traumatic memories mothers experience after emergency caesarean section (ECS) could be reduced by a brief cognitive intervention. 56 women after ECS were randomized to one of two parallel groups in a 1:1 ratio: intervention (usual care plus cognitive task procedure) or control (usual care). The intervention group engaged in a visuospatial task (computer-game 'Tetris' via a handheld gaming device) for 15 min within six hours following their ECS. The primary outcome was the number of intrusive traumatic memories related to the ECS recorded in a diary for the week post-ECS. As predicted, compared with controls, the intervention group reported fewer intrusive traumatic memories (M = 4.77, SD = 10.71 vs. M = 9.22, SD = 10.69, d = 0.647 [95% CI: 0.106, 1.182]) over 1 week (intention-to-treat analyses, primary outcome). There was a trend towards reduced acute stress re-experiencing symptoms (d = 0.503 [95% CI: -0.032, 1.033]) after 1 week (intention-to-treat analyses). Time series analysis of daily intrusion data confirmed the predicted difference between groups. 72% of women rated the intervention "rather" to "extremely" acceptable. This represents a first step in the development of an early (and potentially universal) intervention to prevent postnatal posttraumatic stress symptoms that may benefit both mother and child. ClinicalTrials.gov, www.clinicaltrials.gov, NCT02502513.

  19. Using eddy covariance to measure the dependence of air-sea CO2 exchange rate on friction velocity

    NASA Astrophysics Data System (ADS)

    Landwehr, Sebastian; Miller, Scott D.; Smith, Murray J.; Bell, Thomas G.; Saltzman, Eric S.; Ward, Brian

    2018-03-01

    Parameterisation of the air-sea gas transfer velocity of CO2 and other trace gases under open-ocean conditions has been a focus of air-sea interaction research and is required for accurately determining ocean carbon uptake. Ships are the most widely used platform for air-sea flux measurements, but the quality of the data can be compromised by airflow distortion and sensor cross-sensitivity effects. Recent improvements in the understanding of these effects have led to enhanced corrections to shipboard eddy covariance (EC) measurements. Here, we present a revised analysis of eddy covariance measurements of air-sea CO2 and momentum fluxes from the Southern Ocean Surface Ocean Aerosol Production (SOAP) study. We show that it is possible to significantly reduce the scatter in the EC data and achieve consistency between measurements taken on station and with the ship underway. The gas transfer velocities from the EC measurements correlate better with the EC friction velocity (u*) than with mean wind speeds derived from shipboard measurements corrected with an airflow distortion model. For the observed range of wind speeds (u10N = 3-23 m s-1), the transfer velocities can be parameterised with a linear fit to u*. The SOAP data are compared to previous gas transfer parameterisations using u10N computed from the EC friction velocity with the drag coefficient from the Coupled Ocean-Atmosphere Response Experiment (COARE) model version 3.5. The SOAP results are consistent with previous gas transfer studies, but at high wind speeds they do not support the sharp increase in gas transfer associated with bubble-mediated transfer predicted by physically based models.
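
    The final parameterisation step amounts to a simple regression, sketched here with synthetic numbers (not the SOAP data) and an illustrative slope:

```python
import numpy as np

rng = np.random.default_rng(3)
u_star = rng.uniform(0.1, 0.8, 50)         # friction velocity, m s-1
k = 100 * u_star + rng.normal(0, 5, 50)    # transfer velocity, cm h-1 (toy)

slope, intercept = np.polyfit(u_star, k, deg=1)   # linear fit k = a*u* + b
print(f"k ~ {slope:.1f} u* + {intercept:.1f} (cm h-1)")
```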

  20. A computational model of pattern separation efficiency in the dentate gyrus with implications in schizophrenia

    PubMed Central

    Faghihi, Faramarz; Moustafa, Ahmed A.

    2015-01-01

    Information processing in the hippocampus begins by transferring spiking activity of the entorhinal cortex (EC) into the dentate gyrus (DG). Activity patterns in the EC are separated by the DG, which thereby plays an important role in hippocampal functions, including memory. The structural and physiological parameters of these neural networks enable the hippocampus to be efficient in encoding the large number of inputs that animals receive and process in their lifetime. The neural encoding capacity of the DG depends on the encoding efficiency of its single neurons and on its pattern separation efficiency. In this study, encoding by the DG is modeled such that single-neuron and pattern separation efficiency are measured in simulations across different parameter values. For this purpose, a probabilistic model of single-neuron efficiency is presented to study the role of structural and physiological parameters. The known numbers of neurons in the EC and the DG are used to construct a neural network based on the electrophysiological features of DG granule cells. Separated inputs, as activated neurons in the EC with different firing probabilities, are presented to the DG. For different connectivity rates between the EC and DG, the pattern separation efficiency of the DG is measured. The results show that, in the absence of feedback inhibition on the DG neurons, the DG demonstrates low separation efficiency and high firing frequency. Feedback inhibition can increase separation efficiency while resulting in very low single-neuron encoding efficiency in the DG and very low firing frequency of DG neurons (sparse spiking). This work presents a mechanistic explanation for experimental observations in the hippocampus, in combination with theoretical measures. Moreover, the model predicts a critical role for impaired inhibitory neurons in schizophrenia, where deficiency in pattern separation of the DG has been observed. PMID:25859189
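
    One simple way to quantify pattern separation is to compare input and output overlaps, as in the sketch below. The winner-take-all sparsification is a crude stand-in for the model's feedback inhibition, all sizes are illustrative, and with a sparse output the overlap typically drops:

```python
import numpy as np

rng = np.random.default_rng(7)

def sparsify(pattern, w, k):
    """Keep only the k most strongly driven output units (winner-take-all)."""
    drive = w @ pattern
    out = np.zeros_like(drive)
    out[np.argsort(drive)[-k:]] = 1.0
    return out

n_ec, n_dg, k = 100, 500, 10                # DG expansion plus sparse firing
w = rng.random((n_dg, n_ec))                # random EC -> DG connectivity
a = (rng.random(n_ec) < 0.2).astype(float)  # one EC pattern
b = a.copy()
flip = rng.choice(n_ec, 10, replace=False)  # a similar, partly altered pattern
b[flip] = 1 - b[flip]

corr_in = np.corrcoef(a, b)[0, 1]
corr_out = np.corrcoef(sparsify(a, w, k), sparsify(b, w, k))[0, 1]
print(f"input overlap {corr_in:.2f} -> output overlap {corr_out:.2f}")
```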

  1. Human vs. Computer Diagnosis of Students' Natural Selection Knowledge: Testing the Efficacy of Text Analytic Software

    ERIC Educational Resources Information Center

    Nehm, Ross H.; Haertig, Hendrik

    2012-01-01

    Our study examines the efficacy of Computer Assisted Scoring (CAS) of open-response text relative to expert human scoring within the complex domain of evolutionary biology. Specifically, we explored whether CAS can diagnose the explanatory elements (or Key Concepts) that comprise undergraduate students' explanatory models of natural selection with…

  2. New double-byte error-correcting codes for memory systems

    NASA Technical Reports Server (NTRS)

    Feng, Gui-Liang; Wu, Xinen; Rao, T. R. N.

    1996-01-01

    Error-correcting or error-detecting codes have been used in the computer industry to increase reliability, reduce service costs, and maintain data integrity. The single-byte error-correcting and double-byte error-detecting (SbEC-DbED) codes have been successfully used in computer memory subsystems. There are many methods to construct double-byte error-correcting (DBEC) codes. In the present paper we construct a class of double-byte error-correcting codes, which are more efficient than those known to be optimum, and a decoding procedure for our codes is also considered.

  3. Hybrid evolutionary computing model for mobile agents of wireless Internet multimedia

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    2001-03-01

    The ecosystem is used as an evolutionary paradigm of natural laws for distributed information retrieval via mobile agents, allowing computational load to be shifted to server nodes of wireless networks while reducing the traffic on communication links. Based on the Food Web model, a set of computational rules of natural balance forms the outer stage to control the evolution of mobile agents providing multimedia services with a wireless Internet protocol (WIP). The evolutionary model shows how mobile agents should behave with the WIP, in particular, how mobile agents can cooperate, compete and learn from each other, based on an underlying competition for radio network resources to establish the wireless connections to support the quality of service (QoS) of user requests. Mobile agents are also allowed to clone themselves, propagate and communicate with other agents. A two-layer model is proposed for agent evolution: the outer layer is based on the law of natural balancing, while the inner layer is based on a discrete version of a Kohonen self-organizing feature map (SOFM) that distributes network resources to meet QoS requirements. The former is embedded in the higher OSI layers of the WIP, while the latter is used in the resource management procedures of Layers 2 and 3 of the protocol. Algorithms for the distributed computation of mobile agent evolutionary behavior are developed by adding a learning state to the agent evolution state diagram. When an agent is in an indeterminate state, it can communicate with other agents, and computing models can be replicated from other agents. The agent then transitions to the mutating state to wait for a new information-retrieval goal. When a wireless terminal or station lacks a network resource, an agent in the suspending state can change its policy to submit to the environment before it transitions to the searching state. The agents learn from agent state information entered into an external database. In the cloning process, two agents on a host station sharing a common goal can be merged or married to compose a new agent. The two-layer set of algorithms for mobile agent evolution, performed in a distributed processing environment, is applied to the QoS management functions of the IP multimedia (IM) sub-network of the third-generation (3G) Wideband Code-Division Multiple Access (W-CDMA) wireless network.
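
    The inner-layer SOFM can be sketched in its standard discrete form; the toy below uses a 1-D map on two-dimensional inputs and does not model the W-CDMA resource-management specifics:

```python
import numpy as np

# Kohonen self-organizing feature map: prototypes compete for each input,
# and the winner and its map neighbours move toward the input.

rng = np.random.default_rng(0)
n_nodes, dim = 10, 2
weights = rng.random((n_nodes, dim))        # 1-D map of prototype vectors

for t, x in enumerate(rng.random((500, dim))):
    lr = 0.5 * (1 - t / 500)                # decaying learning rate
    winner = np.argmin(np.linalg.norm(weights - x, axis=1))
    for j in range(n_nodes):                # Gaussian neighbourhood on the map
        h = np.exp(-((j - winner) ** 2) / 2.0)
        weights[j] += lr * h * (x - weights[j])

print("ordered prototypes:\n", np.round(weights, 2))
```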

  4. The Reliability of Encounter Cards to Assess the CanMEDs Roles

    ERIC Educational Resources Information Center

    Sherbino, Jonathan; Kulasegaram, Kulamakan; Worster, Andrew; Norman, Geoffrey R.

    2013-01-01

    The purpose of this study was to determine the reliability of a computer-based encounter card (EC) to assess medical students during an emergency medicine rotation. From April 2011 to March 2012, multiple physicians assessed an entire medical school class during their emergency medicine rotation using the CanMEDS framework. At the end of an…

  5. Computer-Aided Engineering for Electric-Drive Vehicle Batteries (CAEBAT)

    Science.gov Websites

    Project partners include Battery Design LLC, CD-adapco, EC Power, ESim, Ford, General Motors (GM) and Johnson Controls, Inc. Cited work includes R. Spotnitz, "Design and Simulation of Spirally-Wound, Lithium-Ion Cells" (April 2013) and "Effect of Tab Design on Large-Format Li-ion Cell Performance," Journal of Power Sources 257, 70-79.

  6. 26 CFR 1.927(d)-1 - Other definitions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... that is unpaid on the day after the end of the normal payment period, over (B) the present value, as of... rate for the present value computation is simple interest at the short-term monthly Federal rate... period. The present value of a payment is calculated by the formula given in the regulation (graphic EC14NO91.143), where P = present value of a...

  7. Multimedia Training Systems for the Elderly and the Impaired.

    ERIC Educational Resources Information Center

    Brown, I. C.; And Others

    Application of Computer-based systems to Training in Information Technology (ACT-IT), a project by a consortium of partners in the United Kingdom and the Irish Republic, is part of the TIDE program, an initiative of the European Community (EC) to make information technology more accessible to disabled and elderly people. This paper outlines the…

  8. Implementation of an Ada real-time executive: A case study

    NASA Technical Reports Server (NTRS)

    Laird, James D.; Burton, Bruce A.; Koppes, Mary R.

    1986-01-01

    Current Ada language implementations and runtime environments are immature and unproven, and they are a key risk area for real-time embedded computer systems (ECS). A test-case environment is provided in which the concerns of the real-time ECS community are addressed. A priority-driven executive is selected to be implemented in the Ada programming language. The model selected is representative of real-time executives tailored for embedded systems used in missile, spacecraft, and avionics applications. An Ada-based design methodology is utilized, and two designs are considered. The first of these designs requires the use of vendor-supplied runtime and tasking support. An alternative high-level design is also considered for an implementation requiring no vendor-supplied runtime or tasking support. The former approach is carried through to implementation.

  9. Simulation-based performance analysis of EC-Earth 3.2.0 using Dimemas

    NASA Astrophysics Data System (ADS)

    Yepes Arbós, Xavier; César Acosta Cobos, Mario; Serradell Maronda, Kim; Sanchez Lorente, Alicia; Doblas Reyes, Francisco Javier

    2017-04-01

    Earth System Models (ESMs) are complex applications executed in supercomputing facilities because of their high demand on computing resources. However, not all of these models use resources well, and their energy efficiency can be well below an acceptable minimum. One example is EC-Earth, a global coupled climate model which integrates different component models to simulate the Earth system. The two main components used in this analysis are IFS as the atmospheric model and NEMO as the ocean model, both coupled via the OASIS3-MCT coupler. Preliminary results proved that EC-Earth does not achieve good computational performance. For example, the scalability of this model using the T255L91 grid with 512 MPI processes for IFS and the ORCA1L75 grid with 128 MPI processes for NEMO achieves a speedup of 40.3. This means that 81.2% of the resources are wasted. Therefore, a performance analysis is necessary to find the bottlenecks of the model and thus determine the most appropriate optimization techniques. Traces of the model collected with profiling tools such as Extrae, Paraver and Dimemas allow us to simulate the model behaviour on a configurable parallel platform and extrapolate the impact of hardware changes on the performance of EC-Earth. In this document we propose a state-of-the-art procedure which makes it possible to evaluate the different characteristics of climate models in a very efficient way. Accordingly, the performance of EC-Earth in different scenarios, namely assuming an ideal machine, model sensitivity, and the model limited by coupling, has been shown. By simulating these scenarios, we found that each model has different characteristics. With the ideal machine, we have seen that there are several sources of inefficiency: about 20.59% of the execution time is spent in communication, and there are workload imbalances produced by data dependencies both between IFS and NEMO and within each model. In addition, in the model sensitivity simulations, we have characterized the types of messages and detected data dependencies. In IFS, we observed that latency affects the coupling between models because of a large number of small communications, whereas bandwidth affects another region of the code with a few big messages. In NEMO, results show that the simulated latencies and bandwidths only slightly affect its execution time; however, it resolves data dependencies inefficiently and suffers workload imbalances. The last simulation, performed to detect the model slowed most by coupling, revealed that IFS is slower than NEMO. Moreover, there is not enough bandwidth to transfer all the data in IFS, whereas in NEMO there is almost no contention. This study is useful for improving the computational efficiency of the model, adapting it to support ultra-high resolution (UHR) experiments and future exascale supercomputers, and helping code developers design new, more machine-independent algorithms.

  10. Evolutionary model selection and parameter estimation for protein-protein interaction network based on differential evolution algorithm

    PubMed Central

    Huang, Lei; Liao, Li; Wu, Cathy H.

    2016-01-01

    Revealing the underlying evolutionary mechanism plays an important role in understanding protein interaction networks in the cell. While many evolutionary models have been proposed, the problem of applying these models to real network data, and especially of determining which model better describes the evolutionary process behind an observed network, remains a pressing challenge. The traditional way is to use a model with presumed parameters to generate a network and then evaluate the fit by summary statistics, which, however, cannot capture complete network structure information or estimate parameter distributions. In this work we developed a novel method based on Approximate Bayesian Computation and modified Differential Evolution (ABC-DEP) that is capable of conducting model selection and parameter estimation simultaneously and of detecting the underlying evolutionary mechanisms more accurately. We tested our method's power to differentiate models and estimate parameters on simulated data and found significant improvement in performance benchmarks compared with a previous method. We further applied our method to real protein interaction network data from human and yeast. Our results show the Duplication Attachment model as the predominant evolutionary mechanism for human PPI networks and the Scale-Free model as the predominant mechanism for yeast PPI networks. PMID:26357273
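
    The foundation that ABC-DEP refines is plain rejection-sampling ABC, which fits in a few lines (toy Gaussian model and tolerance chosen for illustration; the paper's differential-evolution moves are not shown):

```python
import numpy as np

rng = np.random.default_rng(5)
observed_mean = 3.2                       # summary statistic of "real" data

accepted = []
for _ in range(20000):
    theta = rng.uniform(0, 10)            # draw parameter from the prior
    sim = rng.normal(theta, 1.0, size=50) # simulate data under the model
    if abs(sim.mean() - observed_mean) < 0.1:   # keep close matches
        accepted.append(theta)

print(f"posterior mean ~ {np.mean(accepted):.2f} from {len(accepted)} draws")
```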

  11. Inferring population history with DIY ABC: a user-friendly approach to approximate Bayesian computation

    PubMed Central

    Cornuet, Jean-Marie; Santos, Filipe; Beaumont, Mark A.; Robert, Christian P.; Marin, Jean-Michel; Balding, David J.; Guillemaud, Thomas; Estoup, Arnaud

    2008-01-01

    Summary: Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in which scenarios can be customized by the user to fit many complex situations involving any number of populations and samples. Such scenarios involve any combination of population divergences, admixtures and population size changes. DIY ABC can be used to compare competing scenarios, estimate parameters for one or more scenarios and compute bias and precision measures for a given scenario and known values of parameters (the current version applies to unlinked microsatellite data). This article describes key methods used in the program and provides its main features. The analysis of one simulated and one real dataset, both with complex evolutionary scenarios, illustrates the main possibilities of DIY ABC. Availability: The software DIY ABC is freely available at http://www.montpellier.inra.fr/CBGP/diyabc. Contact: j.cornuet@imperial.ac.uk Supplementary information: Supplementary data are also available at http://www.montpellier.inra.fr/CBGP/diyabc PMID:18842597

  12. Evolutionary Study of Interethnic Cooperation

    NASA Astrophysics Data System (ADS)

    Kvasnicka, Vladimir; Pospichal, Jiri

    The purpose of this communication is to present an evolutionary study of cooperation between two ethnic groups. The model used is inspired by the seminal paper of J. D. Fearon and D. D. Laitin (Explaining Interethnic Cooperation, American Political Science Review, 90 (1996), pp. 715-735), where the iterated prisoner's dilemma was used to model intra- and interethnic interactions. We reformulated their approach in the form of an evolutionary prisoner's dilemma method, where a population of strategies is evolved by a simple reproduction process with a Darwinian metaphor of natural selection (the probability of selection for reproduction is proportional to fitness). Our computer simulations show that applying a principle of collective guilt does not lead to the emergence of interethnic cooperation. When an administrator is introduced, an emergence of interethnic cooperation may be observed. Furthermore, if the ethnic groups are of very different sizes, the principle of collective guilt may be devastating for the smaller group, so that intraethnic cooperation is destroyed. The second strategy of cooperation is called personal responsibility, where agents that defected in interethnic interactions are punished inside their own ethnic groups. Unlike the principle of collective guilt, this means there exists only one type of punishment; loosely speaking, agents are punished "personally." All the substantial computational results were checked and interpreted analytically within the theory of evolutionarily stable strategies. Moreover, this theoretical approach offers mechanisms of simple scenarios explaining why particular strategies are or are not stable.
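
    The iterated prisoner's dilemma machinery underlying such models is compact, as the sketch below shows for two classic strategies; the intra-/interethnic population structure, punishment schemes and replicator dynamics of the study are not reproduced:

```python
# Standard PD payoffs: (row player, column player) for C = cooperate,
# D = defect.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opponent_history):
    return opponent_history[-1] if opponent_history else "C"

def always_defect(opponent_history):
    return "D"

h1, h2, s1, s2 = [], [], 0, 0
for _ in range(100):                      # 100 iterated rounds
    a, b = tit_for_tat(h2), always_defect(h1)
    pa, pb = PAYOFF[(a, b)]
    s1, s2 = s1 + pa, s2 + pb
    h1.append(a); h2.append(b)

print("tit-for-tat:", s1, "always-defect:", s2)
```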

  13. Pareto-optimal phylogenetic tree reconciliation

    PubMed Central

    Libeskind-Hadas, Ran; Wu, Yi-Chieh; Bansal, Mukul S.; Kellis, Manolis

    2014-01-01

    Motivation: Phylogenetic tree reconciliation is a widely used method for reconstructing the evolutionary histories of gene families and species, hosts and parasites and other dependent pairs of entities. Reconciliation is typically performed using maximum parsimony, in which each evolutionary event type is assigned a cost and the objective is to find a reconciliation of minimum total cost. It is generally understood that reconciliations are sensitive to event costs, but little is understood about the relationship between event costs and solutions. Moreover, choosing appropriate event costs is a notoriously difficult problem. Results: We address this problem by giving an efficient algorithm for computing Pareto-optimal sets of reconciliations, thus providing the first systematic method for understanding the relationship between event costs and reconciliations. This, in turn, results in new techniques for computing event support values and, for cophylogenetic analyses, performing robust statistical tests. We provide new software tools and demonstrate their use on a number of datasets from evolutionary genomic and cophylogenetic studies. Availability and implementation: Our Python tools are freely available at www.cs.hmc.edu/∼hadas/xscape. Contact: mukul@engr.uconn.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24932009
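
    The notion of a Pareto-optimal set of reconciliations can be illustrated with a brute-force filter over candidate cost pairs; note that the paper's contribution is precisely to compute this set efficiently without enumerating reconciliations, which this sketch does not attempt:

```python
# Each pair is the two-criterion cost of one candidate reconciliation,
# e.g. (duplication count, transfer count); values are invented.
solutions = [(3, 7), (4, 4), (5, 5), (6, 2), (8, 1), (4, 6)]

def pareto_front(points):
    """Keep points not dominated by any other (minimizing both costs)."""
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

print(pareto_front(solutions))   # -> [(3, 7), (4, 4), (6, 2), (8, 1)]
```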

  14. Computational Evolutionary Methodology for Knowledge Discovery and Forecasting in Epidemiology and Medicine

    NASA Astrophysics Data System (ADS)

    Rao, Dhananjai M.; Chernyakhovsky, Alexander; Rao, Victoria

    2008-05-01

    Humanity is facing an increasing number of highly virulent and communicable diseases such as avian influenza. Researchers believe that avian influenza has potential to evolve into one of the deadliest pandemics. Combating these diseases requires in-depth knowledge of their epidemiology. An effective methodology for discovering epidemiological knowledge is to utilize a descriptive, evolutionary, ecological model and use bio-simulations to study and analyze it. These types of bio-simulations fall under the category of computational evolutionary methods because the individual entities participating in the simulation are permitted to evolve in a natural manner by reacting to changes in the simulated ecosystem. This work describes the application of the aforementioned methodology to discover epidemiological knowledge about avian influenza using a novel eco-modeling and bio-simulation environment called SEARUMS. The mathematical principles underlying SEARUMS, its design, and the procedure for using SEARUMS are discussed. The bio-simulations and multi-faceted case studies conducted using SEARUMS elucidate its ability to pinpoint timelines, epicenters, and socio-economic impacts of avian influenza. This knowledge is invaluable for proactive deployment of countermeasures in order to minimize negative socioeconomic impacts, combat the disease, and avert a pandemic.

  16. The tangled bank of amino acids.

    PubMed

    Goldstein, Richard A; Pollock, David D

    2016-07-01

    The use of amino acid substitution matrices to model protein evolution has yielded important insights into both the evolutionary process and the properties of specific protein families. In order to make these models tractable, standard substitution matrices represent the average results of the evolutionary process rather than the underlying molecular biophysics and population genetics, treating proteins as a set of independently evolving sites rather than as an integrated biomolecular entity. With advances in computing and the increasing availability of sequence data, we now have an opportunity to move beyond current substitution matrices to more interpretable mechanistic models with greater fidelity to the evolutionary process of mutation and selection and the holistic nature of the selective constraints. As part of this endeavour, we consider how epistatic interactions induce spatial and temporal rate heterogeneity, and demonstrate how these generally ignored factors can reconcile standard substitution rate matrices and the underlying biology, allowing us to better understand the meaning of these substitution rates. Using computational simulations of protein evolution, we can demonstrate the importance of both spatial and temporal heterogeneity in modelling protein evolution.
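
    The standard machinery the authors build on can be shown in miniature: a substitution rate matrix Q is exponentiated to give substitution probabilities over a branch of length t. The two-state Q below is a toy, not a 20-state amino acid matrix:

```python
import numpy as np
from scipy.linalg import expm

Q = np.array([[-1.0,  1.0],
              [ 0.5, -0.5]])      # rate matrix; rows sum to zero

for t in (0.1, 1.0, 10.0):
    P = expm(Q * t)               # P[i, j] = Pr(state j at t | state i at 0)
    print(f"t = {t}:\n{np.round(P, 3)}")
```

    As t grows the rows of P converge to the stationary distribution, which is exactly the averaged, site-independent behaviour that mechanistic models aim to refine.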

  17. Evolutionary view of acyl-CoA diacylglycerol acyltransferase (DGAT), a key enzyme in neutral lipid biosynthesis.

    PubMed

    Turchetto-Zolet, Andreia C; Maraschin, Felipe S; de Morais, Guilherme L; Cagliari, Alexandro; Andrade, Cláudia M B; Margis-Pinheiro, Marcia; Margis, Rogerio

    2011-09-20

    Triacylglycerides (TAGs) are a class of neutral lipids that represent the most important storage form of energy for eukaryotic cells. DGAT (acyl-CoA: diacylglycerol acyltransferase; EC 2.3.1.20) is a transmembrane enzyme that acts in the final and committed step of TAG synthesis, and it has been proposed to be the rate-limiting enzyme in plant storage lipid accumulation. In fact, two different enzymes identified in several eukaryotic species, DGAT1 and DGAT2, are the main enzymes responsible for TAG synthesis. These enzymes do not share high DNA or protein sequence similarities, and it has been suggested that they play non-redundant roles in different tissues and in some species in TAG synthesis. Despite a number of previous studies on the DGAT1 and DGAT2 genes, which have emphasized their importance as potential obesity treatment targets to increase triacylglycerol accumulation, little is known about their evolutionary timeline in eukaryotes. The goal of this study was to examine the evolutionary relationship of the DGAT1 and DGAT2 genes across eukaryotic organisms in order to infer their origin. We have conducted a broad survey of fully sequenced genomes, including representatives of Amoebozoa, yeasts, fungi, algae, mosses, plants, vertebrate and invertebrate species, for the presence of DGAT1 and DGAT2 gene homologs. We found that the DGAT1 and DGAT2 genes are nearly ubiquitous in eukaryotes and are readily identifiable in all the major eukaryotic groups and genomes examined. Phylogenetic analyses of the DGAT1 and DGAT2 amino acid sequences revealed evolutionary partitioning of the DGAT protein family into two major DGAT1 and DGAT2 clades. Protein secondary structure and hydrophobic-transmembrane analysis also showed differences between these enzymes. The analysis also revealed that the MGAT2 and AWAT genes may have arisen from DGAT2 duplication events. In this study, we identified several DGAT1 and DGAT2 homologs in eukaryote taxa. Overall, the data show that DGAT1 and DGAT2 are present in most eukaryotic organisms and belong to two different gene families. The phylogenetic and evolutionary analyses revealed that DGAT1 and DGAT2 evolved separately, with functional convergence, despite their wide molecular and structural divergence.

  18. Evolutionary view of acyl-CoA diacylglycerol acyltransferase (DGAT), a key enzyme in neutral lipid biosynthesis

    PubMed Central

    2011-01-01

    Background Triacylglycerides (TAGs) are a class of neutral lipids that represent the most important storage form of energy for eukaryotic cells. DGAT (acyl-CoA: diacylglycerol acyltransferase; EC 2.3.1.20) is a transmembrane enzyme that acts in the final and committed step of TAG synthesis, and it has been proposed to be the rate-limiting enzyme in plant storage lipid accumulation. In fact, two different enzymes identified in several eukaryotic species, DGAT1 and DGAT2, are the main enzymes responsible for TAG synthesis. These enzymes do not share high DNA or protein sequence similarities, and it has been suggested that they play non-redundant roles in TAG synthesis in different tissues and in some species. Despite a number of previous studies of the DGAT1 and DGAT2 genes, which have emphasized their importance as potential targets for obesity treatment and for increasing triacylglycerol accumulation, little is known about their evolutionary timeline in eukaryotes. The goal of this study was to examine the evolutionary relationship of the DGAT1 and DGAT2 genes across eukaryotic organisms in order to infer their origin. Results We have conducted a broad survey of fully sequenced genomes, including representatives of Amoebozoa, yeasts, fungi, algae, mosses, plants, and vertebrate and invertebrate species, for the presence of DGAT1 and DGAT2 gene homologs. We found that the DGAT1 and DGAT2 genes are nearly ubiquitous in eukaryotes and are readily identifiable in all the major eukaryotic groups and genomes examined. Phylogenetic analyses of the DGAT1 and DGAT2 amino acid sequences revealed evolutionary partitioning of the DGAT protein family into two major DGAT1 and DGAT2 clades. Protein secondary structure and hydrophobic-transmembrane analysis also showed differences between these enzymes. The analysis also revealed that the MGAT2 and AWAT genes may have arisen from DGAT2 duplication events. Conclusions In this study, we identified several DGAT1 and DGAT2 homologs in eukaryote taxa. Overall, the data show that DGAT1 and DGAT2 are present in most eukaryotic organisms and belong to two different gene families. The phylogenetic and evolutionary analyses revealed that DGAT1 and DGAT2 evolved separately, with functional convergence, despite their wide molecular and structural divergence. PMID:21933415

  19. Evolving binary classifiers through parallel computation of multiple fitness cases.

    PubMed

    Cagnoni, Stefano; Bergenti, Federico; Mordonini, Monica; Adorni, Giovanni

    2005-06-01

    This paper describes two versions of a novel approach to developing binary classifiers, based on two evolutionary computation paradigms: cellular programming and genetic programming. The approach achieves high computational efficiency both during evolution and at runtime. Evolution speed is optimized by allowing multiple solutions to be computed in parallel. Runtime performance is optimized either explicitly, using parallel computation in the case of cellular programming, or implicitly, by exploiting the intrinsic parallelism of bitwise operators on standard sequential architectures in the case of genetic programming. The approach was tested on a digit recognition problem and compared with a reference classifier.
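
    The implicit bitwise parallelism mentioned above is worth making concrete. The following minimal sketch is not the authors' code; the XOR task and the 64-case word width are illustrative assumptions. It packs 64 binary fitness cases into the bits of a single integer, so one pass of bitwise operators evaluates every fitness case simultaneously:

```python
# A minimal sketch of bitwise-parallel fitness evaluation (illustrative, not
# the paper's implementation): 64 binary fitness cases are packed into the
# bits of one integer, so a single application of the evolved expression's
# bitwise operators evaluates all 64 cases at once.
import random

def pack(bits):
    """Pack a list of 0/1 values (LSB first) into one integer."""
    word = 0
    for i, b in enumerate(bits):
        word |= b << i
    return word

random.seed(0)
cases_a = [random.randint(0, 1) for _ in range(64)]   # input 1 of each case
cases_b = [random.randint(0, 1) for _ in range(64)]   # input 2 of each case
targets = [a ^ b for a, b in zip(cases_a, cases_b)]   # toy task: learn XOR

A, B, T = pack(cases_a), pack(cases_b), pack(targets)
MASK = (1 << 64) - 1                                  # confine results to 64 bits

def fitness(program):
    """Number of the 64 fitness cases the evolved program gets right."""
    wrong = (program(A, B) ^ T) & MASK
    return 64 - bin(wrong).count("1")

# One evolved candidate: (a OR b) AND NOT (a AND b), which equals XOR.
candidate = lambda a, b: (a | b) & ~(a & b)
print(fitness(candidate))   # -> 64: every fitness case handled in one evaluation
```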

  20. Computer-facilitated rapid HIV testing in emergency care settings: provider and patient usability and acceptability.

    PubMed

    Spielberg, Freya; Kurth, Ann E; Severynen, Anneleen; Hsieh, Yu-Hsiang; Moring-Parris, Daniel; Mackenzie, Sara; Rothman, Richard

    2011-06-01

    Providers in emergency care settings (ECSs) often face barriers to expanded HIV testing. We undertook formative research to understand the potential utility of a computer tool, "CARE," to facilitate rapid HIV testing in ECSs. Computer tool usability and acceptability were assessed among 35 adult patients, and provider focus groups were held, in two ECSs in Washington State and Maryland. The computer tool was usable by patients of varying computer literacy. Patients appreciated the tool's privacy and lack of judgment and their ability to reflect on HIV risks and create risk-reduction plans. Staff voiced concerns regarding ECS-based HIV testing generally, including resources for follow-up of newly diagnosed people. Computer-delivered HIV testing support was acceptable and usable among low-literacy populations in two ECSs. Such tools may help circumvent some practical barriers associated with routine HIV testing in busy settings, though linkages to care will still be needed.

  1. Computational Intelligence and Its Impact on Future High-Performance Engineering Systems

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler)

    1996-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Intelligence held at the Virginia Consortium of Engineering and Science Universities, Hampton, Virginia, June 27-28, 1995. The presentations addressed activities in the areas of fuzzy logic, neural networks, and evolutionary computation. Workshop attendees represented NASA, the National Science Foundation, the Department of Energy, the National Institute of Standards and Technology (NIST), the Jet Propulsion Laboratory, industry, and academia. The workshop objectives were to assess the state of technology in the computational intelligence area and to provide guidelines for future research.

  2. Explicit Building Block Multiobjective Evolutionary Computation: Methods and Applications

    DTIC Science & Technology

    2005-06-16

    [Extraction fragment from the report's front matter (acronym list and table of contents): references to a concept introduced by Richard Dawkins in his book "The Selfish Gene" [34]; the Pareto Envelope-based Selection Algorithm (PESA) I and II; IGC, Intelligent Gene Collector; OED, Orthogonal Experimental Design; MED, Main Effect...; and a glossary entry noting that the string length held within the computer can be longer than the number of genes.]

  3. The Application of Multiobjective Evolutionary Algorithms to an Educational Computational Model of Science Information Processing: A Computational Experiment in Science Education

    ERIC Educational Resources Information Center

    Lamb, Richard L.; Firestone, Jonah B.

    2017-01-01

    Conflicting explanations and unrelated information in science classrooms increase cognitive load and decrease efficiency in learning. This reduced efficiency ultimately limits one's ability to solve reasoning problems in the sciences. In reasoning, it is the ability of students to sift through and identify critical pieces of information that is of…

  4. Growth Control and Disease Mechanisms in Computational Embryogeny

    NASA Technical Reports Server (NTRS)

    Shapiro, Andrew A.; Yogev, Or; Antonsson, Erik K.

    2008-01-01

    This paper presents a novel approach to applying growth control and disease mechanisms in computational embryogeny. Our method, which mimics fundamental processes from biology, enables individuals to reach maturity in a controlled process within a stochastic environment. Three different mechanisms were implemented: disease mechanisms, gene suppression, and thermodynamic balancing. This approach was integrated into a structural evolutionary model. The model evolved continuum 3-D structures that support an external load. By using these mechanisms we were able to evolve individuals that reached a fixed size limit through the growth process. The growth process was an integral part of the complete development process. The size of the individuals was determined purely by the evolutionary process, with different individuals maturing to different sizes. Individuals that evolved with these characteristics were found to be very robust in supporting a wide range of external loads.

  5. Can An Evolutionary Process Create English Text?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, David H.

    Critics of the conventional theory of biological evolution have asserted that while natural processes might result in some limited diversity, nothing fundamentally new can arise from 'random' evolution. In response, biologists such as Richard Dawkins have demonstrated that a computer program can generate a specific short phrase via evolution-like iterations starting with random gibberish. While such demonstrations are intriguing, they are flawed in that they have a fixed, pre-specified future target, whereas in real biological evolution there is no fixed future target, but only a complicated 'fitness landscape'. In this study, a significantly more sophisticated evolutionary scheme is employed to produce text segments reminiscent of a Charles Dickens novel. The aggregate size of these segments is larger than the computer program and the input Dickens text, even when comparing compressed data (as a measure of information content).

  6. Integration of electromagnetic induction sensor data in soil sampling scheme optimization using simulated annealing.

    PubMed

    Barca, E; Castrignanò, A; Buttafuoco, G; De Benedetto, D; Passarella, G

    2015-07-01

    Soil survey is generally time-consuming, labor-intensive, and costly. Optimization of the sampling scheme allows one to reduce the number of sampling points without decreasing, or even while increasing, the accuracy of the investigated attribute. Maps of bulk soil electrical conductivity (ECa) recorded with electromagnetic induction (EMI) sensors could be effectively used to direct soil sampling design for assessing the spatial variability of soil moisture. A protocol using a field-scale bulk ECa survey has been applied in an agricultural field in the Apulia region (southeastern Italy). Spatial simulated annealing was used as a method to optimize the spatial soil sampling scheme, taking into account sampling constraints, field boundaries, and preliminary observations. Three optimization criteria were used: the first criterion (minimization of the mean of the shortest distances, MMSD) optimizes the spreading of the point observations over the entire field by minimizing the expectation of the distance between an arbitrarily chosen point and its nearest observation; the second criterion (minimization of the weighted mean of the shortest distances, MWMSD) is a weighted version of the MMSD, which uses the digital gradient of the gridded ECa data as the weighting function; and the third criterion (mean of average ordinary kriging variance, MAOKV) minimizes the mean kriging estimation variance of the target variable. The last criterion uses the variogram model of soil water content estimated in a previous trial. The procedures, or a combination of them, were tested and compared in a real case. Simulated annealing was implemented with the software MSANOS, which is able to define or redesign any sampling scheme by increasing or decreasing the original sampling locations. The output consists of the computed sampling scheme, the convergence time, and the cooling law, which can be an invaluable support to the process of sampling design. The proposed approach found the optimal solution in a reasonable computation time. The use of the bulk ECa gradient as an exhaustive variable, known at every node of an interpolation grid, allowed optimization of the sampling scheme, distinguishing among areas with different priority levels.
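
    As a rough illustration of spatial simulated annealing under the MMSD criterion, the sketch below is not the MSANOS software; the field size, sample count, and cooling schedule are invented for illustration. One sampling point at a time is perturbed, and moves that reduce the mean distance from grid nodes to their nearest sample are accepted (worse moves are occasionally accepted at high temperature):

```python
# A minimal sketch of spatial simulated annealing with the MMSD criterion
# (illustrative parameters, not the MSANOS implementation).
import math, random

random.seed(1)
grid = [(x, y) for x in range(20) for y in range(20)]   # field discretization
samples = [(random.uniform(0, 19), random.uniform(0, 19)) for _ in range(10)]

def mmsd(pts):
    """Mean, over all grid nodes, of the distance to the nearest sample."""
    return sum(min(math.hypot(gx - px, gy - py) for px, py in pts)
               for gx, gy in grid) / len(grid)

T, cooling = 2.0, 0.95          # initial temperature and geometric cooling law
cost = mmsd(samples)
for step in range(2000):
    i = random.randrange(len(samples))
    old = samples[i]
    # propose a bounded random displacement of one sampling location
    samples[i] = (min(19, max(0, old[0] + random.uniform(-1, 1))),
                  min(19, max(0, old[1] + random.uniform(-1, 1))))
    new_cost = mmsd(samples)
    if new_cost < cost or random.random() < math.exp((cost - new_cost) / T):
        cost = new_cost          # accept: always if better, sometimes if worse
    else:
        samples[i] = old         # reject and restore the previous location
    T *= cooling ** (1 / 100)    # slow cooling spread over the run
print(round(cost, 3))            # final mean shortest distance
```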

  7. Quantitative comparison of catalytic mechanisms and overall reactions in convergently evolved enzymes: implications for classification of enzyme function.

    PubMed

    Almonacid, Daniel E; Yera, Emmanuel R; Mitchell, John B O; Babbitt, Patricia C

    2010-03-12

    Functionally analogous enzymes are those that catalyze similar reactions on similar substrates but do not share common ancestry, providing a window on the different structural strategies nature has used to evolve required catalysts. Identification and use of this information to improve reaction classification and computational annotation of enzymes newly discovered in the genome projects would benefit from systematic determination of reaction similarities. Here, we quantified similarity in bond changes for overall reactions and catalytic mechanisms for 95 pairs of functionally analogous enzymes (non-homologous enzymes with identical first three numbers of their EC codes) from the MACiE database. Similarity of overall reactions was computed by comparing the sets of bond changes in the transformations from substrates to products. For similarity of mechanisms, sets of bond changes occurring in each mechanistic step were compared; these similarities were then used to guide global and local alignments of mechanistic steps. Using this metric, only 44% of pairs of functionally analogous enzymes in the dataset had significantly similar overall reactions. For these enzymes, convergence to the same mechanism occurred in 33% of cases, with most pairs having at least one identical mechanistic step. Using our metric, overall reaction similarity serves as an upper bound for mechanistic similarity in functional analogs. For example, the four carbon-oxygen lyases acting on phosphates (EC 4.2.3) show neither significant overall reaction similarity nor significant mechanistic similarity. By contrast, the three carboxylic-ester hydrolases (EC 3.1.1) catalyze overall reactions with identical bond changes and have converged to almost identical mechanisms. The large proportion of enzyme pairs that do not show significant overall reaction similarity (56%) suggests that at least for the functionally analogous enzymes studied here, more stringent criteria could be used to refine definitions of EC sub-subclasses for improved discrimination in their classification of enzyme reactions. The results also indicate that mechanistic convergence of reaction steps is widespread, suggesting that quantitative measurement of mechanistic similarity can inform approaches for functional annotation.
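
    To make the set-comparison idea concrete, a similarity between two overall reactions can be computed as the Jaccard index of their bond-change sets. The sketch below is illustrative only; the bond-change encoding and the example reactions are invented stand-ins, not the MACiE representation:

```python
# Illustrative comparison of overall reactions by their sets of bond changes:
# each reaction is summarized as the set of (bond, event) pairs it involves,
# and similarity is the Jaccard index of the two sets.
def reaction_similarity(bond_changes_1, bond_changes_2):
    s1, s2 = set(bond_changes_1), set(bond_changes_2)
    if not s1 and not s2:
        return 1.0
    return len(s1 & s2) / len(s1 | s2)

# Hypothetical bond changes, e.g. a C-O bond broken during hydrolysis:
ester_hydrolysis_a = [("C-O", "broken"), ("O-H", "formed"), ("C-O", "formed")]
ester_hydrolysis_b = [("C-O", "broken"), ("O-H", "formed"), ("C-O", "formed")]
phosphate_lyase    = [("C-O", "broken"), ("C=C", "formed")]

print(reaction_similarity(ester_hydrolysis_a, ester_hydrolysis_b))  # 1.0
print(reaction_similarity(ester_hydrolysis_a, phosphate_lyase))     # 0.25
```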

  8. Quantitative Comparison of Catalytic Mechanisms and Overall Reactions in Convergently Evolved Enzymes: Implications for Classification of Enzyme Function

    PubMed Central

    Almonacid, Daniel E.; Yera, Emmanuel R.; Mitchell, John B. O.; Babbitt, Patricia C.

    2010-01-01

    Functionally analogous enzymes are those that catalyze similar reactions on similar substrates but do not share common ancestry, providing a window on the different structural strategies nature has used to evolve required catalysts. Identification and use of this information to improve reaction classification and computational annotation of enzymes newly discovered in the genome projects would benefit from systematic determination of reaction similarities. Here, we quantified similarity in bond changes for overall reactions and catalytic mechanisms for 95 pairs of functionally analogous enzymes (non-homologous enzymes with identical first three numbers of their EC codes) from the MACiE database. Similarity of overall reactions was computed by comparing the sets of bond changes in the transformations from substrates to products. For similarity of mechanisms, sets of bond changes occurring in each mechanistic step were compared; these similarities were then used to guide global and local alignments of mechanistic steps. Using this metric, only 44% of pairs of functionally analogous enzymes in the dataset had significantly similar overall reactions. For these enzymes, convergence to the same mechanism occurred in 33% of cases, with most pairs having at least one identical mechanistic step. Using our metric, overall reaction similarity serves as an upper bound for mechanistic similarity in functional analogs. For example, the four carbon-oxygen lyases acting on phosphates (EC 4.2.3) show neither significant overall reaction similarity nor significant mechanistic similarity. By contrast, the three carboxylic-ester hydrolases (EC 3.1.1) catalyze overall reactions with identical bond changes and have converged to almost identical mechanisms. The large proportion of enzyme pairs that do not show significant overall reaction similarity (56%) suggests that at least for the functionally analogous enzymes studied here, more stringent criteria could be used to refine definitions of EC sub-subclasses for improved discrimination in their classification of enzyme reactions. The results also indicate that mechanistic convergence of reaction steps is widespread, suggesting that quantitative measurement of mechanistic similarity can inform approaches for functional annotation. PMID:20300652

  9. The Diagnostic Utility of Computer-Assisted Auscultation for the Early Detection of Cardiac Murmurs of Structural Origin in the Periodic Health Evaluation

    PubMed Central

    Viviers, Pierre L.; Kirby, Jo-Anne H.; Viljoen, Jeandré T.; Derman, Wayne

    2017-01-01

    Background: Identification of the nature of cardiac murmurs during the periodic health evaluation (PHE) of athletes is challenging due to the difficulty in distinguishing between murmurs of physiological or structural origin. Previously, computer-assisted auscultation (CAA) has shown promise to support appropriate referrals in the nonathlete pediatric population. Hypothesis: CAA has the ability to accurately detect cardiac murmurs of structural origin during a PHE in collegiate athletes. Study Design: Cross-sectional, descriptive study. Level of Evidence: Level 3. Methods: A total of 131 collegiate athletes (104 men, 28 women; mean age, 20 ± 2 years) completed a sports physician (SP)–driven PHE consisting of a cardiac history questionnaire and a physical examination. An independent CAA assessment was performed by a technician who was blinded to the SP findings. Athletes with suspected structural murmurs or other clinical reasons for concern were referred to a cardiologist for confirmatory echocardiography (EC). Results: Twenty-five athletes were referred for further investigation (17 murmurs, 6 abnormal electrocardiograms, 1 displaced apex, and 1 possible case of Marfan syndrome). EC confirmed 3 structural and 22 physiological murmurs. The SP flagged 5 individuals with possible underlying structural pathology; 2 of these murmurs were confirmed as structural in nature. Fourteen murmurs were referred by CAA; 3 of these were confirmed as structural in origin by EC. One of these murmurs was not detected by the SP but was detected by CAA. The sensitivity of CAA was 100% compared with 66.7% shown by the SP, while specificity was 50% and 66.7%, respectively. Conclusion: CAA shows potential to be a feasible adjunct for improving the identification of structural murmurs in the athlete population. Over-referral by CAA for EC requires further investigation and possible refinements to the current algorithm. Further studies are needed to determine the true sensitivity, specificity, and cost efficacy of the device among the athletic population. Clinical Relevance: CAA may be a useful cardiac screening adjunct during the PHE of athletes, particularly as it may guide appropriate referral of suspected structural murmurs for further investigation. PMID:28661830

  10. The Diagnostic Utility of Computer-Assisted Auscultation for the Early Detection of Cardiac Murmurs of Structural Origin in the Periodic Health Evaluation.

    PubMed

    Viviers, Pierre L; Kirby, Jo-Anne H; Viljoen, Jeandré T; Derman, Wayne

    Identification of the nature of cardiac murmurs during the periodic health evaluation (PHE) of athletes is challenging due to the difficulty in distinguishing between murmurs of physiological or structural origin. Previously, computer-assisted auscultation (CAA) has shown promise to support appropriate referrals in the nonathlete pediatric population. CAA has the ability to accurately detect cardiac murmurs of structural origin during a PHE in collegiate athletes. Cross-sectional, descriptive study. Level 3. A total of 131 collegiate athletes (104 men, 28 women; mean age, 20 ± 2 years) completed a sports physician (SP)-driven PHE consisting of a cardiac history questionnaire and a physical examination. An independent CAA assessment was performed by a technician who was blinded to the SP findings. Athletes with suspected structural murmurs or other clinical reasons for concern were referred to a cardiologist for confirmatory echocardiography (EC). Twenty-five athletes were referred for further investigation (17 murmurs, 6 abnormal electrocardiograms, 1 displaced apex, and 1 possible case of Marfan syndrome). EC confirmed 3 structural and 22 physiological murmurs. The SP flagged 5 individuals with possible underlying structural pathology; 2 of these murmurs were confirmed as structural in nature. Fourteen murmurs were referred by CAA; 3 of these were confirmed as structural in origin by EC. One of these murmurs was not detected by the SP but was detected by CAA. The sensitivity of CAA was 100% compared with 66.7% shown by the SP, while specificity was 50% and 66.7%, respectively. CAA shows potential to be a feasible adjunct for improving the identification of structural murmurs in the athlete population. Over-referral by CAA for EC requires further investigation and possible refinements to the current algorithm. Further studies are needed to determine the true sensitivity, specificity, and cost efficacy of the device among the athletic population. CAA may be a useful cardiac screening adjunct during the PHE of athletes, particularly as it may guide appropriate referral of suspected structural murmurs for further investigation.

  11. Evolution and Vaccination of Influenza Virus.

    PubMed

    Lam, Ham Ching; Bi, Xuan; Sreevatsan, Srinand; Boley, Daniel

    2017-08-01

    In this study, we present an application paradigm in which an unsupervised machine learning approach is applied to high-dimensional influenza genetic sequences to investigate whether vaccination is a driving force in the evolution of the influenza virus. We first used a visualization approach to visualize the evolutionary paths of vaccine-controlled and non-vaccine-controlled influenza viruses in a low-dimensional space. We then quantified the differences between their evolutionary trajectories through the computation of within- and between-scatter matrices to provide statistical confidence supporting the visualization results. We used the influenza surface hemagglutinin (HA) gene for this study, as the HA gene is the major target of the immune system. The visualization is achieved without using any clustering methods or prior information about the influenza sequences. Our results clearly showed that the evolutionary trajectories of vaccine-controlled and non-vaccine-controlled influenza viruses are different, and that vaccination as a driving force of evolution cannot be ruled out.
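
    The within- and between-scatter computation mentioned above can be sketched as follows. This is a hedged toy example: the two-dimensional embeddings and group locations are synthetic stand-ins for the low-dimensional projections of HA sequences:

```python
# Toy sketch of within-class and between-class scatter matrices for two
# groups of points (synthetic stand-ins for embedded HA sequences).
import numpy as np

rng = np.random.default_rng(0)
vaccine    = rng.normal(loc=[0, 0], scale=0.5, size=(50, 2))  # toy trajectory 1
no_vaccine = rng.normal(loc=[2, 1], scale=0.5, size=(50, 2))  # toy trajectory 2

def scatter_matrices(groups):
    overall_mean = np.vstack(groups).mean(axis=0)
    d = groups[0].shape[1]
    S_w = np.zeros((d, d))   # within-class scatter
    S_b = np.zeros((d, d))   # between-class scatter
    for X in groups:
        mu = X.mean(axis=0)
        centered = X - mu
        S_w += centered.T @ centered
        diff = (mu - overall_mean).reshape(-1, 1)
        S_b += len(X) * (diff @ diff.T)
    return S_w, S_b

S_w, S_b = scatter_matrices([vaccine, no_vaccine])
# A large between/within trace ratio indicates well-separated trajectories.
print(np.trace(S_b) / np.trace(S_w))
```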

  12. The evolutionary dynamics of language.

    PubMed

    Steels, Luc; Szathmáry, Eörs

    2018-02-01

    The well-established framework of evolutionary dynamics can be applied to the fascinating open problems of how human brains are able to acquire and adapt language and how languages change in a population. Schemas for handling grammatical constructions are the replicating unit. They emerge and multiply with variation in the brains of individuals and undergo selection based on their contribution to needed expressive power, communicative success, and the reduction of cognitive effort. Adopting this perspective has two major benefits. (i) It builds a bridge to neurobiological models of the brain that have also adopted an evolutionary dynamics point of view, thus opening a new horizon for studying how human brains achieve the remarkably complex competence for language. (ii) It suggests a new foundation for studying cultural language change as an evolutionary dynamics process. The paper sketches this novel perspective, provides references to empirical data and computational experiments, and points to open problems. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  13. Theoretical Foundation of the RelTime Method for Estimating Divergence Times from Variable Evolutionary Rates

    PubMed Central

    Tamura, Koichiro; Tao, Qiqing; Kumar, Sudhir

    2018-01-01

    RelTime estimates divergence times by relaxing the assumption of a strict molecular clock in a phylogeny. It shows excellent performance in estimating divergence times for both simulated and empirical molecular sequence data sets in which evolutionary rates varied extensively throughout the tree. RelTime is computationally efficient and scales well with increasing size of data sets. Until now, however, RelTime has not had a formal mathematical foundation. Here, we show that the basis of the RelTime approach is a relative rate framework (RRF) that combines comparisons of evolutionary rates in sister lineages with the principle of minimum rate change between evolutionary lineages and their respective descendants. We present analytical solutions for estimating relative lineage rates and divergence times under RRF. We also discuss the relationship of RRF with other approaches, including the Bayesian framework. We conclude that RelTime will be useful for phylogenies with branch lengths derived not only from molecular data, but also morphological and biochemical traits. PMID:29893954

  14. Evolutionary Algorithms for Boolean Functions in Diverse Domains of Cryptography.

    PubMed

    Picek, Stjepan; Carlet, Claude; Guilley, Sylvain; Miller, Julian F; Jakobovic, Domagoj

    2016-01-01

    The role of Boolean functions is prominent in several areas including cryptography, sequences, and coding theory. Therefore, various methods for the construction of Boolean functions with desired properties are of direct interest. New motivations on the role of Boolean functions in cryptography with attendant new properties have emerged over the years. There are still many combinations of design criteria left unexplored and in this matter evolutionary computation can play a distinct role. This article concentrates on two scenarios for the use of Boolean functions in cryptography. The first uses Boolean functions as the source of the nonlinearity in filter and combiner generators. Although relatively well explored using evolutionary algorithms, it still presents an interesting goal in terms of the practical sizes of Boolean functions. The second scenario appeared rather recently where the objective is to find Boolean functions that have various orders of the correlation immunity and minimal Hamming weight. In both these scenarios we see that evolutionary algorithms are able to find high-quality solutions where genetic programming performs the best.

  15. Evolutionary Construction of Block-Based Neural Networks in Consideration of Failure

    NASA Astrophysics Data System (ADS)

    Takamori, Masahito; Koakutsu, Seiichi; Hamagami, Tomoki; Hirata, Hironori

    In this paper we propose a modified gene coding scheme and a failure-aware evolutionary construction method for Block-Based Neural Networks (BBNNs). In the modified gene coding, the weight genes are arranged on the chromosome according to the positional relationship between the weight and structure genes, which increases the efficiency of search by crossover. This is expected to improve the convergence rate of construction and shorten construction time. In the failure-aware evolutionary construction, a structure adapted to failure is built in the state where the failure has occurred, so that a BBNN can be reconstructed in a short time when a failure occurs. To evaluate the proposed method, we apply it to pattern classification and autonomous mobile robot control problems. The computational experiments indicate that the proposed method can improve the convergence rate of construction and shorten construction and reconstruction times.

  16. XTALOPT: An open-source evolutionary algorithm for crystal structure prediction

    NASA Astrophysics Data System (ADS)

    Lonie, David C.; Zurek, Eva

    2011-02-01

    The implementation and testing of XTALOPT, an evolutionary algorithm for crystal structure prediction, is outlined. We present our new periodic displacement (ripple) operator, which is ideally suited to extended systems. It is demonstrated that hybrid operators, which combine two pure operators, reduce the number of duplicate structures in the search. This allows for better exploration of the potential energy surface of the system in question, while simultaneously zooming in on the most promising regions. A continuous workflow, which makes better use of computational resources than traditional generation-based algorithms, is employed. Various parameters in XTALOPT are optimized using a novel benchmarking scheme. XTALOPT is available under the GNU Public License, has been interfaced with various codes commonly used to study extended systems, and has an easy-to-use, intuitive graphical interface. Program summary: Program title: XTALOPT. Catalogue identifier: AEGX_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGX_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GPL v2.1 or later [1]. No. of lines in distributed program, including test data, etc.: 36 849. No. of bytes in distributed program, including test data, etc.: 1 149 399. Distribution format: tar.gz. Programming language: C++. Computer: PCs, workstations, or clusters. Operating system: Linux. Classification: 7.7. External routines: QT [2], OpenBabel [3], AVOGADRO [4], SPGLIB [8], and one of VASP [5], PWSCF [6], GULP [7]. Nature of problem: Predicting the crystal structure of a system from its stoichiometry alone remains a grand challenge in computational materials science, chemistry, and physics. Solution method: Evolutionary algorithms are stochastic search techniques which use concepts from biological evolution in order to locate the global minimum on the potential energy surface. Our evolutionary algorithm, XTALOPT, is freely available to the scientific community for use and collaboration under the GNU Public License. Running time: User dependent. The program runs until stopped by the user.
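
    As a toy reading of the periodic displacement (ripple) idea, a mutation can displace each atom's fractional coordinate by a smooth wave that respects the cell's periodicity. This sketch is not XTALOPT's C++ implementation; the amplitude, wave numbers, and cosine form are illustrative assumptions:

```python
# Toy periodic-displacement ("ripple") mutation on fractional coordinates:
# each atom's z-coordinate is shifted by a sinusoidal function of x and y,
# so the perturbation is continuous across periodic cell boundaries.
import math, random

def ripple(fractional_coords, amplitude=0.05, waves_x=1, waves_y=1):
    """Apply a smooth periodic displacement to a list of (x, y, z) coords."""
    phase_x = random.uniform(0, 2 * math.pi)
    phase_y = random.uniform(0, 2 * math.pi)
    out = []
    for x, y, z in fractional_coords:
        dz = amplitude * math.cos(2 * math.pi * waves_x * x + phase_x) \
                       * math.cos(2 * math.pi * waves_y * y + phase_y)
        out.append((x, y, (z + dz) % 1.0))  # wrap back into the unit cell
    return out

random.seed(4)
cell = [(random.random(), random.random(), random.random()) for _ in range(8)]
print(ripple(cell)[:2])   # two mutated atomic positions
```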

  17. Redundancy and Replication Help Make Your Systems Stress-Free

    ERIC Educational Resources Information Center

    Mitchell, Erik

    2011-01-01

    In mid-April, Amazon EC2 services had a small problem. Apparently, a large swath of its cloud computing environment had such substantial trouble that a number of customers had server issues. A number of high-profile sites, including Reddit, Evite, and Foursquare, went down when Amazon experienced issues in their US East 1a region (Justinb 2011).…

  18. Computer Aided Wirewrap Interconnect.

    DTIC Science & Technology

    1980-11-01

    [OCR fragment from the report's figures and table of contents: emitter-coupled logic (ECL) system clock (approximately 180 MHz) generated via a ring oscillator; oscillograms of the clock waveform, synchronous phase-0 output, binary counter, and loaded power-plane noise; distribution of high-speed clock signals into logic panels in a multiboard system; on-board clock distribution via fanout.]

  19. Comparison of Expert-Based and Empirical Evaluation Methodologies in the Case of a CBL Environment: The ''Orestis'' Experience

    ERIC Educational Resources Information Center

    Karoulis, Athanasis; Demetriadis, Stavros; Pombortsis, Andreas

    2006-01-01

    This paper compares several interface evaluation methods applied in the case of a computer-based learning (CBL) environment, during a longitudinal study performed in three European countries, Greece, Germany, and Holland, and within the framework of an EC-funded Leonardo da Vinci program. The paper first considers the particularities of the CBL…

  20. An Elliptic Curve Based Schnorr Cloud Security Model in Distributed Environment

    PubMed Central

    Muthurajan, Vinothkumar; Narayanasamy, Balaji

    2016-01-01

    Cloud computing requires security upgrades in data transmission approaches. In general, key-based encryption/decryption (symmetric and asymmetric) mechanisms ensure secure data transfer between devices. Symmetric key mechanisms (pseudorandom functions) provide a lower protection level than asymmetric key (RSA, AES, and ECC) schemes. The presence of expired content and irrelevant resources can lead to unauthorized data access. This paper investigates how integrity and secure data transfer are improved based on the Elliptic Curve based Schnorr scheme. This paper proposes a virtual machine based cloud model with a Hybrid Cloud Security Algorithm (HCSA) to remove expired content. The HCSA-based auditing improves malicious-activity prediction during data transfer. Duplication in the cloud server degrades the performance of EC-Schnorr based encryption schemes. This paper utilizes the Bloom filter concept to avoid cloud server duplication. The combination of EC-Schnorr and the Bloom filter efficiently improves security performance. The comparative analysis between the proposed HCSA and the existing Distributed Hash Table (DHT) regarding execution time, computational overhead, and auditing time with auditing requests and servers confirms the effectiveness of HCSA in the cloud security model creation. PMID:26981584

  1. An Elliptic Curve Based Schnorr Cloud Security Model in Distributed Environment.

    PubMed

    Muthurajan, Vinothkumar; Narayanasamy, Balaji

    2016-01-01

    Cloud computing requires security upgrades in data transmission approaches. In general, key-based encryption/decryption (symmetric and asymmetric) mechanisms ensure secure data transfer between devices. Symmetric key mechanisms (pseudorandom functions) provide a lower protection level than asymmetric key (RSA, AES, and ECC) schemes. The presence of expired content and irrelevant resources can lead to unauthorized data access. This paper investigates how integrity and secure data transfer are improved based on the Elliptic Curve based Schnorr scheme. This paper proposes a virtual machine based cloud model with a Hybrid Cloud Security Algorithm (HCSA) to remove expired content. The HCSA-based auditing improves malicious-activity prediction during data transfer. Duplication in the cloud server degrades the performance of EC-Schnorr based encryption schemes. This paper utilizes the Bloom filter concept to avoid cloud server duplication. The combination of EC-Schnorr and the Bloom filter efficiently improves security performance. The comparative analysis between the proposed HCSA and the existing Distributed Hash Table (DHT) regarding execution time, computational overhead, and auditing time with auditing requests and servers confirms the effectiveness of HCSA in the cloud security model creation.
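
    The duplicate-detection step can be illustrated with a standard Bloom filter. This is a minimal sketch: the bit-array size, hash count, and block names are invented, and the paper's exact construction may differ. Before storing a data block, the server checks k hash positions; if all bits are already set, the block is flagged as a probable duplicate:

```python
# Minimal Bloom filter for flagging probable duplicate blocks before storage
# (illustrative parameters; not the paper's exact construction).
import hashlib

class BloomFilter:
    def __init__(self, m_bits=1 << 16, k_hashes=4):
        self.m, self.k = m_bits, k_hashes
        self.bits = bytearray(m_bits // 8)

    def _positions(self, item: bytes):
        # derive k independent positions by salting SHA-256 with the index
        for i in range(self.k):
            h = hashlib.sha256(i.to_bytes(2, "big") + item).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item: bytes):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def probably_contains(self, item: bytes) -> bool:
        # True may be a false positive; False is always correct
        return all(self.bits[p // 8] >> (p % 8) & 1 for p in self._positions(item))

store = BloomFilter()
store.add(b"encrypted-block-001")
print(store.probably_contains(b"encrypted-block-001"))  # True (probable duplicate)
print(store.probably_contains(b"encrypted-block-002"))  # False (definitely new)
```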

  2. Stochastic acceleration of electrons. I - Effects of collisions in solar flares

    NASA Technical Reports Server (NTRS)

    Hamilton, Russell J.; Petrosian, Vahe

    1992-01-01

    Stochastic acceleration of thermal electrons to nonrelativistic energies is studied under solar flare conditions. We show that, in turbulent regions, electron-whistler wave interactions can result in the acceleration of electrons in times comparable to or shorter than the Coulomb collision time. The kinetic equation describing the evolution of the electron energy distribution, including stochastic acceleration by whistlers and energy loss via Coulomb interactions, is solved for an initial thermal electron energy spectrum. In general, the shape of the resulting electron distributions is characterized by the energy E_c at which systematic energy gain from turbulence equals energy loss due to Coulomb collisions. For energies less than E_c, the spectra are steep (quasi-thermal), whereas above E_c, the spectra are power laws. We find that hard X-ray spectra computed using the electron distributions obtained from our numerical simulations are able to explain the complex spectral shapes and variations observed in impulsive hard X-ray bursts. In particular, we show that the gradual steepening observed by Lin et al. (1981) could be due to a systematic increase in the density of the plasma (due to evaporation) and the increasing importance of collisions, instead of the appearance of a superhot thermal component.

  3. The many places of frequency: evidence for a novel locus of the lexical frequency effect in word production.

    PubMed

    Knobel, Mark; Finkbeiner, Matthew; Caramazza, Alfonso

    2008-03-01

    The effect of lexical frequency on language-processing tasks is exceptionally reliable. For example, pictures with higher frequency names are named faster and more accurately than those with lower frequency names. Experiments with normal participants and patients strongly suggest that this production effect arises at the level of lexical access. Further work has suggested that within lexical access this effect arises at the level of lexical representations. Here we present patient E.C. who shows an effect of lexical frequency on his nonword error rate. The best explanation of his performance is that there is an additional locus of frequency at the interface of lexical and segmental representational levels. We confirm this hypothesis by showing that only computational models with frequency at this new locus can produce a similar error pattern to that of patient E.C. Finally, in an analysis of a large group of Italian patients, we show that there exist patients who replicate E.C.'s pattern of results and others who show the complementary pattern of frequency effects on semantic error rates. Our results combined with previous findings suggest that frequency plays a role throughout the process of lexical access.

  4. Original non-stationary eddy current imaging process for the evaluation of defects in metallic structures

    NASA Astrophysics Data System (ADS)

    Placko, Dominique; Bore, Thierry; Rivollet, Alain; Joubert, Pierre-Yves

    2015-10-01

    This paper deals with the problem of imaging defects in metallic structures through eddy current (EC) inspections, and proposes an original process for a possible tomographical crack evaluation. This process is based on a semi-analytical modeling approach, called the "distributed point source method" (DPSM), which is used to describe and equate the interactions between the EC probes and the structure under test. Several steps are successively described, illustrating the feasibility of this new imaging process dedicated to the quantitative evaluation of defects. The imaging process first creates a 3D grid by meshing the volume potentially inspected by the sensor, yielding a given number of elemental volumes (voxels). Second, DPSM modeling is used to compute an image for each configuration in which a single voxel has a conductivity different from all the others. The assumption is that a real defect can be represented by a superposition of elemental voxels; the resulting accuracy naturally depends on the density of the spatial sampling. In addition, the excitation device of the EC imager can be oriented in several directions and driven by an excitation current of variable frequency, so the simulation is performed for several frequencies and directions of the eddy currents induced in the structure, which increases the signal entropy. All these results are merged into a so-called "observation matrix" containing all the probe/structure interaction configurations. This matrix is then used in an inversion scheme to evaluate the defect location and geometry. The modeled EC data provided by the DPSM are compared with experimental images provided by an eddy current imager (ECI) applied to aluminum plates containing buried defects. To validate the proposed inversion process, we feed it with computed images from various acquisition configurations. Additive noise was applied to the images so that they are more representative of actual EC data. In the case of simple notch-type defects, for which the relative conductivity may take only two extreme values (1 or 0), a threshold was introduced on the inverted images in a post-processing step, taking advantage of a priori knowledge of the statistical properties of the restored images. This threshold enhanced the image contrast and helped eliminate both the residual noise and pixels with non-realistic values.
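
    The observation-matrix inversion described above is, at its core, a regularized linear solve followed by thresholding. The schematic below is a sketch only; the matrix sizes, noise level, and ridge regularizer are invented for illustration:

```python
# Schematic inversion with an observation matrix: A maps a voxel "defect
# indicator" vector x to modeled EC responses y; a ridge-regularized
# least-squares solve plus a-priori thresholding recovers a binary defect map.
import numpy as np

rng = np.random.default_rng(2)
n_meas, n_vox = 120, 60                 # probe configurations x voxels
A = rng.normal(size=(n_meas, n_vox))    # one column per single-voxel simulation

x_true = np.zeros(n_vox)
x_true[25:30] = 1.0                     # a notch-type defect: a run of voxels
y = A @ x_true + 0.05 * rng.normal(size=n_meas)   # noisy "measured" data

lam = 1.0                               # ridge regularization weight
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_vox), A.T @ y)
defect_map = (x_hat > 0.5).astype(int)  # conductivity is known to be 0 or 1
print(np.flatnonzero(defect_map))       # -> indices near 25..29
```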

  5. Accuracy of computer-aided diagnosis based on narrow-band imaging endocytoscopy for diagnosing colorectal lesions: comparison with experts.

    PubMed

    Misawa, Masashi; Kudo, Shin-Ei; Mori, Yuichi; Takeda, Kenichi; Maeda, Yasuharu; Kataoka, Shinichi; Nakamura, Hiroki; Kudo, Toyoki; Wakamura, Kunihiko; Hayashi, Takemasa; Katagiri, Atsushi; Baba, Toshiyuki; Ishida, Fumio; Inoue, Haruhiro; Nimura, Yukitaka; Oda, Masahiro; Mori, Kensaku

    2017-05-01

    Real-time characterization of colorectal lesions during colonoscopy is important for reducing medical costs, given that the need for a pathological diagnosis can be omitted if the accuracy of the diagnostic modality is sufficiently high. However, it is sometimes difficult for community-based gastroenterologists to achieve the required level of diagnostic accuracy. In this regard, we developed a computer-aided diagnosis (CAD) system based on endocytoscopy (EC) to evaluate cellular, glandular, and vessel structure atypia in vivo. The purpose of this study was to compare the diagnostic ability and efficacy of this CAD system with the performances of human expert and trainee endoscopists. We developed a CAD system based on EC with narrow-band imaging that allowed microvascular evaluation without dye (ECV-CAD). The CAD algorithm was programmed based on texture analysis and provided a two-class diagnosis of neoplastic or non-neoplastic, with probabilities. We validated the diagnostic ability of the ECV-CAD system using 173 randomly selected EC images (49 non-neoplasms, 124 neoplasms). The images were evaluated by the CAD and by four expert endoscopists and three trainees. The diagnostic accuracies for distinguishing between neoplasms and non-neoplasms were calculated. ECV-CAD had higher overall diagnostic accuracy than trainees (87.8 vs 63.4%; [Formula: see text]), but similar to experts (87.8 vs 84.2%; [Formula: see text]). With regard to high-confidence cases, the overall accuracy of ECV-CAD was also higher than trainees (93.5 vs 71.7%; [Formula: see text]) and comparable to experts (93.5 vs 90.8%; [Formula: see text]). ECV-CAD showed better diagnostic accuracy than trainee endoscopists and was comparable to that of experts. ECV-CAD could thus be a powerful decision-making tool for less-experienced endoscopists.
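
    A hedged sketch of the texture-analysis idea follows; the actual ECV-CAD feature set and classifier are not specified here, so simple gray-level statistics and a logistic model serve as stand-ins. It shows how a two-class diagnosis with an accompanying probability can be produced from image texture:

```python
# Illustrative two-class texture classification with probabilities
# (stand-in features and classifier, not the ECV-CAD algorithm).
import numpy as np

rng = np.random.default_rng(5)

def texture_features(img):
    """Crude texture descriptors: mean, variance, and mean absolute
    difference between horizontally adjacent pixels (a contrast proxy)."""
    return np.array([img.mean(), img.var(), np.abs(np.diff(img, axis=1)).mean()])

# Synthetic "EC images": the neoplastic class gets higher variance/contrast.
labels = rng.integers(0, 2, 200)                    # 0 non-neoplastic, 1 neoplastic
imgs = [rng.normal(0.5, 0.1 + 0.2 * y, (32, 32)) for y in labels]
X = np.array([texture_features(im) for im in imgs])
mu, sd = X.mean(0), X.std(0)
Xn = (X - mu) / sd                                  # standardize the features

w, b = np.zeros(3), 0.0
for _ in range(500):                                # plain gradient-descent fit
    p = 1 / (1 + np.exp(-(Xn @ w + b)))
    w -= 0.1 * Xn.T @ (p - labels) / len(Xn)
    b -= 0.1 * (p - labels).mean()

test = (texture_features(rng.normal(0.5, 0.3, (32, 32))) - mu) / sd
print(1 / (1 + np.exp(-(test @ w + b))))            # P(lesion is neoplastic)
```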

  6. Using mushroom farm and anaerobic digestion wastewaters as supplemental fertilizer sources for growing container nursery stock in a closed system.

    PubMed

    Chong, C; Purvis, P; Lumis, G; Holbein, B E; Voroney, R P; Zhou, H; Liu, H-W; Alam, M Z

    2008-04-01

    Wastewaters from farm and composting operations are often rich in select nutrients that can potentially be reutilized in crop production. Liners of silverleaf dogwood (Cornus alba L. 'Argenteo-marginata'), common ninebark [Physocarpus opulifolius (L.) Maxim.], and Anthony Waterer spirea (Spiraea × bumalda Burvénich 'Anthony Waterer') were grown in 6 L containers filled with a bark-based commercial mix. Plants were fertigated daily via a computer-controlled multi-fertilizer injector with three recirculated fertilizer treatments: (1) a stock (control) solution with complete macro- and micro-nutrients, electrical conductivity (EC) 2.2 dS m(-1); (2) wastewater from a mushroom farm; and (3) process wastewater from anaerobic digestion of municipal solid waste. The wastewaters used in treatments 2 and 3 were diluted with tap water, and the computer was programmed to amend, dispense, and recirculate nutrients based on the same target EC as in treatment 1. For comparison, there was a traditional controlled-release fertilizer treatment [Nutryon 17-5-12 (17N-2P-10K) plus micro-nutrients topdressed at a rate of 39 g/plant, nutrients not recirculated]. All three species responded similarly to the three recirculated fertilizer treatments. Growth with the recirculated treatments was similar across treatments and significantly higher than that obtained with the controlled-release fertilizer. Throughout the study, the ECs measured in the wastewater-derived nutrient solutions, and also in the container substrate, were similar or close to those of the control treatment, although there were small to large differences among individual major nutrients. There was no sign of nutrient deficiency or toxicity symptoms in the plants. Small to moderate excesses in the concentrations of SO(4), Na, and/or Cl were physiologically tolerable to the species.
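
    The EC-based amendment logic can be sketched as a simple mixing computation. The real injector's control algorithm is not described in detail in the abstract; the stock EC value, the assumption of negligible tap-water EC, and the linear-mixing approximation below are all illustrative:

```python
# Schematic EC-target mixing: dilute wastewater that is above the target EC
# with tap water, or top it up with concentrated stock solution if below.
# Assumes EC mixes roughly linearly and tap water EC is ~0 (simplifications).
TARGET_EC = 2.2          # dS/m, as in the control treatment

def amend(wastewater_ec, stock_ec=20.0):
    """Return (wastewater, stock, tap water) volume fractions of the mix."""
    if wastewater_ec >= TARGET_EC:
        # dilution with tap water: f * wastewater_ec = TARGET_EC
        f = TARGET_EC / wastewater_ec
        return f, 0.0, 1.0 - f
    # amendment with stock: wastewater_ec*(1 - s) + stock_ec*s = TARGET_EC
    s = (TARGET_EC - wastewater_ec) / (stock_ec - wastewater_ec)
    return 1.0 - s, s, 0.0

print(amend(5.5))   # wastewater above target EC: dilute with tap water
print(amend(1.4))   # wastewater below target EC: amend with stock nutrients
```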

  7. Relationship Between Successful Extracranial-Intracranial Bypass Surgeries and Ischemic White Matter Hyperintensities.

    PubMed

    Nagm, Alhusain; Horiuchi, Tetsuyoshi; Ito, Kiyoshi; Hongo, Kazuhiro

    2016-07-01

    Few studies have described regression of white matter hyperintensities (WMHs); however, no studies have described their recurrence or fluctuation. Thus, we aimed to study the course of WMHs on fluid-attenuated inversion recovery (FLAIR) magnetic resonance imaging (MRI) after extracranial-intracranial (EC-IC) bypass surgery and its correlation with clinical outcome. We enrolled perioperative FLAIR MRIs of 12 patients with WMHs who underwent EC-IC bypass surgeries because of ischemic vascular stenosis, with postoperative improvement of cerebral blood flow confirmed by (123)I-iodoamphetamine single-photon emission computed tomography. The correlation between WMHs and cerebral blood flow was confirmed by perioperative single-photon emission computed tomography and diffusion-weighted MRI. The WMHs were assessed visually with meticulous volumetric grading. Depending on postoperative changes among the different grades, the course of the WMHs was classified as improved, fluctuating, worsened, or unchanged. A statistical analysis was performed on the course of WMHs over time. Imaging analysis was done with FLAIR MRI in 12 patients. The course of WMHs over time was 41.7% improvement, 33.3% fluctuation, 16.7% unchanged, and 8.3% worsening of the deep WMHs. After unilateral bypass surgery, 80% of the improved WMHs occurred bilaterally. Among patients with improved clinical outcomes, 16.7% showed improvement and 33.3% showed fluctuation, whereas among patients with unchanged clinical outcomes, 25% showed improvement of their WMHs on follow-up FLAIR MRIs. This study might be considered a first step toward finding a relationship between successful EC-IC bypass surgeries and the course of ischemic WMHs. It could also open the door for further studies to draw more solid conclusions. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Evolution, learning, and cognition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Y.C.

    1988-01-01

    The book comprises more than fifteen articles in the areas of neural networks and connectionist systems, classifier systems, adaptive network systems, genetic algorithm, cellular automata, artificial immune systems, evolutionary genetics, cognitive science, optical computing, combinatorial optimization, and cybernetics.

  9. Application of high technology in highway transportation.

    DOT National Transportation Integrated Search

    1985-01-01

    Highway and traffic engineering practice is rapidly changing as communications technology and computer systems are being adopted to facilitate the work of the practitioners and expand their capabilities. This field has been an evolutionary one since ...

  10. Comparative molecular field analysis of fenoterol derivatives: A platform towards highly selective and effective beta(2)-adrenergic receptor agonists.

    PubMed

    Jozwiak, Krzysztof; Woo, Anthony Yiu-Ho; Tanga, Mary J; Toll, Lawrence; Jimenez, Lucita; Kozocas, Joseph A; Plazinska, Anita; Xiao, Rui-Ping; Wainer, Irving W

    2010-01-15

    Our aim was to use a previously developed CoMFA model to design a series of new structures with high selectivity and efficacy towards the beta(2)-adrenergic receptor. Of 21 computationally designed structures, 6 compounds were synthesized and characterized for beta(2)-AR binding affinities, subtype selectivities, and functional activities. The best compound is (R,R)-4-methoxy-1-naphthylfenoterol, with K(i) beta(2)-AR = 0.28 microM, K(i) beta(1)-AR / K(i) beta(2)-AR = 573, EC(50) cAMP = 3.9 nM, and EC(50) cardio = 16 nM. The CoMFA model appears to be an effective predictor of the cardiomyocyte contractility of the studied compounds, which are targeted for use in congestive heart failure. Copyright 2009 Elsevier Ltd. All rights reserved.

  11. Conventional and Nuclear Medicine Imaging in Ectopic Cushing's Syndrome: A Systematic Review.

    PubMed

    Isidori, Andrea M; Sbardella, Emilia; Zatelli, Maria Chiara; Boschetti, Mara; Vitale, Giovanni; Colao, Annamaria; Pivonello, Rosario

    2015-09-01

    Ectopic Cushing's Syndrome (ECS) can be a diagnostic challenge, with the hormonal source difficult to find. This study analyzes the accuracy of imaging studies in ECS localization. Systematic review of the medical literature for ECS case series providing individual patient data on at least one conventional imaging technique (computed tomography [CT]/magnetic resonance imaging) and one of the following: 111In-pentetreotide (OCT), 131I/123I-metaiodobenzylguanidine, 18F-fluoro-2-deoxyglucose positron emission tomography (FDG-PET), 18F-fluorodopa-PET (F-DOPA-PET), 68Ga-DOTATATE-PET/CT, or 68Ga-DOTATOC-PET/CT scan (68Gallium-SSTR-PET/CT). The analysis comprised 231 patients (females, 50.2%; age, 42.6 ± 17 y). Overall, 52.4% (121/231) had "overt" ECS, 18.6% had "occult" ECS, and 29% had "covert" ECS. Tumors were located in the lung (55.3%), mediastinum-thymus (7.9%), pancreas (8.5%), adrenal glands (6.4%), gastrointestinal tract (5.4%), thyroid (3.7%), and other sites (12.8%), and primary tumors were mostly bronchial neuroendocrine tumors (NETs) (54.8%), pancreatic NETs (8%), mediastinum-thymus NETs (6.9%), gastrointestinal NETs (5.3%), pheochromocytoma (6.4%), neuroblastoma (3.2%), and medullary thyroid carcinoma (3.2%). Tumors were localized by CT in 66.2% (137/207), magnetic resonance imaging in 51.5% (53/103), OCT in 48.9% (84/172), FDG-PET in 51.7% (46/89), F-DOPA-PET in 57.1% (12/21), 131/123I-metaiodobenzylguanidine in 30.8% (4/13), and 68Gallium-SSTR-PET/CT in 81.8% (18/22) of cases. Molecular imaging discovered 79.1% (53/67) of tumors unidentified by conventional radiology, with OCT the most commonly used, revealing the tumor in 64%, followed by FDG-PET in 59.4%. F-DOPA-PET was used in only seven covert cases (sensitivity, 85.7%). Notably, 68Gallium-SSTR-PET/CT had 100% sensitivity among covert cases. Nuclear medicine improves the sensitivity of conventional radiology when tumor site identification is problematic. OCT offers a good availability/reliability ratio, and FDG-PET was proven useful. 68Gallium-SSTR-PET/CT use was infrequent, despite offering the highest sensitivity.

  12. Numerical Control/Computer Aided Manufacturing (NC/CAM), A Descom Study

    DTIC Science & Technology

    1979-07-01

    CAM machines operate directly from computers, but most get instructions in the form of punched tape. The applications of NC/CAM are virtually...Although most NC/CAM equipment is metal working, its applications include electronics manufacturing, glass making, food processing, materiel handling...drafting, woodworking, plastics and inspection, just to name a few. Numerical control, like most technologies, is an advancing and evolutionary process

  13. Supermultiplicative Speedups of Probabilistic Model-Building Genetic Algorithms

    DTIC Science & Technology

    2009-02-01

    [OCR fragment from the report: the work is of interest to physicists as well as practitioners in evolutionary computation, and the project was later extended to the one-dimensional SK spin glass with power-law interactions; the remainder lists collaborators (including Yuji Sato, Hosei University, Japan; Shunsuke Saruwatari, Tokyo University, Japan; Jian-Hung Chen, Feng Chia University, Taiwan) and a citation: A. Tiwari, J. Knowles, E. Avineri, K. Dahal, and R. Roy (Eds.), Applications of Soft Computing: Recent Trends. Berlin: Springer (2006).]

  14. Building v/s Exploring Models: Comparing Learning of Evolutionary Processes through Agent-based Modeling

    NASA Astrophysics Data System (ADS)

    Wagh, Aditi

    Two strands of work motivate the three studies in this dissertation. Evolutionary change can be viewed as a computational complex system in which a small set of rules operating at the individual level result in different population level outcomes under different conditions. Extensive research has documented students' difficulties with learning about evolutionary change (Rosengren et al., 2012), particularly in terms of levels slippage (Wilensky & Resnick, 1999). Second, though building and using computational models is becoming increasingly common in K-12 science education, we know little about how these two modalities compare. This dissertation adopts agent-based modeling as a representational system to compare these modalities in the conceptual context of micro-evolutionary processes. Drawing on interviews, Study 1 examines middle-school students' productive ways of reasoning about micro-evolutionary processes to find that the specific framing of traits plays a key role in whether slippage explanations are cued. Study 2, which was conducted in 2 schools with about 150 students, forms the crux of the dissertation. It compares learning processes and outcomes when students build their own models or explore a pre-built model. Analysis of Camtasia videos of student pairs reveals that builders' and explorers' ways of accessing rules, and sense-making of observed trends are of a different character. Builders notice rules through available blocks-based primitives, often bypassing their enactment while explorers attend to rules primarily through the enactment. Moreover, builders' sense-making of observed trends is more rule-driven while explorers' is more enactment-driven. Pre and posttests reveal that builders manifest a greater facility with accessing rules, providing explanations manifesting targeted assembly. Explorers use rules to construct explanations manifesting non-targeted assembly. Interviews reveal varying degrees of shifts away from slippage in both modalities, with students who built models not incorporating slippage explanations in responses. Study 3 compares these modalities with a control using traditional activities. Pre and posttests reveal that the two modalities manifested greater facility with accessing and assembling rules than the control. The dissertation offers implications for the design of learning environments for evolutionary change, design of the two modalities based on their strengths and weaknesses, and teacher training for the same.
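
    A minimal agent-based model in this spirit is sketched below. It is not the study's learning environment; the trait names, predation rate, and mutation rate are invented. It shows how a small set of individual-level rules (survive predation, reproduce with occasional mutation) produces a population-level shift in trait frequency:

```python
# Toy agent-based micro-evolution model: individual-level rules produce a
# population-level outcome (invented traits and rates, for illustration only).
import random

random.seed(3)
population = [random.choice(["light", "dark"]) for _ in range(200)]

def step(pop, predation_on="light", predation_rate=0.3, mutation_rate=0.01):
    # Rule 1: predators remove a fraction of the conspicuous variant.
    survivors = [a for a in pop
                 if a != predation_on or random.random() > predation_rate]
    # Rule 2: survivors reproduce to restore the population size; offspring
    # occasionally mutate to the other variant.
    offspring = []
    while len(survivors) + len(offspring) < 200:
        child = random.choice(survivors)
        if random.random() < mutation_rate:
            child = "dark" if child == "light" else "light"
        offspring.append(child)
    return survivors + offspring

for generation in range(30):
    population = step(population)
print(population.count("dark") / len(population))  # dark variant nears fixation
```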

  15. More efficient evolutionary strategies for model calibration with watershed model for demonstration

    NASA Astrophysics Data System (ADS)

    Baggett, J. S.; Skahill, B. E.

    2008-12-01

    Evolutionary strategies allow automatic calibration of more complex models than traditional gradient-based approaches, but they are more computationally intensive. We present several efficiency enhancements for evolution strategies, many of which are not new, but which, when combined, have been shown to dramatically decrease the number of model runs required for calibration on synthetic problems. To reduce the number of expensive model runs, we employ a surrogate objective function for an adaptively determined fraction of the population at each generation (Kern et al., 2006). We demonstrate improvements to the adaptive ranking strategy that increase its efficiency while sacrificing little reliability, and further reduce the number of model runs required in densely sampled parts of parameter space. Furthermore, we include a gradient individual in each generation that is usually not selected when the search is in a global phase or when the derivatives are poorly approximated, but that, when selected near a smooth local minimum, can dramatically increase convergence speed (Tahk et al., 2007). Finally, the selection of the gradient individual is used to adapt the size of the population near local minima. We show, by incorporating these enhancements into the Covariance Matrix Adaptation Evolution Strategy (CMA-ES; Hansen, 2006), that their synergistic effect is greater than the sum of their individual parts. This hybrid evolutionary strategy exploits smooth structure when it is present but degrades, at worst, to an ordinary evolutionary strategy if smoothness is not present. Calibration of 2D-3D synthetic models with the modified CMA-ES requires approximately 10%-25% of the model runs of the ordinary CMA-ES. A preliminary demonstration of this hybrid strategy will be shown for watershed model calibration problems. Hansen, N. (2006). The CMA Evolution Strategy: A Comparing Review. In J.A. Lozano, P. Larrañaga, I. Inza and E. Bengoetxea (Eds.), Towards a New Evolutionary Computation: Advances in Estimation of Distribution Algorithms, pp. 75-102, Springer. Kern, S., N. Hansen and P. Koumoutsakos (2006). Local Meta-Models for Optimization Using Evolution Strategies. In Ninth International Conference on Parallel Problem Solving from Nature PPSN IX, Proceedings, pp. 939-948, Berlin: Springer. Tahk, M., Woo, H., and Park, M. (2007). A hybrid optimization of evolutionary and gradient search. Engineering Optimization, (39), 87-104.
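
    The surrogate pre-screening idea can be sketched with a simple (mu, lambda) evolution strategy in which a nearest-neighbor surrogate, built from archived true evaluations, ranks offspring so that only the most promising third receive an expensive run. All parameters and the stand-in objective are invented; this is not the authors' modified CMA-ES:

```python
# Sketch of surrogate-assisted (mu, lambda) evolution: a cheap nearest-
# neighbor surrogate pre-ranks offspring; only the best fraction gets a
# "real" (expensive) evaluation (toy objective and parameters).
import random, math

def expensive_model(x):                      # stand-in for a watershed model
    return sum((xi - 1.0) ** 2 for xi in x)

archive = []                                 # (point, true cost) pairs

def surrogate(x):
    if not archive:
        return 0.0
    # predict with the cost of the nearest previously evaluated point
    return min(archive, key=lambda p: math.dist(p[0], x))[1]

random.seed(7)
mu, lam, sigma, dim = 5, 20, 0.5, 4
parents = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(mu)]
for gen in range(40):
    offspring = [[xi + random.gauss(0, sigma) for xi in random.choice(parents)]
                 for _ in range(lam)]
    offspring.sort(key=surrogate)            # cheap pre-ranking
    evaluated = [(x, expensive_model(x)) for x in offspring[: lam // 3]]
    archive.extend(evaluated)                # grow the surrogate's data
    evaluated.sort(key=lambda p: p[1])
    parents = [x for x, _ in evaluated[:mu]]
    sigma *= 0.97                            # simple step-size decay
print(min(c for _, c in archive))            # approaches 0 at x = (1, 1, 1, 1)
```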

  16. On Improving Efficiency of Differential Evolution for Aerodynamic Shape Optimization Applications

    NASA Technical Reports Server (NTRS)

    Madavan, Nateri K.

    2004-01-01

    Differential Evolution (DE) is a simple and robust evolutionary strategy that has proven effective in determining the global optimum for several difficult optimization problems. Although DE offers several advantages over traditional optimization approaches, its use in applications such as aerodynamic shape optimization, where the objective function evaluations are computationally expensive, is limited by the large number of function evaluations often required. In this paper various approaches for improving the efficiency of DE are reviewed and discussed. Several approaches that have proven effective for other evolutionary algorithms are modified and implemented in a DE-based aerodynamic shape optimization method that uses a Navier-Stokes solver for the objective function evaluations. Parallelization techniques on distributed computers are used to reduce turnaround times. Results are presented for standard test optimization problems and for the inverse design of a turbine airfoil. The efficiency improvements achieved by the different approaches are evaluated and compared.
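
    For reference, the baseline DE/rand/1/bin scheme that such efficiency work starts from is compact. The sphere function below is a stand-in for an expensive Navier-Stokes objective, and the control parameters are conventional textbook values rather than those used in the paper.

```python
# Classic DE/rand/1/bin; the sphere objective stands in for a CFD evaluation.
import numpy as np

def de_rand_1_bin(f, dim, bounds, np_=20, F=0.8, CR=0.9, gens=200, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(np_, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(gens):
        for i in range(np_):
            # pick three distinct individuals different from i
            r1, r2, r3 = rng.choice([j for j in range(np_) if j != i],
                                    size=3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])       # differential mutation
            cross = rng.random(dim) < CR                     # binomial crossover
            cross[rng.integers(dim)] = True                  # at least one gene
            trial = np.clip(np.where(cross, mutant, pop[i]), lo, hi)
            ft = f(trial)
            if ft <= fit[i]:                                 # greedy selection
                pop[i], fit[i] = trial, ft
    return pop[np.argmin(fit)], fit.min()

best_x, best_f = de_rand_1_bin(lambda x: np.sum(x**2), dim=10, bounds=(-5, 5))
print(best_f)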

  17. Evolutionary neurobiology and aesthetics.

    PubMed

    Smith, Christopher Upham

    2005-01-01

    If aesthetics is a human universal, it should have a neurobiological basis. Although use of all the senses is, as Aristotle noted, pleasurable, the distance senses are primarily involved in aesthetics. The aesthetic response emerges from the central processing of sensory input. This occurs very rapidly, beneath the level of consciousness, and only the feeling of pleasure emerges into the conscious mind. This is exemplified by landscape appreciation, where it is suggested that a computation built into the nervous system during Paleolithic hunter-gathering is at work. Another inbuilt computation leading to an aesthetic response is the part-whole relationship. This, it is argued, may be traced to the predator-prey "arms races" of evolutionary history. Mate selection also may be responsible for part of our response to landscape and visual art. Aesthetics lies at the core of human mentality, and its study is consequently of importance not only to philosophers and art critics but also to neurobiologists.

  18. A hybrid multi-objective evolutionary algorithm for wind-turbine blade optimization

    NASA Astrophysics Data System (ADS)

    Sessarego, M.; Dixon, K. R.; Rival, D. E.; Wood, D. H.

    2015-08-01

    A concurrent-hybrid non-dominated sorting genetic algorithm (hybrid NSGA-II) has been developed and applied to the simultaneous optimization of the annual energy production, flapwise root-bending moment and mass of the NREL 5 MW wind-turbine blade. By hybridizing a multi-objective evolutionary algorithm (MOEA) with gradient-based local search, it is believed that the optimal set of blade designs can be achieved at a lower computational cost than with a conventional MOEA. To measure the convergence of the hybrid against the non-hybrid NSGA-II on a wind-turbine blade optimization problem, a computationally intensive reference case was run using the non-hybrid NSGA-II. From this case, a three-dimensional surface representing the optimal trade-off between the annual energy production, flapwise root-bending moment and blade mass was obtained. The inclusion of local gradients in the blade optimization, however, showed no improvement in convergence for this three-objective problem.
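
    The Pareto machinery at the heart of NSGA-II is easy to sketch. The toy below shows the dominance test and a naive first-front extraction for this three-objective problem, with annual energy production negated so that all objectives are minimized; the numbers are invented for illustration.

```python
# Pareto dominance and naive non-dominated front extraction (NSGA-II core).
import numpy as np

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all <=, at least one <)."""
    return np.all(a <= b) and np.any(a < b)

def non_dominated_front(F):
    """Indices of the first (best) front; O(n^2) naive version."""
    n = len(F)
    return [i for i in range(n)
            if not any(dominates(F[j], F[i]) for j in range(n) if j != i)]

# toy population: columns = (-AEP, flapwise root moment, blade mass)
F = np.array([[-8.1, 12.0, 17.5],
              [-7.9, 10.5, 16.8],
              [-8.3, 13.1, 18.2],
              [-7.5, 11.0, 19.0]])
print(non_dominated_front(F))   # the mutually non-dominated designs
```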

  19. Decentralized Grid Scheduling with Evolutionary Fuzzy Systems

    NASA Astrophysics Data System (ADS)

    Fölling, Alexander; Grimme, Christian; Lepping, Joachim; Papaspyrou, Alexander

    In this paper, we address the problem of finding workload exchange policies for decentralized Computational Grids using an Evolutionary Fuzzy System. To this end, we establish a non-invasive collaboration model on the Grid layer which requires minimal information about the participating High Performance and High Throughput Computing (HPC/HTC) centers and which leaves the local resource managers completely untouched. In this environment of fully autonomous sites, independent users are assumed to submit their jobs to the Grid middleware layer of their local site, which in turn decides on the delegation and execution either on the local system or on remote sites in a situation-dependent, adaptive way. We find for different scenarios that the exchange policies show good performance characteristics not only with respect to traditional metrics such as average weighted response time and utilization, but also in terms of robustness and stability in changing environments.
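
    A minimal sketch of what a fuzzy exchange policy of this kind might look like: triangular membership functions over local and remote load, and a two-rule base deciding whether to delegate a job. Membership shapes, rules and thresholds are illustrative assumptions, not the evolved policies from the paper.

```python
# Toy fuzzy job-delegation policy: two rules over local and remote load.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def delegate_score(local_load, remote_load):
    local_high = tri(local_load, 0.4, 1.0, 1.6)
    local_low = tri(local_load, -0.6, 0.0, 0.6)
    remote_low = tri(remote_load, -0.6, 0.0, 0.6)
    # Rule 1: IF local load high AND remote load low THEN delegate
    r1 = min(local_high, remote_low)
    # Rule 2: IF local load low THEN keep the job (vote against delegation)
    r2 = local_low
    return r1 - r2          # > 0 means delegate to the remote site

print(delegate_score(0.9, 0.2))   # busy locally, idle remotely -> delegate
print(delegate_score(0.1, 0.8))   # idle locally -> keep the job
```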

  20. Derivative Trade Optimizing Model Utilizing GP Based on Behavioral Finance Theory

    NASA Astrophysics Data System (ADS)

    Matsumura, Koki; Kawamoto, Masaru

    This paper proposed a new technique that builds strategy trees for derivative (option) trading investment decisions based on behavioral finance theory and optimizes them using evolutionary computation, in order to achieve high profitability. The strategy tree uses technical analysis based on statistical, experience-based techniques for the investment decision. The trading model is represented by various technical indexes, and the strategy tree is optimized by genetic programming (GP), one of the evolutionary computation methods. Moreover, this paper proposed a method using prospect theory, from behavioral finance, to set a psychological bias for profit and loss, and attempted to select the appropriate strike price of the option for higher investment efficiency. As a result, the technique produced good results, demonstrating the effectiveness of the trading model with the optimized dealing strategy.

  1. Evolutionary psychology: new perspectives on cognition and motivation.

    PubMed

    Cosmides, Leda; Tooby, John

    2013-01-01

    Evolutionary psychology is the second wave of the cognitive revolution. The first wave focused on computational processes that generate knowledge about the world: perception, attention, categorization, reasoning, learning, and memory. The second wave views the brain as composed of evolved computational systems, engineered by natural selection to use information to adaptively regulate physiology and behavior. This shift in focus--from knowledge acquisition to the adaptive regulation of behavior--provides new ways of thinking about every topic in psychology. It suggests a mind populated by a large number of adaptive specializations, each equipped with content-rich representations, concepts, inference systems, and regulatory variables, which are functionally organized to solve the complex problems of survival and reproduction encountered by the ancestral hunter-gatherers from whom we are descended. We present recent empirical examples that illustrate how this approach has been used to discover new features of attention, categorization, reasoning, learning, emotion, and motivation.

  2. Recombinant transfer in the basic genome of E. coli

    DOE PAGES

    Dixit, Purushottam; Studier, F. William; Pang, Tin Yau; ...

    2015-07-07

    An approximation to the ~4-Mbp basic genome shared by 32 strains of E. coli representing six evolutionary groups has been derived and analyzed computationally. A multiple-alignment of the 32 complete genome sequences was filtered to remove mobile elements and identify the most reliable ~90% of the aligned length of each of the resulting 496 basic-genome pairs. Patterns of single bp mutations (SNPs) in aligned pairs distinguish clonally inherited regions from regions where either genome has acquired DNA fragments from diverged genomes by homologous recombination since their last common ancestor. Such recombinant transfer is pervasive across the basic genome, mostly between genomes in the same evolutionary group, and generates many unique mosaic patterns. The six least-diverged genome-pairs have one or two recombinant transfers of length ~40–115 kbp (and few if any other transfers), each containing one or more gene clusters known to confer strong selective advantage in some environments. Moderately diverged genome pairs (0.4–1% SNPs) show mosaic patterns of interspersed clonal and recombinant regions of varying lengths throughout the basic genome, whereas more highly diverged pairs within an evolutionary group or pairs between evolutionary groups having >1.3% SNPs have few clonal matches longer than a few kbp. Many recombinant transfers appear to incorporate fragments of the entering DNA produced by restriction systems of the recipient cell. A simple computational model can closely fit the data. As a result, most recombinant transfers seem likely to be due to generalized transduction by co-evolving populations of phages, which could efficiently distribute variability throughout bacterial genomes.
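
    The window-based intuition behind distinguishing clonal from recombinant regions can be sketched as follows; the window size, SNP-density threshold and synthetic data are illustrative assumptions, not the paper's actual model.

```python
# Toy sliding-window classifier: low SNP density -> clonally inherited,
# high density -> candidate recombinant transfer from a diverged genome.
import numpy as np

def classify_windows(snp_positions, genome_len, window=5000, thresh=0.004):
    """Return per-window labels from SNP coordinates of one genome pair."""
    edges = np.arange(0, genome_len + window, window)
    counts, _ = np.histogram(snp_positions, bins=edges)
    density = counts / window
    return ["clonal" if d < thresh else "recombinant" for d in density]

rng = np.random.default_rng(1)
clonal = rng.uniform(0, 60_000, size=30)          # sparse background SNPs
transfer = rng.uniform(60_000, 80_000, size=400)  # dense diverged segment
labels = classify_windows(np.concatenate([clonal, transfer]), 100_000)
print(labels)
```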

  3. Recombinant transfer in the basic genome of E. coli

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dixit, Purushottam; Studier, F. William; Pang, Tin Yau

    An approximation to the ~4-Mbp basic genome shared by 32 strains of E. coli representing six evolutionary groups has been derived and analyzed computationally. A multiple-alignment of the 32 complete genome sequences was filtered to remove mobile elements and identify the most reliable ~90% of the aligned length of each of the resulting 496 basic-genome pairs. Patterns of single bp mutations (SNPs) in aligned pairs distinguish clonally inherited regions from regions where either genome has acquired DNA fragments from diverged genomes by homologous recombination since their last common ancestor. Such recombinant transfer is pervasive across the basic genome, mostly between genomes in the same evolutionary group, and generates many unique mosaic patterns. The six least-diverged genome-pairs have one or two recombinant transfers of length ~40–115 kbp (and few if any other transfers), each containing one or more gene clusters known to confer strong selective advantage in some environments. Moderately diverged genome pairs (0.4–1% SNPs) show mosaic patterns of interspersed clonal and recombinant regions of varying lengths throughout the basic genome, whereas more highly diverged pairs within an evolutionary group or pairs between evolutionary groups having >1.3% SNPs have few clonal matches longer than a few kbp. Many recombinant transfers appear to incorporate fragments of the entering DNA produced by restriction systems of the recipient cell. A simple computational model can closely fit the data. As a result, most recombinant transfers seem likely to be due to generalized transduction by co-evolving populations of phages, which could efficiently distribute variability throughout bacterial genomes.

  4. Evolutionary dynamics on graphs: Efficient method for weak selection

    NASA Astrophysics Data System (ADS)

    Fu, Feng; Wang, Long; Nowak, Martin A.; Hauert, Christoph

    2009-04-01

    Investigating the evolutionary dynamics of game theoretical interactions in populations where individuals are arranged on a graph can be challenging in terms of computation time. Here, we propose an efficient method to study any type of game on arbitrary graph structures for weak selection. In this limit, evolutionary game dynamics represents a first-order correction to neutral evolution. Spatial correlations can be empirically determined under neutral evolution and provide the basis for formulating the game dynamics as a discrete Markov process by incorporating a detailed description of the microscopic dynamics based on the neutral correlations. This framework is then applied to one of the most intriguing questions in evolutionary biology: the evolution of cooperation. We demonstrate that the degree heterogeneity of a graph impedes cooperation and that the success of tit for tat depends not only on the number of rounds but also on the degree of the graph. Moreover, considering the mutation-selection equilibrium shows that the symmetry of the stationary distribution of states under weak selection is skewed in favor of defectors for larger selection strengths. In particular, degree heterogeneity—a prominent feature of scale-free networks—generally results in a more pronounced increase in the critical benefit-to-cost ratio required for evolution to favor cooperation as compared to regular graphs. This conclusion is corroborated by an analysis of the effects of population structures on the fixation probabilities of strategies in general 2×2 games for different types of graphs. Computer simulations confirm the predictive power of our method and illustrate the improved accuracy as compared to previous studies.
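
    For contrast with the efficient analytical method, a brute-force Monte Carlo estimate of a mutant's fixation probability under death-birth updating on a graph looks like this; constant-fitness mutants on a small cycle are used for brevity rather than a full game-theoretic payoff scheme.

```python
# Monte Carlo fixation probability under death-birth updating on a graph.
import random

def fixation_probability(adj, r=1.05, trials=2000, seed=0):
    rng = random.Random(seed)
    n = len(adj)
    fixed = 0
    for _ in range(trials):
        state = [0] * n                      # 0 = resident (fitness 1)
        state[rng.randrange(n)] = 1          # 1 = mutant (fitness r)
        while 0 < sum(state) < n:
            dead = rng.randrange(n)          # uniform death event
            nbrs = adj[dead]
            w = [r if state[j] else 1.0 for j in nbrs]
            parent = rng.choices(nbrs, weights=w)[0]   # fitness-biased birth
            state[dead] = state[parent]
        fixed += state[0]                    # homogeneous: 1 iff mutant fixed
    return fixed / trials

cycle = {i: [(i - 1) % 20, (i + 1) % 20] for i in range(20)}
print(fixation_probability(cycle))
```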

  5. Discovery of an O-mannosylation pathway selectively serving cadherins and protocadherins.

    PubMed

    Larsen, Ida Signe Bohse; Narimatsu, Yoshiki; Joshi, Hiren Jitendra; Siukstaite, Lina; Harrison, Oliver J; Brasch, Julia; Goodman, Kerry M; Hansen, Lars; Shapiro, Lawrence; Honig, Barry; Vakhrushev, Sergey Y; Clausen, Henrik; Halim, Adnan

    2017-10-17

    The cadherin (cdh) superfamily of adhesion molecules carry O-linked mannose (O-Man) glycans at highly conserved sites localized to specific β-strands of their extracellular cdh (EC) domains. These O-Man glycans do not appear to be elongated like O-Man glycans found on α-dystroglycan (α-DG), and we recently demonstrated that initiation of cdh/protocadherin (pcdh) O-Man glycosylation is not dependent on the evolutionary conserved POMT1/POMT2 enzymes that initiate O-Man glycosylation on α-DG. Here, we used a CRISPR/Cas9 genetic dissection strategy combined with sensitive and quantitative O-Man glycoproteomics to identify a homologous family of four putative protein O-mannosyltransferases encoded by the TMTC1-4 genes, which were found to be imperative for cdh and pcdh O-Man glycosylation. KO of all four TMTC genes in HEK293 cells resulted in specific loss of cdh and pcdh O-Man glycosylation, whereas combined KO of TMTC1 and TMTC3 resulted in selective loss of O-Man glycans on specific β-strands of EC domains, suggesting that each isoenzyme serves a different function. In addition, O-Man glycosylation of IPT/TIG domains of plexins and hepatocyte growth factor receptor was not affected in TMTC KO cells, suggesting the existence of yet another O-Man glycosylation machinery. Our study demonstrates that regulation of O-mannosylation in higher eukaryotes is more complex than envisioned, and the discovery of the functions of TMTCs provide insight into cobblestone lissencephaly caused by deficiency in TMTC3.

  6. Computational complexity of ecological and evolutionary spatial dynamics

    PubMed Central

    Ibsen-Jensen, Rasmus; Chatterjee, Krishnendu; Nowak, Martin A.

    2015-01-01

    There are deep, yet largely unexplored, connections between computer science and biology. Both disciplines examine how information proliferates in time and space. Central results in computer science describe the complexity of algorithms that solve certain classes of problems. An algorithm is deemed efficient if it can solve a problem in polynomial time, which means the running time of the algorithm is a polynomial function of the length of the input. There are classes of harder problems for which the fastest possible algorithm requires exponential time. Another criterion is the space requirement of the algorithm. There is a crucial distinction between algorithms that can find a solution, verify a solution, or list several distinct solutions in given time and space. The complexity hierarchy that is generated in this way is the foundation of theoretical computer science. Precise complexity results can be notoriously difficult. The famous question whether polynomial time equals nondeterministic polynomial time (i.e., P = NP) is one of the hardest open problems in computer science and all of mathematics. Here, we consider simple processes of ecological and evolutionary spatial dynamics. The basic question is: What is the probability that a new invader (or a new mutant) will take over a resident population? We derive precise complexity results for a variety of scenarios. We therefore show that some fundamental questions in this area cannot be answered by simple equations (assuming that P is not equal to NP). PMID:26644569
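
    The well-mixed baseline for the invasion question is the classical Moran process, whose fixation probability has a closed form; the snippet below computes it. Spatial structure is precisely what makes the general problem computationally hard.

```python
# Classical Moran fixation probability of a single mutant of relative
# fitness r in a well-mixed resident population of size N.
def moran_fixation(r, N):
    if r == 1.0:
        return 1.0 / N            # neutral mutant: fixation by drift alone
    return (1 - 1 / r) / (1 - r ** (-N))

print(moran_fixation(1.1, 100))   # mildly beneficial mutant, ~0.09
```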

  7. An Approximation for Computing Reduction in Bandwidth Requirements Using Intelligent Multiplexers

    DTIC Science & Technology

    1993-03-01

    [OCR-garbled DTIC front matter; only fragments are recoverable: Naval Postgraduate School, Monterey, CA 93943; a table-of-contents entry "IV. MODEL DEVELOPMENT / A. THE ERLANG MODEL"; and an abstract fragment noting that the silent periods of a voice or data transmission go unused and that, for combined voice and data traffic, TDM averages only 10-25% efficiency.]

  8. A computational model of the ionic currents, Ca2+ dynamics and action potentials underlying contraction of isolated uterine smooth muscle.

    PubMed

    Tong, Wing-Chiu; Choi, Cecilia Y; Kharche, Sanjay; Holden, Arun V; Zhang, Henggui; Taggart, Michael J

    2011-04-29

    Uterine contractions during labor are discretely regulated by rhythmic action potentials (APs) of varying duration and form that serve to determine calcium-dependent force production. We have employed a computational biology approach to develop a fuller understanding of the complexity of excitation-contraction (E-C) coupling of uterine smooth muscle cells (USMCs). Our overall aim is to establish a mathematical platform of sufficient biophysical detail to quantitatively describe known uterine E-C coupling parameters and thereby inform future empirical investigations of physiological and pathophysiological mechanisms governing normal and dysfunctional labors. From published and unpublished data we construct mathematical models for fourteen ionic currents of USMCs: Ca2+ currents (L- and T-type), a Na+ current, a hyperpolarization-activated current, three voltage-gated K+ currents, two Ca2+-activated K+ currents, a Ca2+-activated Cl- current, a non-specific cation current, the Na+-Ca2+ exchanger, the Na+-K+ pump and a background current. The magnitude and kinetics of each current system in a spindle-shaped single cell with a specified surface area:volume ratio are described by differential equations, in terms of maximal conductances, electrochemical gradients, voltage-dependent activation/inactivation gating variables and temporal changes in intracellular Ca2+ computed from known Ca2+ fluxes. These quantifications are validated by the reconstruction of the individual experimental ionic currents obtained under voltage clamp. Phasic contraction is modeled in relation to the time constant of changing [Ca2+]i. This integrated model is validated by its reconstruction of the different USMC AP configurations (spikes, plateaus and bursts of spikes), of the change from bursting to plateau-type APs produced by estradiol, and of simultaneous experimental recordings of spontaneous APs, [Ca2+]i and phasic force. In summary, our advanced mathematical model provides a powerful tool to investigate the physiological ionic mechanisms underlying the genesis of uterine electrical E-C coupling in labor and parturition. This will further the evolution of descriptive and predictive quantitative models of myometrial electrogenesis at the whole-cell and tissue levels.
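
    The modelling approach, membrane potential driven by a sum of gated ionic currents described by differential equations, can be illustrated with a heavily reduced sketch. Only two Hodgkin-Huxley-style currents and a leak are included, and every parameter is invented; the published USMC model has fourteen experimentally constrained current systems.

```python
# Reduced HH-style membrane model: one Ca2+-like and one K+-like gated
# current plus a leak, with made-up parameters (illustration only).
import numpy as np
from scipy.integrate import solve_ivp

C_m, g_Ca, g_K, g_L = 1.0, 1.2, 2.0, 0.1      # uF/cm^2, mS/cm^2 (illustrative)
E_Ca, E_K, E_L = 120.0, -85.0, -55.0          # mV

def steady(v, v_half, k):                     # Boltzmann steady-state gate
    return 1.0 / (1.0 + np.exp(-(v - v_half) / k))

def rhs(t, y):
    v, m, n = y
    I_Ca = g_Ca * m * (v - E_Ca)              # L-type-like Ca2+ current
    I_K = g_K * n * (v - E_K)                 # delayed-rectifier-like K+ current
    I_L = g_L * (v - E_L)
    I_stim = 2.0 if 50 < t < 60 else 0.0      # brief depolarizing pulse
    dv = (-(I_Ca + I_K + I_L) + I_stim) / C_m
    dm = (steady(v, -25.0, 5.0) - m) / 2.0    # tau_m = 2 ms
    dn = (steady(v, -20.0, 8.0) - n) / 15.0   # tau_n = 15 ms
    return [dv, dm, dn]

sol = solve_ivp(rhs, (0, 300), [-55.0, 0.0, 0.0], max_step=0.5)
print(sol.y[0].max())   # peak membrane potential of the evoked spike, mV
```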

  9. The Proposal of an Evolutionary Strategy Generating the Data Structures Based on a Horizontal Tree for the Tests

    NASA Astrophysics Data System (ADS)

    Żukowicz, Marek; Markiewicz, Michał

    2016-09-01

    The aim of the article is to present a mathematical definition of the object model known in computer science as TreeList, and to show how this model can be used to design an evolutionary algorithm whose purpose is to generate structures based on this object. The first chapter introduces the reader to the problem of presenting data using the TreeList object. The second chapter describes the problem of testing data structures based on TreeList. The third presents a mathematical model of the TreeList object and the parameters used to determine the utility of structures created through this model, together with the evolutionary strategy that generates these structures for testing purposes. The last chapter provides a brief summary and plans for future research related to the algorithm presented in the article.

  10. Parametric Sensitivity Analysis of Oscillatory Delay Systems with an Application to Gene Regulation.

    PubMed

    Ingalls, Brian; Mincheva, Maya; Roussel, Marc R

    2017-07-01

    A parametric sensitivity analysis for periodic solutions of delay-differential equations is developed. Because phase shifts cause the sensitivity coefficients of a periodic orbit to diverge, we focus on sensitivities of the extrema, from which amplitude sensitivities are computed, and of the period. Delay-differential equations are often used to model gene expression networks. In these models, the parametric sensitivities of a particular genotype define the local geometry of the evolutionary landscape. Thus, sensitivities can be used to investigate directions of gradual evolutionary change. An oscillatory protein synthesis model whose properties are modulated by RNA interference is used as an example. This model consists of a set of coupled delay-differential equations involving three delays. Sensitivity analyses are carried out at several operating points. Comments on the evolutionary implications of the results are offered.
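
    A finite-difference counterpart of the analysis gives the flavour: integrate a delayed negative-feedback oscillator, extract amplitude and period, and difference them with respect to a parameter. The model, integrator and parameter values below are illustrative stand-ins for the paper's gene-expression system and its analytical sensitivities.

```python
# Finite-difference sensitivities of a delay oscillator's amplitude and
# period; fixed-step Euler with a history buffer handles the delay.
import numpy as np

def simulate(beta, tau=2.0, n=4.0, gamma=1.0, dt=0.01, T=200.0):
    lag = int(round(tau / dt))
    x = np.empty(int(T / dt))
    x[:lag] = 0.5                                  # constant history
    for i in range(lag, len(x)):
        xd = x[i - lag]                            # delayed state x(t - tau)
        x[i] = x[i-1] + dt * (beta / (1 + xd**n) - gamma * x[i-1])
    return x[len(x)//2:]                           # discard the transient

def amplitude_and_period(x, dt=0.01):
    peaks = [i for i in range(1, len(x)-1) if x[i-1] < x[i] > x[i+1]]
    return x.max() - x.min(), np.mean(np.diff(peaks)) * dt

beta, h = 4.0, 1e-3
a_p, T_p = amplitude_and_period(simulate(beta + h))
a_m, T_m = amplitude_and_period(simulate(beta - h))
print("d(amplitude)/d(beta) ~", (a_p - a_m) / (2*h))
print("d(period)/d(beta)    ~", (T_p - T_m) / (2*h))
```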

  11. Detecting and Analyzing Genetic Recombination Using RDP4.

    PubMed

    Martin, Darren P; Murrell, Ben; Khoosal, Arjun; Muhire, Brejnev

    2017-01-01

    Recombination between nucleotide sequences is a major process influencing the evolution of most species on Earth. The evolutionary value of recombination has been widely debated and so too has its influence on evolutionary analysis methods that assume nucleotide sequences replicate without recombining. When nucleic acids recombine, the evolution of the daughter or recombinant molecule cannot be accurately described by a single phylogeny. This simple fact can seriously undermine the accuracy of any phylogenetics-based analytical approach which assumes that the evolutionary history of a set of recombining sequences can be adequately described by a single phylogenetic tree. There are presently a large number of available methods and associated computer programs for analyzing and characterizing recombination in various classes of nucleotide sequence datasets. Here we examine the use of some of these methods to derive and test recombination hypotheses using multiple sequence alignments.

  12. Upon accounting for the impact of isoenzyme loss, gene deletion costs anticorrelate with their evolutionary rates

    DOE PAGES

    Jacobs, Christopher; Lambourne, Luke; Xia, Yu; ...

    2017-01-20

    Here, system-level metabolic network models enable the computation of growth and metabolic phenotypes from an organism's genome. In particular, flux balance approaches have been used to estimate the contribution of individual metabolic genes to organismal fitness, offering the opportunity to test whether such contributions carry information about the evolutionary pressure on the corresponding genes. Previous failure to identify the expected negative correlation between such computed gene-loss cost and sequence-derived evolutionary rates in Saccharomyces cerevisiae has been ascribed to a real biological gap between a gene's fitness contribution to an organism "here and now" and the same gene's historical importance as evidenced by its accumulated mutations over millions of years of evolution. Here we show that this negative correlation does exist, and can be exposed by revisiting a broadly employed assumption of flux balance models. In particular, we introduce a new metric that we call "function-loss cost", which estimates the cost of a gene loss event as the total potential functional impairment caused by that loss. This new metric displays significant negative correlation with evolutionary rate, across several thousand minimal environments. We demonstrate that the improvement gained using function-loss cost over gene-loss cost is explained by replacing the base assumption that isoenzymes provide unlimited capacity for backup with the assumption that isoenzymes are completely non-redundant. We further show that this change of the assumption regarding isoenzymes increases the recall of epistatic interactions predicted by the flux balance model at the cost of a reduction in the precision of the predictions. In addition to suggesting that the gene-to-reaction mapping in genome-scale flux balance models should be used with caution, our analysis provides new evidence that evolutionary gene importance captures much more than strict essentiality.

  13. Protein 3D Structure Computed from Evolutionary Sequence Variation

    PubMed Central

    Sheridan, Robert; Hopf, Thomas A.; Pagnani, Andrea; Zecchina, Riccardo; Sander, Chris

    2011-01-01

    The evolutionary trajectory of a protein through sequence space is constrained by its function. Collections of sequence homologs record the outcomes of millions of evolutionary experiments in which the protein evolves according to these constraints. Deciphering the evolutionary record held in these sequences and exploiting it for predictive and engineering purposes presents a formidable challenge. The potential benefit of solving this challenge is amplified by the advent of inexpensive high-throughput genomic sequencing. In this paper we ask whether we can infer evolutionary constraints from a set of sequence homologs of a protein. The challenge is to distinguish true co-evolution couplings from the noisy set of observed correlations. We address this challenge using a maximum entropy model of the protein sequence, constrained by the statistics of the multiple sequence alignment, to infer residue pair couplings. Surprisingly, we find that the strength of these inferred couplings is an excellent predictor of residue-residue proximity in folded structures. Indeed, the top-scoring residue couplings are sufficiently accurate and well-distributed to define the 3D protein fold with remarkable accuracy. We quantify this observation by computing, from sequence alone, all-atom 3D structures of fifteen test proteins from different fold classes, ranging in size from 50 to 260 residues, including a G-protein coupled receptor. These blinded inferences are de novo, i.e., they do not use homology modeling or sequence-similar fragments from known structures. The co-evolution signals provide sufficient information to determine accurate 3D protein structure to 2.7–4.8 Å Cα-RMSD error relative to the observed structure, over at least two-thirds of the protein (method called EVfold, details at http://EVfold.org). This discovery provides insight into essential interactions constraining protein evolution and will facilitate a comprehensive survey of the universe of protein structures, new strategies in protein and drug design, and the identification of functional genetic variants in normal and disease genomes. PMID:22163331
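
    The shape of the computation, an MSA in and ranked residue-pair coupling scores out, can be shown with a much weaker local statistic: plain mutual information between alignment columns, rather than the paper's global maximum-entropy couplings. The toy alignment is invented for illustration.

```python
# Mutual information between MSA columns as a simple (local) proxy for the
# global maximum-entropy couplings used by EVfold; illustration only.
import numpy as np
from collections import Counter

def column_mi(msa, i, j):
    """Mutual information between columns i and j of a list of sequences."""
    pairs = Counter((s[i], s[j]) for s in msa)
    pi = Counter(s[i] for s in msa)
    pj = Counter(s[j] for s in msa)
    n = len(msa)
    mi = 0.0
    for (a, b), c in pairs.items():
        pab = c / n
        mi += pab * np.log(pab / ((pi[a] / n) * (pj[b] / n)))
    return mi

msa = ["ARND", "ARNE", "AKND", "GKDE", "GKDE", "ARNE"]   # toy alignment
L = len(msa[0])
scores = sorted(((column_mi(msa, i, j), i, j)
                 for i in range(L) for j in range(i + 1, L)), reverse=True)
for mi, i, j in scores:
    print(f"columns {i}-{j}: MI = {mi:.3f}")   # top pairs ~ predicted contacts
```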

  14. Upon accounting for the impact of isoenzyme loss, gene deletion costs anticorrelate with their evolutionary rates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobs, Christopher; Lambourne, Luke; Xia, Yu

    Here, system-level metabolic network models enable the computation of growth and metabolic phenotypes from an organism's genome. In particular, flux balance approaches have been used to estimate the contribution of individual metabolic genes to organismal fitness, offering the opportunity to test whether such contributions carry information about the evolutionary pressure on the corresponding genes. Previous failure to identify the expected negative correlation between such computed gene-loss cost and sequence-derived evolutionary rates in Saccharomyces cerevisiae has been ascribed to a real biological gap between a gene's fitness contribution to an organism "here and now" and the same gene's historical importance as evidenced by its accumulated mutations over millions of years of evolution. Here we show that this negative correlation does exist, and can be exposed by revisiting a broadly employed assumption of flux balance models. In particular, we introduce a new metric that we call "function-loss cost", which estimates the cost of a gene loss event as the total potential functional impairment caused by that loss. This new metric displays significant negative correlation with evolutionary rate, across several thousand minimal environments. We demonstrate that the improvement gained using function-loss cost over gene-loss cost is explained by replacing the base assumption that isoenzymes provide unlimited capacity for backup with the assumption that isoenzymes are completely non-redundant. We further show that this change of the assumption regarding isoenzymes increases the recall of epistatic interactions predicted by the flux balance model at the cost of a reduction in the precision of the predictions. In addition to suggesting that the gene-to-reaction mapping in genome-scale flux balance models should be used with caution, our analysis provides new evidence that evolutionary gene importance captures much more than strict essentiality.

  15. Faster Evolution of More Multifunctional Logic Circuits

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian; Zebulum, Ricardo

    2005-01-01

    A modification in a method of automated evolutionary synthesis of voltage-controlled multifunctional logic circuits makes it possible to synthesize more circuits in less time. Prior to the modification, the computations for synthesizing a four-function logic circuit by this method took about 10 hours. Using the method as modified, it is possible to synthesize a six-function circuit in less than half an hour. The concepts of automated evolutionary synthesis and voltage-controlled multifunctional logic circuits were described in a number of prior NASA Tech Briefs articles. To recapitulate: A circuit is designed to perform one of several different logic functions, depending on the value of an applied control voltage. The circuit design is synthesized following an automated evolutionary approach that is so named because it is modeled partly after the repetitive trial-and-error process of biological evolution. In this process, random populations of integer strings that encode electronic circuits play a role analogous to that of chromosomes. An evolved circuit is tested by computational simulation (prior to testing in real hardware to verify a final design). Then, in a fitness-evaluation step, responses of the circuit are compared with specifications of target responses and circuits are ranked according to how close they come to satisfying specifications. The results of the evaluation provide guidance for refining designs through further iteration.
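
    The chromosome-as-integer-string idea can be sketched as follows: each gene appends one gate to a feed-forward netlist, and fitness is the fraction of truth-table rows matched for every control-selected target function. The encoding, operators and two-function target below are illustrative assumptions, not NASA's circuit representation.

```python
# GA evolving a multifunctional logic circuit from integer-string genomes.
import random

OPS = [lambda a, b: a & b, lambda a, b: a | b,
       lambda a, b: a ^ b, lambda a, b: 1 - (a & b)]   # AND, OR, XOR, NAND

def evaluate(genome, inputs):
    vals = list(inputs)                        # node values: inputs, then gates
    for op, a, b in genome:
        vals.append(OPS[op](vals[a], vals[b]))
    return vals[-1]                            # last gate drives the output

def fitness(genome, targets):
    ok = total = 0
    for c, table in enumerate(targets):        # c plays the control-voltage role
        for (x, y), want in table.items():
            ok += evaluate(genome, (c, x, y)) == want
            total += 1
    return ok / total

def random_genome(n_gates=8, n_inputs=3, rng=random):
    return [(rng.randrange(len(OPS)), rng.randrange(n_inputs + i),
             rng.randrange(n_inputs + i)) for i in range(n_gates)]

def mutate(genome, n_inputs=3, rate=0.15, rng=random):
    out = []
    for i, gene in enumerate(genome):
        if rng.random() < rate:                # redraw the whole gate gene
            gene = (rng.randrange(len(OPS)), rng.randrange(n_inputs + i),
                    rng.randrange(n_inputs + i))
        out.append(gene)
    return out

XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
targets = [XOR, AND]                           # control input selects the function

rng = random.Random(3)
pop = [random_genome(rng=rng) for _ in range(200)]
for _ in range(200):
    pop.sort(key=lambda g: fitness(g, targets), reverse=True)
    pop = pop[:40] + [mutate(rng.choice(pop[:40]), rng=rng) for _ in range(160)]
print(fitness(pop[0], targets))                # 1.0 once a perfect circuit evolves
```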

  16. Cost versus life cycle assessment-based environmental impact optimization of drinking water production plants.

    PubMed

    Capitanescu, F; Rege, S; Marvuglia, A; Benetto, E; Ahmadi, A; Gutiérrez, T Navarrete; Tiruta-Barna, L

    2016-07-15

    Empowering decision makers with cost-effective solutions for reducing the environmental burden of industrial processes, at both the design and operation stages, is nowadays a major worldwide concern. The paper addresses this issue for the sector of drinking water production plants (DWPPs), seeking optimal solutions that trade off operating cost and life cycle assessment (LCA)-based environmental impact while satisfying outlet water quality criteria. This leads to a challenging bi-objective constrained optimization problem, which relies on a computationally expensive, intricate process-modelling simulator of the DWPP and has to be solved within a limited computational budget. Since mathematical programming methods are unusable in this case, the paper examines how six off-the-shelf state-of-the-art global meta-heuristic optimization algorithms suitable for such simulation-based optimization cope with these challenges, namely the Strength Pareto Evolutionary Algorithm (SPEA2), Non-dominated Sorting Genetic Algorithm (NSGA-II), Indicator-Based Evolutionary Algorithm (IBEA), Multi-Objective Evolutionary Algorithm based on Decomposition (MOEA/D), Differential Evolution (DE), and Particle Swarm Optimization (PSO). The optimization results reveal that good reductions in both the operating cost and the environmental impact of the DWPP can be obtained. Furthermore, NSGA-II outperforms the other competing algorithms, while MOEA/D and DE perform unexpectedly poorly. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Computational methods using weighed-extreme learning machine to predict protein self-interactions with protein evolutionary information.

    PubMed

    An, Ji-Yong; Zhang, Lei; Zhou, Yong; Zhao, Yu-Jun; Wang, Da-Fu

    2017-08-18

    Self-interacting proteins (SIPs) are important for their biological activity owing to the inherent interactions amongst their secondary structures or domains. However, due to the limitations of experimental self-interaction detection, one major challenge in the study of SIP prediction is how to exploit computational approaches for SIP detection based on the evolutionary information contained in protein sequences. In this work, we present a novel computational approach named WELM-LAG, which combines the Weighted Extreme Learning Machine (WELM) classifier with Local Average Group (LAG) features to predict SIPs from protein sequences. The major improvement of our method lies in an effective feature extraction method used to represent candidate self-interacting proteins by exploring the evolutionary information embedded in the PSI-BLAST-constructed position-specific scoring matrix (PSSM), together with a reliable and robust WELM classifier to carry out classification. In addition, the Principal Component Analysis (PCA) approach is used to reduce the impact of noise. The WELM-LAG method gave very high average accuracies of 92.94% and 96.74% on yeast and human datasets, respectively. Meanwhile, we compared it with the state-of-the-art support vector machine (SVM) classifier and other existing methods on the human and yeast datasets. Comparative results indicate that our approach is very promising and may provide a cost-effective alternative for predicting SIPs. In addition, we developed a freely available web server called WELM-LAG-SIPs to predict SIPs. The web server is available at http://219.219.62.123:8888/WELMLAG/ .
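
    The classifier core, a fixed random hidden layer with a class-weighted ridge solve for the output weights, can be sketched briefly. The PSSM/LAG feature extraction is not reproduced; `X` below is any fixed-length feature matrix, and the `WeightedELM` class, its parameters and the random data are all illustrative.

```python
# Core of a weighted extreme learning machine: random features plus a
# class-weighted regularised least-squares output layer. Sketch only.
import numpy as np

class WeightedELM:
    def __init__(self, n_hidden=200, C=1.0, seed=0):
        self.n_hidden, self.C = n_hidden, C
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)          # fixed random feature map

    def fit(self, X, y):                             # y in {0, 1}
        d = X.shape[1]
        self.W = self.rng.normal(size=(d, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        # weight each sample inversely to its class frequency (handles the
        # heavy class imbalance typical of SIP datasets)
        w = np.where(y == 1, 1.0 / max((y == 1).sum(), 1),
                             1.0 / max((y == 0).sum(), 1))
        Hw = H * w[:, None]
        A = H.T @ Hw + np.eye(self.n_hidden) / self.C
        self.beta = np.linalg.solve(A, Hw.T @ (2 * y - 1))  # targets in {-1,+1}
        return self

    def predict(self, X):
        return (self._hidden(X) @ self.beta > 0).astype(int)

# toy usage with random features standing in for PSSM-derived vectors
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 40)); y = (X[:, 0] + X[:, 1] > 0).astype(int)
clf = WeightedELM().fit(X, y)
print((clf.predict(X) == y).mean())
```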

  18. Nature-Inspired Cognitive Evolution to Play MS. Pac-Man

    NASA Astrophysics Data System (ADS)

    Tan, Tse Guan; Teo, Jason; Anthony, Patricia

    Recent developments in nature-inspired computation have heightened the need for research into its three main application areas: science, engineering and industry. Nature-inspired approaches have been reported to solve dynamic problems and to be very useful for improving the performance of various complex systems. So far, however, there has been little discussion of how effectively these models apply to computer and video games in particular. The focus of this research is to explore the hybridization of nature-inspired computation methods for optimizing neural-network-based cognition in video games, in this case the combination of a neural network with an evolutionary algorithm. In essence, a neural network is an attempt to mimic the extremely complex human brain system: an artificial brain that is able to learn intelligently on its own. An evolutionary algorithm, in turn, simulates the biological evolutionary process, evolving potential solutions to problems or tasks by applying genetic operators such as crossover, mutation and selection. This paper investigates the ability of Evolution Strategies (ES) to evolve a feed-forward artificial neural network's internal parameters (i.e. weight and bias values) for automatically generating Ms. Pac-Man controllers. The main objective of the game is to clear a maze of dots while avoiding the ghosts and to achieve the highest possible score. The experimental results show that an ES-based system can be successfully applied to automatically generate artificial intelligence for a complex, dynamic and highly stochastic video game environment.
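
    The optimisation pattern described, an ES searching the flat weight vector of a feed-forward network, reduces to a short loop. The Ms. Pac-Man simulator is not reproduced; the synthetic `fitness` function and network sizes below are placeholders for game-score evaluation of a controller.

```python
# (mu, lambda)-ES over the flat weight vector of a small feed-forward net.
import numpy as np

def forward(w, x, n_in=8, n_hid=10, n_out=4):
    """Decode flat vector w into a one-hidden-layer net and run it."""
    i = n_in * n_hid
    W1 = w[:i].reshape(n_in, n_hid)
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = w[i:i + n_out]
    return np.tanh(x @ W1 + b1) @ W2 + b2

def fitness(w, rng):
    """Stand-in for the average game score over a few evaluation episodes."""
    X = rng.normal(size=(32, 8))
    target = X[:, :4]                      # pretend optimal action values
    return -np.mean((forward(w, X) - target) ** 2)

dim = 8 * 10 + 10 + 10 * 4 + 4
rng = np.random.default_rng(0)
mu, lam, sigma = 5, 30, 0.1
pop = rng.normal(0, 0.5, size=(mu, dim))
for gen in range(200):
    off = np.array([pop[rng.integers(mu)] + sigma * rng.normal(size=dim)
                    for _ in range(lam)])
    scores = np.array([fitness(w, rng) for w in off])
    pop = off[np.argsort(scores)[-mu:]]    # comma selection: best of lambda
print(fitness(pop[-1], rng))
```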

  19. Training cognitive control in older adults with the space fortress game: the role of training instructions and basic motor ability.

    PubMed

    Blumen, Helena M; Gopher, Daniel; Steinerman, Joshua R; Stern, Yaakov

    2010-01-01

    This study examined if and how cognitively healthy older adults can learn to play a complex computer-based action game called the Space Fortress (SF) as a function of training instructions [Standard vs. Emphasis Change (EC); e.g., Gopher et al., 1989] and basic motor ability. A total of 35 cognitively healthy older adults completed a 3-month SF training program with three SF sessions weekly. Twelve 3-min games were played during each session. Basic motor ability was assessed with an aiming task, which required rapidly rotating a spaceship to shoot targets. Older adults showed improved performance on the SF task over time, but did not perform at the same level as younger adults. Unlike studies of younger adults, overall SF performance in older adults was greater following standard instructions than following EC instructions. However, this advantage was primarily due to collecting more bonus points and not - the primary goal of the game - shooting and destroying the fortress, which in contrast benefited from EC instructions. Basic motor ability was low and influenced many different aspects of SF game learning, often interacted with learning rate, and influenced overall SF performance. These findings show that older adults can be trained to deal with the complexity of the SF task but that overall SF performance, and the ability to capitalize on EC instructions, differs when a basic ability such as motor control is low. Hence, the development of this training program as a cognitive intervention that can potentially compensate for age-related cognitive decline should consider that basic motor ability can interact with the efficiency of training instructions that promote the use of cognitive control (e.g., EC instructions) - and the confluence between such basic abilities and higher-level cognitive control abilities should be further examined.

  20. Eddy covariance carbonyl sulphide flux measurements with a quantum cascade laser absorption spectrometer

    PubMed Central

    Gerdel, Katharina; Spielmann, Felix Maximilian; Hammerle, Albin; Wohlfahrt, Georg

    2017-01-01

    The trace gas carbonyl sulphide (COS) has lately received growing interest in the eddy covariance (EC) community due to its potential to serve as an independent approach for constraining gross primary production and canopy stomatal conductance. Thanks to recent developments of fast-response high-precision trace gas analysers (e.g. quantum cascade laser absorption spectrometers (QCLAS)), a handful of EC COS flux measurements have been published since 2013. To date, however, a thorough methodological characterisation of QCLAS with regard to the requirements of the EC technique and the necessary processing steps has not been conducted. The objective of this study is to present a detailed characterization of the COS measurement with the Aerodyne QCLAS in the context of the EC technique, and to recommend best EC processing practices for those measurements. Data were collected from May to October 2015 at a temperate mountain grassland in Tyrol, Austria. Analysis of the Allan variance of high-frequency concentration measurements revealed sensor drift to occur under field conditions after an averaging time of around 50 s. We thus explored the use of two high-pass filtering approaches (linear detrending and recursive filtering) as opposed to block averaging and linear interpolation of regular background measurements for covariance computation. Experimental low-pass filtering correction factors were derived from a detailed cospectral analysis. The CO2 and H2O flux measurements obtained with the QCLAS were compared against those obtained with a closed-path infrared gas analyser. Overall, our results suggest small, but systematic differences between the various high-pass filtering scenarios with regard to the fraction of data retained in the quality control and flux magnitudes. When COS and CO2 fluxes are combined in the so-called ecosystem relative uptake rate, systematic differences between the high-pass filtering scenarios largely cancel out, suggesting that this relative metric represents a robust key parameter comparable between studies relying on different post-processing schemes. PMID:29093762
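
    Two of the processing steps discussed reduce to short computations: the Allan variance of a concentration series, whose minimum marks where sensor drift begins to dominate, and a flux as the covariance of the linearly detrended scalar with vertical wind. The synthetic 10 Hz data below are illustrative.

```python
# Allan variance of a scalar series and a detrended eddy-covariance flux.
import numpy as np

def allan_variance(x, dt, taus):
    out = []
    for tau in taus:
        m = int(tau / dt)                       # samples per averaging bin
        nbins = len(x) // m
        means = x[:nbins * m].reshape(nbins, m).mean(axis=1)
        out.append(0.5 * np.mean(np.diff(means) ** 2))
    return np.array(out)

def ec_flux_detrended(w, c, dt):
    t = np.arange(len(c)) * dt
    c_fit = np.polyval(np.polyfit(t, c, 1), t)  # linear detrending (high-pass)
    return np.mean((w - w.mean()) * (c - c_fit))   # flux = <w'c'>

rng = np.random.default_rng(0)
fs, n = 10.0, 18000                             # 10 Hz, 30 min averaging period
t = np.arange(n) / fs
w = rng.normal(0, 0.3, n)                       # vertical wind (m/s)
c = 0.5 + 0.002 * t + 0.05 * w + rng.normal(0, 0.02, n)   # drifting scalar

taus = [1, 2, 5, 10, 20, 50, 100, 200]
print(dict(zip(taus, allan_variance(c, 1 / fs, taus).round(8))))
print("flux ~", ec_flux_detrended(w, c, 1 / fs))
```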

  1. Eddy covariance carbonyl sulphide flux measurements with a quantum cascade laser absorption spectrometer.

    PubMed

    Gerdel, Katharina; Spielmann, Felix Maximilian; Hammerle, Albin; Wohlfahrt, Georg

    2017-09-26

    The trace gas carbonyl sulphide (COS) has lately received growing interest in the eddy covariance (EC) community due to its potential to serve as an independent approach for constraining gross primary production and canopy stomatal conductance. Thanks to recent developments of fast-response high-precision trace gas analysers (e.g. quantum cascade laser absorption spectrometers (QCLAS)), a handful of EC COS flux measurements have been published since 2013. To date, however, a thorough methodological characterisation of QCLAS with regard to the requirements of the EC technique and the necessary processing steps has not been conducted. The objective of this study is to present a detailed characterization of the COS measurement with the Aerodyne QCLAS in the context of the EC technique, and to recommend best EC processing practices for those measurements. Data were collected from May to October 2015 at a temperate mountain grassland in Tyrol, Austria. Analysis of the Allan variance of high-frequency concentration measurements revealed sensor drift to occur under field conditions after an averaging time of around 50 s. We thus explored the use of two high-pass filtering approaches (linear detrending and recursive filtering) as opposed to block averaging and linear interpolation of regular background measurements for covariance computation. Experimental low-pass filtering correction factors were derived from a detailed cospectral analysis. The CO2 and H2O flux measurements obtained with the QCLAS were compared against those obtained with a closed-path infrared gas analyser. Overall, our results suggest small, but systematic differences between the various high-pass filtering scenarios with regard to the fraction of data retained in the quality control and flux magnitudes. When COS and CO2 fluxes are combined in the so-called ecosystem relative uptake rate, systematic differences between the high-pass filtering scenarios largely cancel out, suggesting that this relative metric represents a robust key parameter comparable between studies relying on different post-processing schemes.

  2. Characterization of vascular permeability using a biomimetic microfluidic blood vessel model

    PubMed Central

    Thomas, Antony; Wang, Shunqiang; Sohrabi, Salman; Orr, Colin; He, Ran; Shi, Wentao; Liu, Yaling

    2017-01-01

    The inflammatory response in endothelial cells (ECs) leads to an increase in vascular permeability through the formation of gaps. However, the dynamic nature of vascular permeability and the external factors involved is still elusive. In this work, we use a biomimetic blood vessel (BBV) microfluidic model to measure in real-time the change in permeability of the EC layer under culture in physiologically relevant flow conditions. This platform studies the dynamics and characterizes vascular permeability when the EC layer is triggered with an inflammatory agent using tracer molecules of three different sizes, and the results are compared to a transwell insert study. We also apply an analytical model to compare the permeability data from the different tracer molecules to understand the physiological and bio-transport significance of endothelial permeability based on the molecule of interest. A computational model of the BBV model is also built to understand the factors influencing transport of molecules of different sizes under flow. The endothelial monolayer cultured under flow in the BBV model was treated with thrombin, a serine protease that induces a rapid and reversible increase in endothelium permeability. On analysis of permeability data, it is found that the transport characteristics for fluorescein isothiocyanate (FITC) dye and FITC Dextran 4 kDa molecules are similar in both BBV and transwell models, but FITC Dextran 70 kDa molecules show increased permeability in the BBV model as convection flow (Peclet number > 1) influences the molecule transport in the BBV model. We also calculated from permeability data the relative increase in intercellular gap area during thrombin treatment for ECs in the BBV and transwell insert models to be between 12% and 15%. This relative increase was found to be within range of what we quantified from F-actin stained EC layer images. The work highlights the importance of incorporating flow in in vitro vascular models, especially in studies involving transport of large size objects such as antibodies, proteins, nano/micro particles, and cells. PMID:28344727

  3. Training Cognitive Control in Older Adults with the Space Fortress Game: The Role of Training Instructions and Basic Motor Ability

    PubMed Central

    Blumen, Helena M.; Gopher, Daniel; Steinerman, Joshua R.; Stern, Yaakov

    2010-01-01

    This study examined if and how cognitively healthy older adults can learn to play a complex computer-based action game called the Space Fortress (SF) as a function of training instructions [Standard vs. Emphasis Change (EC); e.g., Gopher et al., 1989] and basic motor ability. A total of 35 cognitively healthy older adults completed a 3-month SF training program with three SF sessions weekly. Twelve 3-min games were played during each session. Basic motor ability was assessed with an aiming task, which required rapidly rotating a spaceship to shoot targets. Older adults showed improved performance on the SF task over time, but did not perform at the same level as younger adults. Unlike studies of younger adults, overall SF performance in older adults was greater following standard instructions than following EC instructions. However, this advantage was primarily due to collecting more bonus points and not – the primary goal of the game – shooting and destroying the fortress, which in contrast benefited from EC instructions. Basic motor ability was low and influenced many different aspects of SF game learning, often interacted with learning rate, and influenced overall SF performance. These findings show that older adults can be trained to deal with the complexity of the SF task but that overall SF performance, and the ability to capitalize on EC instructions, differs when a basic ability such as motor control is low. Hence, the development of this training program as a cognitive intervention that can potentially compensate for age-related cognitive decline should consider that basic motor ability can interact with the efficiency of training instructions that promote the use of cognitive control (e.g., EC instructions) – and the confluence between such basic abilities and higher-level cognitive control abilities should be further examined. PMID:21120135

  4. Eddy covariance carbonyl sulfide flux measurements with a quantum cascade laser absorption spectrometer

    NASA Astrophysics Data System (ADS)

    Gerdel, Katharina; Spielmann, Felix Maximilian; Hammerle, Albin; Wohlfahrt, Georg

    2017-09-01

    The trace gas carbonyl sulfide (COS) has lately received growing interest from the eddy covariance (EC) community due to its potential to serve as an independent approach for constraining gross primary production and canopy stomatal conductance. Thanks to recent developments of fast-response high-precision trace gas analysers (e.g. quantum cascade laser absorption spectrometers, QCLAS), a handful of EC COS flux measurements have been published since 2013. To date, however, a thorough methodological characterisation of QCLAS with regard to the requirements of the EC technique and the necessary processing steps has not been conducted. The objective of this study is to present a detailed characterisation of the COS measurement with the Aerodyne QCLAS in the context of the EC technique and to recommend best EC processing practices for those measurements. Data were collected from May to October 2015 at a temperate mountain grassland in Tyrol, Austria. Analysis of the Allan variance of high-frequency concentration measurements revealed the occurrence of sensor drift under field conditions after an averaging time of around 50 s. We thus explored the use of two high-pass filtering approaches (linear detrending and recursive filtering) as opposed to block averaging and linear interpolation of regular background measurements for covariance computation. Experimental low-pass filtering correction factors were derived from a detailed cospectral analysis. The CO2 and H2O flux measurements obtained with the QCLAS were compared with those obtained with a closed-path infrared gas analyser. Overall, our results suggest small, but systematic differences between the various high-pass filtering scenarios with regard to the fraction of data retained in the quality control and flux magnitudes. When COS and CO2 fluxes are combined in the ecosystem relative uptake rate, systematic differences between the high-pass filtering scenarios largely cancel out, suggesting that this relative metric represents a robust key parameter comparable between studies relying on different post-processing schemes.

  5. The neuromotor effects of transverse friction massage.

    PubMed

    Begovic, Haris; Zhou, Guang-Quan; Schuster, Snježana; Zheng, Yong-Ping

    2016-12-01

    Transverse friction massage (TFM), as an often used technique by therapists, is known for its effect in reducing the pain and loosing the scar tissues. Nevertheless, its effects on neuromotor driving mechanism including the electromechanical delay (EMD), force transmission and excitation-contraction (EC) coupling which could be used as markers of stiffness changes, has not been computed using ultrafast ultrasound (US) when combined with external sensors. Hence, the aim of this study was to find out produced neuromotor changes associated to stiffness when TFM was applied over Quadriceps femoris (QF) tendon in healthy subjcets. Fourteen healthy males and fifteen age-gender matched controls were recruited. Surface EMG (sEMG), ultrafast US and Force sensors were synchronized and signals were analyzed to depict the time delays corresponding to EC coupling, force transmission, EMD, torque and rate of force development (RFD). TFM has been found to increase the time corresponding to EC coupling and EMD, whilst, reducing the time belonging to force transmission during the voluntary muscle contractions. A detection of the increased time of EC coupling from muscle itself would suggest that TFM applied over the tendon shows an influence on changing the neuro-motor driving mechanism possibly via afferent pathways and therefore decreasing the active muscle stiffness. On the other hand, detection of decreased time belonging to force transmission during voluntary contraction would suggest that TFM increases the stiffness of tendon, caused by faster force transmission along non-contractile elements. Torque and RFD have not been influenced by TFM. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. PKA catalytic subunit compartmentation regulates contractile and hypertrophic responses to β-adrenergic signaling

    PubMed Central

    Yang, Jason H.; Polanowska-Grabowska, Renata K.; Smith, Jeffrey S.; Shields, Charles W.; Saucerman, Jeffrey J.

    2014-01-01

    β-adrenergic signaling is spatiotemporally heterogeneous in the cardiac myocyte, conferring exquisite control to sympathetic stimulation. Such heterogeneity drives the formation of protein kinase A (PKA) signaling microdomains, which regulate Ca2+ handling and contractility. Here, we test the hypothesis that the nucleus independently comprises a PKA signaling microdomain regulating myocyte hypertrophy. Spatially-targeted FRET reporters for PKA activity identified slower PKA activation and lower isoproterenol sensitivity in the nucleus (t50 = 10.60±0.68 min; EC50 = 89.00 nmol/L) than in the cytosol (t50 = 3.71±0.25 min; EC50 = 1.22 nmol/L). These differences were not explained by cAMP or AKAP-based compartmentation. A computational model of cytosolic and nuclear PKA activity was developed and predicted that differences in nuclear PKA dynamics and magnitude are regulated by slow PKA catalytic subunit diffusion, while differences in isoproterenol sensitivity are regulated by nuclear expression of protein kinase inhibitor (PKI). These were validated by FRET and immunofluorescence. The model also predicted differential phosphorylation of PKA substrates regulating cell contractility and hypertrophy. Ca2+ and cell hypertrophy measurements validated these predictions and identified higher isoproterenol sensitivity for contractile enhancements (EC50 = 1.84 nmol/L) over cell hypertrophy (EC50 = 85.88 nmol/L). Over-expression of spatially targeted PKA catalytic subunit to the cytosol or nucleus enhanced contractile and hypertrophic responses, respectively. We conclude that restricted PKA catalytic subunit diffusion is an important PKA compartmentation mechanism and the nucleus comprises a novel PKA signaling microdomain, insulating hypertrophic from contractile β-adrenergic signaling responses. PMID:24225179
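
    The central mechanism, slow diffusion of catalytic subunit into a nucleus where PKI sequesters it, can be caricatured with a two-compartment ODE; all rate constants below are invented for illustration and are not the fitted values of the published model.

```python
# Toy two-compartment PKA model: nuclear activity lags the cytosolic one
# because catalytic-subunit import is slow and nuclear PKI sequesters it.
import numpy as np
from scipy.integrate import solve_ivp

k_diff = 0.02          # /min, slow nuclear import of catalytic subunit
k_act = 1.0            # /min, cytosolic activation by the cAMP step stimulus
k_deact = 0.2          # /min
pki = 0.5              # nuclear PKI level, sequesters catalytic subunit

def rhs(t, y):
    c_cyt, c_nuc = y                       # free catalytic subunit levels
    stim = 1.0 if t > 5 else 0.0           # isoproterenol-like step at t = 5
    dc_cyt = (k_act * stim * (1 - c_cyt) - k_deact * c_cyt
              - k_diff * (c_cyt - c_nuc))
    dc_nuc = k_diff * (c_cyt - c_nuc) - pki * c_nuc
    return [dc_cyt, dc_nuc]

sol = solve_ivp(rhs, (0, 120), [0.0, 0.0], dense_output=True)
t = np.linspace(0, 120, 5)
print(np.round(sol.sol(t), 3))             # nuclear rise lags the cytosolic one
```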

  7. Variations on a Theme.

    ERIC Educational Resources Information Center

    Vitali, Julius

    1990-01-01

    Explains an experimental photographic technique starting with a realistic photograph. Using various media (oil painting, video/computer photography, and multiprint imagery) the artist changes the photograph's compositional elements. Outlines the phases of this evolutionary process. Illustrates four images created by the technique. (DB)

  8. Numerical simulation of evolutionary erodible bedforms using the particle finite element method

    NASA Astrophysics Data System (ADS)

    Bravo, Rafael; Becker, Pablo; Ortiz, Pablo

    2017-07-01

    This paper presents a numerical strategy for the simulation of flows with evolutionary erodible boundaries. The fluid equations are fully resolved in 3D, while the sediment transport is modelled using the Exner equation and solved with an explicit Lagrangian procedure based on a fixed 2D mesh. Flow and sediment are coupled geometrically, by deforming the fluid mesh in the vertical direction, and through the velocities, with the empirical sediment flux computed using the Meyer-Peter and Müller model. A comparison with real channel experiments is performed, giving good agreement.
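
    The sediment half of the coupling can be sketched in one dimension: an explicit Exner update driven by a Meyer-Peter and Müller transport rate, with a prescribed depth-averaged flow standing in for the resolved 3D fluid and the Lagrangian mesh machinery. All parameters are illustrative.

```python
# 1D explicit Exner update with Meyer-Peter and Mueller bedload transport.
import numpy as np

g, d50, delta, porosity, theta_c = 9.81, 1e-3, 1.65, 0.4, 0.047

def mpm_flux(theta):
    """Meyer-Peter and Mueller bedload transport rate (m^2/s)."""
    excess = np.clip(theta - theta_c, 0.0, None)
    return 8.0 * np.sqrt(g * delta * d50**3) * excess**1.5

nx, dx, dt = 200, 0.5, 1.0
x = np.arange(nx) * dx
z = 0.1 * np.exp(-((x - 30.0) / 5.0) ** 2)     # initial bed hump
q_w, h0 = 1.5, 1.0                             # unit discharge, fixed water level

for _ in range(5000):
    u = q_w / (h0 - z)                         # depth-averaged flow (continuity)
    theta = 0.003 * u**2 / (g * delta * d50)   # Shields parameter, c_f = 0.003
    qs = mpm_flux(theta)
    # Exner equation: (1 - porosity) dz/dt = -dqs/dx
    z -= dt * np.gradient(qs, dx) / (1.0 - porosity)

print("bed crest now at x =", x[np.argmax(z)], "m (migrated downstream)")
```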

  9. Application of Neural Network Optimized by Mind Evolutionary Computation in Building Energy Prediction

    NASA Astrophysics Data System (ADS)

    Song, Chen; Zhong-Cheng, Wu; Hong, Lv

    2018-03-01

    Building energy forecasting plays an important role in energy management and planning. Using the mind evolutionary algorithm to find optimal network weights and thresholds for a BP neural network overcomes the BP network's tendency to become trapped at local minima. The optimized network is used both for time-series prediction and for a same-month forecast, yielding two predicted values; these two values are then fed into a further neural network to obtain the final forecast. The effectiveness of the method was verified experimentally with energy data from three buildings in Hefei.

  10. Geometric morphometrics and virtual anthropology: advances in human evolutionary studies.

    PubMed

    Rein, Thomas R; Harvati, Katerina

    2014-01-01

    Geometric morphometric methods have been increasingly used in paleoanthropology in the last two decades, lending greater power to the analysis and interpretation of the human fossil record. More recently, the widespread adoption of computed tomography and surface scanning, implemented in combination with geometric morphometrics (GM), characterizes a new approach termed Virtual Anthropology (VA). These methodological advances have led to a number of developments in human evolutionary studies. We present some recent examples of GM- and VA-related research in human evolution, with an emphasis on work conducted at the University of Tübingen and other German research institutions.
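
    The basic alignment step underlying GM analyses, ordinary Procrustes superimposition, is compact enough to sketch: remove location and centroid size, then find the optimal rotation by SVD. The square landmark configuration below is invented, and reflection handling is omitted for brevity.

```python
# Ordinary Procrustes superimposition of two landmark configurations.
import numpy as np

def procrustes_align(X, Y):
    """Align landmark set Y (k x 2 or k x 3) onto X; return aligned Y."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Xc /= np.linalg.norm(Xc)               # remove centroid size
    Yc /= np.linalg.norm(Yc)
    U, _, Vt = np.linalg.svd(Xc.T @ Yc)    # optimal rotation (Kabsch/SVD)
    R = U @ Vt
    return Yc @ R.T

X = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)      # reference landmarks
theta = np.pi / 6
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
Y = 2.5 * (X @ rot.T) + 3.0                                 # rotated, scaled copy
print(np.round(procrustes_align(X, Y), 3))                  # recovers X's shape
```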

  11. VizieR Online Data Catalog: Low-mass helium white dwarfs evolutionary models (Istrate+, 2016)

    NASA Astrophysics Data System (ADS)

    Istrate, A.; Marchant, P.; Tauris, T. M.; Langer, N.; Stancliffe, R. J.; Grassitelli, L.

    2016-07-01

    Evolutionary models of low-mass helium white dwarfs including element diffusion and rotational mixing. The WDs are produced through binary evolution via the LMXB channel, with final WD masses between ~0.16 and ~0.44 Msun. The models are computed using MESA for different metallicities: Z=0.02, 0.01, 0.001 and 0.0002. For each metallicity, the models are divided into three categories: (1) basic (neither element diffusion nor rotation considered); (2) diffusion (element diffusion considered); (3) rotation+diffusion (both element diffusion and rotational mixing considered). (4 data files).

  12. On the design of neuro-controllers for individual and social learning behaviour in autonomous robots: an evolutionary approach

    NASA Astrophysics Data System (ADS)

    Pini, Giovanni; Tuci, Elio

    2008-06-01

    In biology and psychology, the capability of natural organisms to learn from observation of, and interaction with, conspecifics is referred to as social learning. Roboticists have recently developed an interest in social learning, since it might represent an effective strategy to enhance the adaptivity of a team of autonomous robots. In this study, we show that a methodological approach based on artificial neural networks shaped by evolutionary computation techniques can be successfully employed to synthesise the individual and social learning mechanisms for robots required to learn a desired action (i.e. phototaxis or antiphototaxis).

  13. Evolutionary Trails of Plant Group II Pyridoxal Phosphate-Dependent Decarboxylase Genes.

    PubMed

    Kumar, Rahul

    2016-01-01

    Group II pyridoxal phosphate-dependent decarboxylase (PLP_deC) enzymes play important metabolic roles during nitrogen metabolism. Recent evolutionary profiling of these genes revealed a sharp expansion of histidine decarboxylase genes in members of the Solanaceae family. In spite of the high sequence homology shared by PLP_deC orthologs, these enzymes display remarkable differences in their substrate specificities. Currently, limited information is available on the gene repertoires and substrate specificities of PLP_deCs, which renders their precise annotation challenging and hampers the immediate identification and biochemical characterization of their full gene complements in plants. Herein, we explored their evolutionary trails in a comprehensive manner by taking advantage of high-throughput data accessibility and computational approaches. We discuss the premise that has enabled an improved reconstruction of their evolutionary lineage and evaluate the factors that have so far constrained their rapid functional characterization. We envisage that the information synthesized herein will act as a catalyst for the rapid exploration of their biochemical specificity and physiological roles in more plant species.

  14. Evolutionary design optimization of traffic signals applied to Quito city.

    PubMed

    Armas, Rolando; Aguirre, Hernán; Daolio, Fabio; Tanaka, Kiyoshi

    2017-01-01

    This work applies evolutionary computation and machine learning methods to study the transportation system of Quito from a design optimization perspective. It couples an evolutionary algorithm with a microscopic transport simulator and uses the outcome of the optimization process to deepen our understanding of the problem and gain knowledge about the system. The work focuses on the optimization of a large number of traffic lights deployed on a wide area of the city and studies their impact on travel time, emissions and fuel consumption. An evolutionary algorithm with specialized mutation operators is proposed to search effectively in large decision spaces, evolving small populations for a small number of generations. The effects of the operators combined with a varying mutation schedule are studied, and an analysis of the parameters of the algorithm is also included. In addition, hierarchical clustering is performed on the best solutions found in several runs of the algorithm. An analysis of signal clusters and their geolocation, estimation of fuel consumption, spatial analysis of emissions, and an analysis of signal coordination provide an overall picture of the systemic effects of the optimization process.
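
    The algorithmic recipe named here (small populations, short runs, specialized mutation under a varying schedule) can be illustrated generically. The sketch below runs a (mu+lambda) evolution strategy with a decaying mutation step on a toy objective standing in for the traffic simulator; it is not the authors' algorithm, and all names and numbers are invented.

        import numpy as np

        rng = np.random.default_rng(1)
        n_signals = 100                  # toy decision vector: one offset per signal

        def travel_time(x):              # stand-in for the microscopic simulator
            return np.sum((x - np.sin(np.arange(n_signals))) ** 2)

        mu, lam, gens = 5, 20, 300       # small population, short run
        pop = rng.uniform(-2, 2, (mu, n_signals))
        for g in range(gens):
            sigma = 1.0 * 0.01 ** (g / gens)         # decaying mutation schedule
            kids = pop[rng.integers(0, mu, lam)].copy()
            mask = rng.random(kids.shape) < 0.05     # mutate a few signals per child
            kids[mask] += rng.normal(0, sigma, mask.sum())
            both = np.vstack([pop, kids])
            pop = both[np.argsort([travel_time(x) for x in both])[:mu]]
        print("best simulated travel time:", travel_time(pop[0]))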

  15. Evolutionary design optimization of traffic signals applied to Quito city

    PubMed Central

    2017-01-01

    This work applies evolutionary computation and machine learning methods to study the transportation system of Quito from a design optimization perspective. It couples an evolutionary algorithm with a microscopic transport simulator and uses the outcome of the optimization process to deepen our understanding of the problem and gain knowledge about the system. The work focuses on the optimization of a large number of traffic lights deployed on a wide area of the city and studies their impact on travel time, emissions and fuel consumption. An evolutionary algorithm with specialized mutation operators is proposed to search effectively in large decision spaces, evolving small populations for a small number of generations. The effects of the operators combined with a varying mutation schedule are studied, and an analysis of the parameters of the algorithm is also included. In addition, hierarchical clustering is performed on the best solutions found in several runs of the algorithm. An analysis of signal clusters and their geolocation, estimation of fuel consumption, spatial analysis of emissions, and an analysis of signal coordination provide an overall picture of the systemic effects of the optimization process. PMID:29236733

  16. Underlying Principles of Natural Selection in Network Evolution: Systems Biology Approach

    PubMed Central

    Chen, Bor-Sen; Wu, Wei-Sheng

    2007-01-01

    Systems biology is a rapidly expanding field that integrates diverse areas of science such as physics, engineering, computer science, mathematics, and biology toward the goal of elucidating the underlying principles of hierarchical metabolic and regulatory systems in the cell, and ultimately leading to predictive understanding of cellular response to perturbations. Because post-genomics research is taking place throughout the tree of life, comparative approaches offer a way for combining data from many organisms to shed light on the evolution and function of biological networks from the gene to the organismal level. Therefore, systems biology can build on decades of theoretical work in evolutionary biology, and at the same time evolutionary biology can use the systems biology approach to go in new uncharted directions. In this study, we present a review of how the post-genomics era is adopting comparative approaches and dynamic system methods to understand the underlying design principles of network evolution and to shape the nascent field of evolutionary systems biology. Finally, the application of evolutionary systems biology to robust biological network designs is also discussed from the synthetic biology perspective. PMID:19468310

  17. Evolutionary game theory using agent-based methods.

    PubMed

    Adami, Christoph; Schossau, Jory; Hintze, Arend

    2016-12-01

    Evolutionary game theory is a successful mathematical framework geared towards understanding the selective pressures that affect the evolution of the strategies of agents engaged in interactions with potential conflicts. While a mathematical treatment of the costs and benefits of decisions can predict the optimal strategy in simple settings, more realistic settings such as finite populations, non-vanishing mutation rates, stochastic decisions, communication between agents, and spatial interactions require agent-based methods in which each agent is modeled as an individual, carries its own genes that determine its decisions, and where the evolutionary outcome can only be ascertained by evolving the population of agents forward in time. While highlighting standard mathematical results, we compare those to agent-based methods that can go beyond the limitations of equations and simulate the complexity of heterogeneous populations and an ever-changing set of interactors. We conclude that agent-based methods can predict evolutionary outcomes where purely mathematical treatments cannot tread (for example in the weak selection-strong mutation limit), but that mathematics is crucial to validate the computational simulations. Copyright © 2016 Elsevier B.V. All rights reserved.
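
    A minimal agent-based simulation of the kind discussed here is a Moran process with game payoffs: fitness-proportional birth, uniform death, and rare mutation. The sketch below runs a well-mixed donation game under these assumptions (all parameter values invented); defection takes over, which is exactly the kind of outcome that population structure can reverse.

        import numpy as np

        rng = np.random.default_rng(2)
        b, c, N, mu = 3.0, 1.0, 100, 0.01    # benefit, cost, population size, mutation
        pop = rng.integers(0, 2, N)          # 1 = cooperator, 0 = defector

        for _ in range(20000):
            coop = pop.sum()
            pay = b * (coop - pop) / (N - 1) - c * pop   # donation-game payoffs
            f = np.exp(0.1 * pay)                        # exponential fitness mapping
            parent = rng.choice(N, p=f / f.sum())        # fitness-proportional birth
            child = pop[parent] if rng.random() > mu else rng.integers(0, 2)
            pop[rng.integers(N)] = child                 # uniform death
        print("final fraction of cooperators:", pop.mean())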

  18. Reducing and Analyzing the PHAT Survey with the Cloud

    NASA Astrophysics Data System (ADS)

    Williams, Benjamin F.; Olsen, Knut; Khan, Rubab; Pirone, Daniel; Rosema, Keith

    2018-05-01

    We discuss the technical challenges we faced and the techniques we used to overcome them when reducing the Panchromatic Hubble Andromeda Treasury (PHAT) photometric data set on the Amazon Elastic Compute Cloud (EC2). We first describe the architecture of our photometry pipeline, which we found particularly efficient for reducing the data in multiple ways for different purposes. We then describe the features of EC2 that make this architecture both efficient to use and challenging to implement. We describe the techniques we adopted to process our data, and suggest ways these techniques may be improved for those interested in trying such reductions in the future. Finally, we summarize the output photometry data products, which are now hosted publicly in two places in two formats: as simple FITS tables among the high-level science products on MAST, and in a queryable database available through the NOAO Data Lab.

  19. β-decay half-life of ⁵⁰V calculated by the shell model

    NASA Astrophysics Data System (ADS)

    Haaranen, M.; Srivastava, P. C.; Suhonen, J.; Zuber, K.

    2014-10-01

    In this work we survey the detectability of the β⁻ channel of ⁵⁰V leading to the first excited 2⁺ state in ⁵⁰Cr. The electron-capture (EC) half-life corresponding to the transition of ⁵⁰V to the first excited 2⁺ state in ⁵⁰Ti had been measured earlier. Both of the mentioned transitions are 4th-forbidden non-unique. We have performed calculations of all the involved wave functions by using the nuclear shell model with the GXPF1A interaction in the full f-p shell. The computed half-life of the EC branch is in good agreement with the measured one. The predicted half-life for the β⁻ branch is ≈2×10¹⁹ yr, whereas the present experimental lower limit is 1.5×10¹⁸ yr. We also discuss the experimental layout needed to detect the β⁻-branch decay.

  20. Evolution-Inspired Computational Design of Symmetric Proteins.

    PubMed

    Voet, Arnout R D; Simoncini, David; Tame, Jeremy R H; Zhang, Kam Y J

    2017-01-01

    Monomeric proteins with a number of identical repeats creating symmetrical structures are potentially very valuable building blocks with a variety of bionanotechnological applications. As such proteins do not occur naturally, the emerging field of computational protein design serves as an excellent tool to create them from nonsymmetrical templates. Existing pseudo-symmetrical proteins are believed to have evolved from oligomeric precursors by duplication and fusion of identical repeats. Here we describe a computational workflow to reverse-engineer this evolutionary process in order to create stable proteins consisting of identical sequence repeats.

  1. Computers in health care for the 21st century.

    PubMed

    O'Desky, R I; Ball, M J; Ball, E E

    1990-03-01

    As the world enters the last decade of the 20th Century, there is a great deal of speculation about the effect of computers on the future delivery of health care. In this article, the authors attempt to identify some of the evolving computer technologies and anticipate what effect they will have by the year 2000. Rather than listing potential accomplishments, each of the affected areas (hardware, software, health care systems, and communications) is presented in an evolutionary manner so the reader can better appreciate where we have been and where we are going.

  2. Evolutionary and biological metaphors for engineering design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jakiela, M.

    1994-12-31

    Since computing became generally available, there has been strong interest in using computers to assist and automate engineering design processes. Specifically, for design optimization and automation, nonlinear programming and artificial intelligence techniques have been extensively studied. New computational techniques, based upon the natural processes of evolution, adaptation, and learning, are showing promise because of their generality and robustness. This presentation will describe the use of two such techniques, genetic algorithms and classifier systems, for a variety of engineering design problems. Structural topology optimization, meshing, and general engineering optimization are shown as example applications.

  3. Launching "the evolution of cooperation".

    PubMed

    Axelrod, Robert

    2012-04-21

    This article describes three aspects of the author's early work on the evolution of cooperation. First, it explains how the idea for a computer tournament for the iterated Prisoner's Dilemma was inspired by artificial intelligence research on computer checkers and computer chess. Second, it shows how the vulnerability of simple reciprocity to misunderstanding or misimplementation can be eliminated with the addition of some degree of generosity or contrition. Third, it recounts the unusual collaboration between the author, a political scientist, and William D. Hamilton, an evolutionary biologist. Copyright © 2011 Elsevier Ltd. All rights reserved.
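
    The second point, generosity rescuing reciprocity under noise, is easy to reproduce. The sketch below pits tit-for-tat and a generous variant against themselves in a noisy iterated Prisoner's Dilemma with the standard payoffs (T=5, R=3, P=1, S=0); the 5% noise level and 30% forgiveness probability are illustrative choices, not Axelrod's.

        import random
        random.seed(3)

        T, R, P, S = 5, 3, 1, 0            # standard Prisoner's Dilemma payoffs

        def play(stratA, stratB, rounds=200, noise=0.05):
            # Average per-round payoffs when each intended move flips with prob noise
            histA, histB, payA, payB = [], [], 0, 0
            for _ in range(rounds):
                a, b = stratA(histA, histB), stratB(histB, histA)
                if random.random() < noise: a = 1 - a   # misimplementation
                if random.random() < noise: b = 1 - b
                payA += [[P, T], [S, R]][a][b]          # row: my move, col: theirs
                payB += [[P, T], [S, R]][b][a]
                histA.append(a); histB.append(b)
            return payA / rounds, payB / rounds

        tft = lambda my, their: 1 if not their else their[-1]   # 1 = cooperate
        gtft = lambda my, their: (1 if not their or their[-1] == 1
                                  or random.random() < 0.3 else 0)

        print("TFT  vs TFT :", play(tft, tft))    # errors echo into mutual punishment
        print("GTFT vs GTFT:", play(gtft, gtft))  # generosity breaks the echo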

  4. Simultaneously estimating evolutionary history and repeated traits phylogenetic signal: applications to viral and host phenotypic evolution

    PubMed Central

    Vrancken, Bram; Lemey, Philippe; Rambaut, Andrew; Bedford, Trevor; Longdon, Ben; Günthard, Huldrych F.; Suchard, Marc A.

    2014-01-01

    Phylogenetic signal quantifies the degree to which resemblance in continuously-valued traits reflects phylogenetic relatedness. Measures of phylogenetic signal are widely used in ecological and evolutionary research, and are recently gaining traction in viral evolutionary studies. Standard estimators of phylogenetic signal frequently condition on data summary statistics of the repeated trait observations and fixed phylogenetic trees, resulting in information loss and potential bias. To incorporate the observation process and phylogenetic uncertainty in a model-based approach, we develop a novel Bayesian inference method to simultaneously estimate the evolutionary history and phylogenetic signal from molecular sequence data and repeated multivariate traits. Our approach builds upon a phylogenetic diffusion framework that models continuous trait evolution as a Brownian motion process and incorporates Pagel’s λ transformation parameter to estimate dependence among traits. We provide a computationally efficient inference implementation in the BEAST software package. We evaluate the performance of the Bayesian estimator of phylogenetic signal on synthetic data, comparing it against standard estimators, and demonstrate the use of our coherent framework to address several virus-host evolutionary questions, including virulence heritability for HIV, antigenic evolution in influenza and HIV, and Drosophila sensitivity to sigma virus infection. Finally, we discuss model extensions that will make useful contributions to our flexible framework for simultaneously studying sequence and trait evolution. PMID:25780554
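
    Pagel's λ, mentioned above, rescales the off-diagonal (shared-history) entries of the Brownian-motion covariance matrix implied by a tree. A minimal maximum-likelihood sketch on an invented four-taxon tree with invented tip values (not the paper's Bayesian machinery):

        import numpy as np
        from scipy.stats import multivariate_normal
        from scipy.optimize import minimize_scalar

        # Toy BM covariance for 4 taxa; entry (i, j) is the shared branch
        # length of taxa i and j on an assumed tree with two clades.
        C = np.array([[1.0, 0.6, 0.2, 0.2],
                      [0.6, 1.0, 0.2, 0.2],
                      [0.2, 0.2, 1.0, 0.7],
                      [0.2, 0.2, 0.7, 1.0]])
        traits = np.array([0.9, 1.1, -0.8, -1.0])   # illustrative tip values

        def neg_loglik(lam):
            # Pagel's lambda shrinks off-diagonal covariances; diagonal is kept
            Cl = lam * C + (1 - lam) * np.diag(np.diag(C))
            mean = np.full(4, traits.mean())        # plug-in ancestral mean
            return -multivariate_normal(mean=mean, cov=Cl).logpdf(traits)

        fit = minimize_scalar(neg_loglik, bounds=(0.0, 1.0), method="bounded")
        print("ML estimate of Pagel's lambda:", round(fit.x, 3))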

  5. SU-E-T-314: The Application of Cloud Computing in Pencil Beam Scanning Proton Therapy Monte Carlo Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Z; Gao, M

    Purpose: Monte Carlo simulation plays an important role for the proton Pencil Beam Scanning (PBS) technique. However, MC simulation demands high computing power and is limited to few large proton centers that can afford a computer cluster. We study the feasibility of utilizing cloud computing in the MC simulation of PBS beams. Methods: A GATE/GEANT4 based MC simulation software was installed on a commercial cloud computing virtual machine (Linux 64-bits, Amazon EC2). Single spot Integral Depth Dose (IDD) curves and in-air transverse profiles were used to tune the source parameters to simulate an IBA machine. With the use of the StarCluster software developed at MIT, a Linux cluster with 2–100 nodes can be conveniently launched in the cloud. A proton PBS plan was then exported to the cloud where the MC simulation was run. Results: The simulated PBS plan has a field size of 10×10 cm², 20 cm range, 10 cm modulation, and contains over 10,000 beam spots. EC2 instance type m1.medium was selected considering the CPU/memory requirement, and 40 instances were used to form a Linux cluster. To minimize cost, the master node was created as an on-demand instance and worker nodes were created as spot instances. The hourly cost for the 40-node cluster was $0.63 and the projected cost for a 100-node cluster was $1.41. Ten million events were simulated to plot PDD and profile, with each job containing 500k events. The simulation completed within 1 hour and an overall statistical uncertainty of < 2% was achieved. Good agreement between MC simulation and measurement was observed. Conclusion: Cloud computing is a cost-effective and easy to maintain platform to run proton PBS MC simulation. When proton MC packages such as GATE and TOPAS are combined with cloud computing, it will greatly facilitate the pursuit of PBS MC studies, especially for newly established proton centers or individual researchers.

  6. Are Cloud Environments Ready for Scientific Applications?

    NASA Astrophysics Data System (ADS)

    Mehrotra, P.; Shackleford, K.

    2011-12-01

    Cloud computing environments are becoming widely available both in the commercial and government sectors. They provide flexibility to rapidly provision resources in order to meet dynamic and changing computational needs without the customers incurring capital expenses and/or requiring technical expertise. Clouds also provide reliable access to resources even though the end-user may not have in-house expertise for acquiring or operating such resources. Consolidation and pooling in a cloud environment allow organizations to achieve economies of scale in provisioning or procuring computing resources and services. Because of these and other benefits, many businesses and organizations are migrating their business applications (e.g., websites, social media, and business processes) to cloud environments, as evidenced by the commercial success of offerings such as the Amazon EC2. In this paper, we focus on the feasibility of utilizing cloud environments for scientific workloads and workflows particularly of interest to NASA scientists and engineers. There is a wide spectrum of such technical computations. These applications range from small workstation-level computations to mid-range computing requiring small clusters to high-performance simulations requiring supercomputing systems with high bandwidth/low latency interconnects. Data-centric applications manage and manipulate large data sets such as satellite observational data and/or data previously produced by high-fidelity modeling and simulation computations. Most of the applications are run in batch mode with static resource requirements. However, there do exist situations that have dynamic demands, particularly ones with public-facing interfaces providing information to the general public, collaborators and partners, as well as to internal NASA users. In the last few months we have been studying the suitability of cloud environments for NASA's technical and scientific workloads. We have ported several applications to multiple cloud environments including NASA's Nebula environment, Amazon's EC2, Magellan at NERSC, and SGI's Cyclone system. We critically examined the performance of the applications on these systems. We also collected information on the usability of these cloud environments. In this talk we will present the results of our study focusing on the efficacy of using clouds for NASA's scientific applications.

  7. Characteristics and sources of the fine carbonaceous aerosols in Haikou, China

    NASA Astrophysics Data System (ADS)

    Liu, Baoshuang; Zhang, Jiaying; Wang, Lu; Liang, Danni; Cheng, Yuan; Wu, Jianhui; Bi, Xiaohui; Feng, Yinchang; Zhang, Yufen; Yang, Haihang

    2018-01-01

    Ambient PM2.5 samples were collected from January to September 2015 in Haikou. The carbonaceous fractions, including OC, EC, OC1, OC2, OC3, OC4, EC1, EC2, EC3, Char-EC (EC1 minus POC) and Soot-EC (EC2 plus EC3), were analysed in this study. The results indicate that the mean concentrations of OC and EC were 5.6 and 2.5 μg/m3 during the sampling period, respectively, and that the concentrations of most carbonaceous fractions were highest in winter and lowest in spring. The seasonal variations of Soot-EC and Char-EC concentrations show distinct differences: Char-EC concentrations were higher in winter and lower in spring, while Soot-EC concentrations were lower in winter and higher in summer. Compared to Char-EC, Soot-EC concentrations show smaller seasonal variation in Haikou. Char-EC correlates strongly with OC and EC (r = 0.91 and 0.95, P < 0.01), while correlations between Soot-EC and either OC or EC are absent (r = 0.15 and 0.11, P > 0.05). The average ratios of Char-EC/Soot-EC are in the order winter (15.9) > autumn (4.9) > summer (4.0) > spring (3.6), with an overall average of 7.1. According to error estimation (EE) diagnostics, four factors are revealed by Positive Matrix Factorization (PMF) analysis in each season. Combined gasoline/diesel vehicle exhaust, coal combustion, biomass burning and specific diesel vehicle exhaust are identified as the major sources of carbonaceous aerosols, contributing 29.3%, 27.4%, 17.9% and 15.9% over the whole year, respectively. The transport trajectories of the air masses show distinct differences between seasons; in winter they mainly originate from mainland China (i.e. Jiangxi, Fujian and Guangdong provinces), likely explaining the higher contribution of coal combustion.
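
    The Char-EC and Soot-EC quantities used above are simple arithmetic on the thermal/optical carbon fractions, as the following sketch shows with invented concentrations (not the Haikou data):

        import numpy as np

        # Carbon fractions for three illustrative samples, in ug/m3
        ec1 = np.array([3.0, 2.1, 1.5])
        ec2 = np.array([0.30, 0.35, 0.40])
        ec3 = np.array([0.05, 0.05, 0.10])
        poc = np.array([0.8, 0.6, 0.4])      # pyrolyzed organic carbon

        char_ec = ec1 - poc                  # Char-EC = EC1 - POC
        soot_ec = ec2 + ec3                  # Soot-EC = EC2 + EC3
        print("Char-EC/Soot-EC ratios:", np.round(char_ec / soot_ec, 1))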

  8. Multidisciplinary Approaches in Evolutionary Linguistics

    ERIC Educational Resources Information Center

    Gong, Tao; Shuai, Lan; Wu, Yicheng

    2013-01-01

    Studying language evolution has become resurgent in modern scientific research. In this revival field, approaches from a number of disciplines other than linguistics, including (paleo)anthropology and archaeology, animal behaviors, genetics, neuroscience, computer simulation, and psychological experimentation, have been adopted, and a wide scope…

  9. Measuring eating competence: psychometric properties and validity of the ecSatter Inventory.

    PubMed

    Lohse, Barbara; Satter, Ellyn; Horacek, Tanya; Gebreselassie, Tesfayi; Oakland, Mary Jane

    2007-01-01

    This study assessed the validity of the ecSatter Inventory (ecSI) for measuring eating competence (EC), through concurrent administration of the ecSI with validated measures of eating behaviors in on-line and paper-pencil formats. The on-line survey was completed by 370 participants; 462 completed the paper version. Participants included 863 adults, yielding 832 usable surveys from respondents (mean age 36.2 +/- 13.4 years) without eating disorders, who were mostly female, white, educated, overweight, physically active, and food secure. Of those indicating intent to complete the on-line survey, 80.3% did so; 54% of mailed surveys were returned. Outcome measures were eating and food behaviors compared among EC tertiles and between dichotomous EC categories, and the internal consistency of the ecSI; analyses used analysis of variance, independent t tests, chi-square, factor analysis, and logistic regression, with the significance level set at P < .05. Mean ecSI score was 31.1 +/- 7.5. The ecSI included 4 subscales with internal reliability and content validity. Construct validity was supported by specific behavioral profiles for ecSI tertiles and ecSI dichotomized categories. Persons unsatisfied with their weight were 54% less likely to be EC; a unit increase in the food like index was associated with a nearly 3 times greater likelihood of being EC. The ecSatter Inventory is a valid measure of EC and can be used for descriptive and outcome measurements.

  10. Damage Detection Response Characteristics of Open Circuit Resonant (SansEC) Sensors

    NASA Technical Reports Server (NTRS)

    Dudley, Kenneth L.; Szatkowski, George N.; Smith, Laura J.; Koppen, Sandra V.; Ely, Jay J.; Nguyen, Truong X.; Wang, Chuantong; Ticatch, Larry A.; Mielnik, John J.

    2013-01-01

    The capability to assess the current or future state of the health of an aircraft to improve safety, availability, and reliability while reducing maintenance costs has been a continuous goal for decades. Many companies, commercial entities, and academic institutions have become interested in Integrated Vehicle Health Management (IVHM) and a growing effort of research into "smart" vehicle sensing systems has emerged. Methods to detect damage to aircraft materials and structures have historically relied on visual inspection during pre-flight or post-flight operations by flight and ground crews. More quantitative non-destructive investigations with various instruments and sensors have traditionally been performed when the aircraft is out of operational service during major scheduled maintenance. Through the use of reliable sensors coupled with data monitoring, data mining, and data analysis techniques, the health state of a vehicle can be detected in-situ. NASA Langley Research Center (LaRC) is developing a composite aircraft skin damage detection method and system based on open circuit SansEC (Sans Electric Connection) sensor technology. Composite materials are increasingly used in modern aircraft for reducing weight, improving fuel efficiency, and enhancing the overall design, performance, and manufacturability of airborne vehicles. Materials such as fiberglass reinforced composites (FRC) and carbon-fiber-reinforced polymers (CFRP) are being used to great advantage in airframes, wings, engine nacelles, turbine blades, fairings, fuselage structures, empennage structures, control surfaces and aircraft skins. SansEC sensor technology is a new technical framework for designing, powering, and interrogating sensors to detect various types of damage in composite materials. The source cause of the in-service damage (lightning strike, impact damage, material fatigue, etc.) to the aircraft composite is not relevant. The sensor will detect damage independent of the cause. Damage in composite material is generally associated with a localized change in material permittivity and/or conductivity. These changes are sensed using SansEC. The unique electrical signatures (amplitude, frequency, bandwidth, and phase) are used for damage detection and diagnosis. An operational system and method would incorporate a SansEC sensor array on select areas of the aircraft exterior surfaces to form a "Smart skin" sensing surface. In this paper a new method and system for aircraft in-situ damage detection and diagnosis is presented. Experimental test results on seeded fault damage coupons and computational modeling simulation results are presented. NASA LaRC has demonstrated with individual sensors that SansEC sensors can be effectively used for in-situ composite damage detection of delamination, voids, fractures, and rips. Keywords: Damage Detection, Composites, Integrated Vehicle Health Monitoring (IVHM), Aviation Safety, SansEC Sensors

  11. ATLAS@AWS

    NASA Astrophysics Data System (ADS)

    Gehrcke, Jan-Philip; Kluth, Stefan; Stonjek, Stefan

    2010-04-01

    We show how the ATLAS offline software is ported to the Amazon Elastic Compute Cloud (EC2). We prepare an Amazon Machine Image (AMI) on the basis of the standard ATLAS platform, Scientific Linux 4 (SL4). Then an instance of the SL4 AMI is started on EC2 and we install and validate a recent release of the ATLAS offline software distribution kit. The installed software is archived as an image on the Amazon Simple Storage Service (S3) and can be quickly retrieved and connected to new SL4 AMI instances using the Amazon Elastic Block Store (EBS). ATLAS jobs can then configure against the release kit using the ATLAS configuration management tool (cmt) in the standard way. The output of jobs is exported to S3 before the SL4 AMI is terminated. Job status information is transferred to the Amazon SimpleDB service. The whole process of launching instances of our AMI, starting, monitoring and stopping jobs, and retrieving job output from S3 is controlled from a client machine using Python scripts implementing the Amazon EC2/S3 API via the boto library, working together with small scripts embedded in the SL4 AMI. We report our experience with setting up and operating the system using standard ATLAS job transforms.
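
    The paper used the original boto library; a rough modern equivalent of the launch-run-export cycle with boto3 looks like the sketch below. The AMI id, bucket and file names are placeholders, and valid AWS credentials are assumed.

        import boto3

        ec2 = boto3.resource("ec2", region_name="us-east-1")

        # Launch worker instances from a prepared machine image
        instances = ec2.create_instances(
            ImageId="ami-0123456789abcdef0",   # placeholder for the SL-based AMI
            MinCount=1, MaxCount=4,
            InstanceType="m5.large",
        )
        for inst in instances:
            inst.wait_until_running()

        # Export a job's output to S3 before the instances are terminated
        boto3.client("s3").upload_file(
            "job_output.root", "my-results-bucket", "run1/job_output.root")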

  12. CFD Analysis of Thermal Control System Using NX Thermal and Flow

    NASA Technical Reports Server (NTRS)

    Fortier, C. R.; Harris, M. F. (Editor); McConnell, S. (Editor)

    2014-01-01

    The Thermal Control Subsystem (TCS) is a key part of the Advanced Plant Habitat (APH) for the International Space Station (ISS). The purpose of this subsystem is to provide thermal control, mainly cooling, to the other APH subsystems. One of these subsystems, the Environmental Control Subsystem (ECS), controls the temperature and humidity of the growth chamber (GC) air to optimize the growth of plants in the habitat. The TCS provides thermal control to the ECS with three cold plates, which use Thermoelectric Coolers (TECs) to heat or cool water as needed to control the air temperature in the ECS system. In order to optimize the TCS design, pressure drop and heat transfer analyses were needed. The analysis for this system was performed in Siemens NX Thermal/Flow software (Version 8.5). NX Thermal/Flow has the ability to perform 1D or 3D flow solutions. The 1D flow solver can be used to represent simple geometries, such as pipes and tubes. The 1D flow method also has the ability to simulate either fluid only or fluid and wall regions. The 3D flow solver is similar to other Computational Fluid Dynamic (CFD) software. TCS performance was analyzed using both the 1D and 3D solvers. Each method produced different results, which will be evaluated and discussed.

  13. Cloud computing task scheduling strategy based on improved differential evolution algorithm

    NASA Astrophysics Data System (ADS)

    Ge, Junwei; He, Qian; Fang, Yiqiu

    2017-04-01

    In order to optimize cloud computing task scheduling, an improved differential evolution algorithm for cloud computing task scheduling is proposed. First, a cloud computing task scheduling model and its fitness function are established; the improved differential evolution algorithm then optimizes this fitness function, using a generation-dependent dynamic selection strategy together with a dynamic mutation strategy to ensure both global and local search ability. Performance tests were carried out on the CloudSim simulation platform, and the experimental results show that the improved differential evolution algorithm can reduce task execution time and save user cost, achieving optimal scheduling of cloud computing tasks.
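
    For concreteness, a classical DE/rand/1/bin differential evolution loop on a toy task-to-VM assignment objective is sketched below; the improved selection and mutation strategies of the paper are not reproduced, and all parameters are invented.

        import numpy as np

        rng = np.random.default_rng(4)

        def makespan(x):
            # Toy fitness: round each gene to a VM index, return busiest VM's load
            n_vms, cost = 5, np.linspace(1.0, 3.0, x.size)
            vm = np.clip(np.rint(x), 0, n_vms - 1).astype(int)
            return max(cost[vm == v].sum() for v in range(n_vms))

        NP, D, F, CR = 30, 20, 0.6, 0.9      # population, genes (tasks), DE parameters
        pop = rng.uniform(0, 4, (NP, D))
        fit = np.array([makespan(x) for x in pop])

        for gen in range(200):
            for i in range(NP):
                choices = [j for j in range(NP) if j != i]
                a, b, c = pop[rng.choice(choices, 3, replace=False)]
                mutant = a + F * (b - c)                 # DE/rand/1 mutation
                cross = rng.random(D) < CR               # binomial crossover
                cross[rng.integers(D)] = True            # force one gene through
                trial = np.where(cross, mutant, pop[i])
                tf = makespan(trial)
                if tf < fit[i]:                          # greedy selection
                    pop[i], fit[i] = trial, tf
        print("best makespan:", fit.min())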

  14. Attenuation of Hind-Limb Ischemia in Mice with Endothelial-Like Cells Derived from Different Sources of Human Stem Cells

    PubMed Central

    Chan, Yau-Chi; Ng, Joyce H. L.; Au, Ka-Wing; Wong, Lai-Yung; Siu, Chung-Wah; Tse, Hung-Fat

    2013-01-01

    Functional endothelial-like cells (EC) have been successfully derived from different cell sources and potentially used for treatment of cardiovascular diseases; however, their relative therapeutic efficacy remains unclear. We differentiated functional EC from human bone marrow mononuclear cells (BM-EC), human embryonic stem cells (hESC-EC) and human induced pluripotent stem cells (hiPSC-EC), and compared their in-vitro tube formation, migration and cytokine expression profiles, and in-vivo capacity to attenuate hind-limb ischemia in mice. Successful differentiation of BM-EC was only achieved in 1 of 6 patients with severe coronary artery disease. Nevertheless, BM-EC, hESC-EC and hiPSC-EC exhibited typical cobblestone morphology, took up DiI-labeled acetylated low-density lipoprotein, and bound Ulex europaeus lectin. In-vitro functional assays demonstrated that hiPSC-EC and hESC-EC had similar capacity for tube formation and migration as human umbilical cord endothelial cells (HUVEC) and BM-EC (P>0.05). While increased expression of major angiogenic factors including epidermal growth factor, hepatocyte growth factor, vascular endothelial growth factor, placental growth factor and stromal derived factor-1 was observed in all EC cultures during hypoxia compared with normoxia (P<0.05), the magnitude of cytokine up-regulation upon hypoxia was more dramatic in hiPSC-EC and hESC-EC (P<0.05). Compared with medium, transplanting BM-EC (n = 6), HUVEC (n = 6), hESC-EC (n = 8) or hiPSC-EC (n = 8) significantly attenuated severe hind-limb ischemia in mice via enhancement of neovascularization. In conclusion, functional EC can be generated from hESC and hiPSC with similar therapeutic efficacy for attenuation of severe hind-limb ischemia. Differentiation of functional BM-EC was more difficult to achieve in patients with cardiovascular diseases, whereas hESC-EC and hiPSC-EC are readily available in an “off-the-shelf” format for the treatment of tissue ischemia. PMID:23472116

  15. Molecular Signatures of Tissue-Specific Microvascular Endothelial Cell Heterogeneity in Organ Maintenance and Regeneration

    PubMed Central

    Nolan, Daniel J.; Ginsberg, Michael; Israely, Edo; Palikuqi, Brisa; Poulos, Michael G.; James, Daylon; Ding, Bi-Sen; Schachterle, William; Liu, Ying; Rosenwaks, Zev; Butler, Jason M.; Xiang, Jenny; Rafii, Arash; Shido, Koji; Rabbany, Sina Y.; Elemento, Olivier; Rafii, Shahin

    2013-01-01

    Microvascular endothelial cells (ECs) within different tissues are endowed with distinct but as yet unrecognized structural, phenotypic, and functional attributes. We devised EC purification, cultivation, profiling, and transplantation models that establish tissue-specific molecular libraries of ECs devoid of lymphatic ECs or parenchymal cells. These libraries identify attributes that confer ECs with their organotypic features. We show that clusters of transcription factors, angiocrine growth factors, adhesion molecules, and chemokines are expressed in unique combinations by ECs of each organ. Furthermore, ECs respond distinctly in tissue regeneration models, hepatectomy, and myeloablation. To test the data set, we developed a transplantation model that employs generic ECs differentiated from embryonic stem cells. Transplanted generic ECs engraft into regenerating tissues and acquire features of organotypic ECs. Collectively, we demonstrate the utility of informational databases of ECs toward uncovering the extravascular and intrinsic signals that define EC heterogeneity. These factors could be exploited therapeutically to engineer tissue-specific ECs for regeneration. PMID:23871589

  16. Reactions of singly-reduced ethylene carbonate in lithium battery electrolytes: a molecular dynamics simulation study using the ReaxFF.

    PubMed

    Bedrov, Dmitry; Smith, Grant D; van Duin, Adri C T

    2012-03-22

    We have conducted quantum chemistry calculations and gas- and solution-phase reactive molecular dynamics simulation studies of reactions involving the ethylene carbonate (EC) radical anion EC(-) using the reactive force field ReaxFF. Our studies reveal that the substantial barrier for transition from the closed (cyclic) form, denoted c-EC(-), of the radical anion to the linear (open) form, denoted o-EC(-), results in a relatively long lifetime of the c-EC(-), allowing this compound to react with other singly reduced alkyl carbonates. Using ReaxFF, we systematically investigate the fate of both c-EC(-) and o-EC(-) in the gas phase and in EC solution. In the gas phase and in EC solutions with a relatively low concentration of Li(+)/x-EC(-) (where x = o or c), radical termination reactions between radical pairs are observed, forming either dilithium butylene dicarbonate (CH(2)CH(2)OCO(2)Li)(2) (by reacting two Li(+)/o-EC(-)) or an ester-carbonate compound (by reacting Li(+)/o-EC(-) with Li(+)/c-EC(-)). At higher concentrations of Li(+)/x-EC(-) in solution, we observe the formation of diradicals which subsequently lead to the formation of longer alkyl carbonate oligomers through reaction with other radicals or, in some cases, to the formation of (CH(2)OCO(2)Li)(2) through elimination of C(2)H(4). We conclude that the local ionic concentration is important in determining the fate of x-EC(-) and that the reaction of c-EC(-) with o-EC(-) may compete with the formation of various alkyl carbonates from o-EC(-)/o-EC(-) reactions. © 2012 American Chemical Society

  17. Evolutionary dynamics on any population structure

    NASA Astrophysics Data System (ADS)

    Allen, Benjamin; Lippner, Gabor; Chen, Yu-Ting; Fotouhi, Babak; Momeni, Naghmeh; Yau, Shing-Tung; Nowak, Martin A.

    2017-03-01

    Evolution occurs in populations of reproducing individuals. The structure of a population can affect which traits evolve. Understanding evolutionary game dynamics in structured populations remains difficult. Mathematical results are known for special structures in which all individuals have the same number of neighbours. The general case, in which the number of neighbours can vary, has remained open. For arbitrary selection intensity, the problem is in a computational complexity class that suggests there is no efficient algorithm. Whether a simple solution for weak selection exists has remained unanswered. Here we provide a solution for weak selection that applies to any graph or network. Our method relies on calculating the coalescence times of random walks. We evaluate large numbers of diverse population structures for their propensity to favour cooperation. We study how small changes in population structure—graph surgery—affect evolutionary outcomes. We find that cooperation flourishes most in societies that are based on strong pairwise ties.
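
    The coalescence-time computation at the heart of the method reduces to a linear system over pairs of nodes. The sketch below solves it for a small invented graph, under the common convention that one of the two random walkers, chosen with probability 1/2, steps at a time; it illustrates the quantity being computed, not the paper's full weak-selection condition.

        import numpy as np

        A = np.array([[0, 1, 1, 0],          # adjacency of a toy 4-node graph
                      [1, 0, 1, 1],
                      [1, 1, 0, 1],
                      [0, 1, 1, 0]], dtype=float)
        P = A / A.sum(axis=1, keepdims=True) # random-walk step matrix
        n = len(A)

        # tau_ij = 1 + (1/2) sum_l P[i,l] tau_lj + (1/2) sum_l P[j,l] tau_il,
        # with tau_ii = 0: one unknown per unordered pair of distinct nodes.
        pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
        idx = {p: k for k, p in enumerate(pairs)}
        M, rhs = np.eye(len(pairs)), np.ones(len(pairs))
        for (i, j), k in idx.items():
            for l in range(n):
                for stay, move in ((i, j), (j, i)):
                    a, b = sorted((stay, l))
                    if a != b:               # coalesced pairs contribute 0
                        M[k, idx[(a, b)]] -= 0.5 * P[move, l]
        tau = np.linalg.solve(M, rhs)
        for p, t in zip(pairs, tau):
            print("tau", p, "=", round(t, 3))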

  18. Integrating instance selection, instance weighting, and feature weighting for nearest neighbor classifiers by coevolutionary algorithms.

    PubMed

    Derrac, Joaquín; Triguero, Isaac; Garcia, Salvador; Herrera, Francisco

    2012-10-01

    Cooperative coevolution is a successful trend in evolutionary computation that allows one to partition the domain of a given problem, or to integrate several related techniques into one, by the use of evolutionary algorithms. It can be applied to the development of advanced classification methods, which integrate several machine learning techniques into a single proposal. A novel approach integrating instance selection, instance weighting, and feature weighting into the framework of a coevolutionary model is presented in this paper. We compare it with a wide range of evolutionary and nonevolutionary related methods, in order to show the benefits of employing coevolution to apply these techniques simultaneously. The results obtained, contrasted through nonparametric statistical tests, show that our proposal outperforms other methods in the comparison, thus becoming a suitable tool for enhancing the nearest neighbor classifier.

  19. Turbopump Performance Improved by Evolutionary Algorithms

    NASA Technical Reports Server (NTRS)

    Oyama, Akira; Liou, Meng-Sing

    2002-01-01

    The development of design optimization technology for turbomachinery has been initiated using the multiobjective evolutionary algorithm under NASA's Intelligent Synthesis Environment and Revolutionary Aeropropulsion Concepts programs. As an alternative to the traditional gradient-based methods, evolutionary algorithms (EA's) are emergent design-optimization algorithms modeled after the mechanisms found in natural evolution. EA's search from multiple points, instead of moving from a single point. In addition, they require no derivatives or gradients of the objective function, leading to robustness and simplicity in coupling any evaluation codes. Parallel efficiency also becomes very high by using a simple master-slave concept for function evaluations, since such evaluations often consume the most CPU time, such as computational fluid dynamics. Application of EA's to multiobjective design problems is also straightforward because EA's maintain a population of design candidates in parallel. Because of these advantages, EA's are a unique and attractive approach to real-world design optimization problems.

  20. Evolution of cyclohexadienyl dehydratase from an ancestral solute-binding protein.

    PubMed

    Clifton, Ben E; Kaczmarski, Joe A; Carr, Paul D; Gerth, Monica L; Tokuriki, Nobuhiko; Jackson, Colin J

    2018-04-23

    The emergence of enzymes through the neofunctionalization of noncatalytic proteins is ultimately responsible for the extraordinary range of biological catalysts observed in nature. Although the evolution of some enzymes from binding proteins can be inferred by homology, we have a limited understanding of the nature of the biochemical and biophysical adaptations along these evolutionary trajectories and the sequence in which they occurred. Here we reconstructed and characterized evolutionary intermediate states linking an ancestral solute-binding protein to the extant enzyme cyclohexadienyl dehydratase. We show how the intrinsic reactivity of a desolvated general acid was harnessed by a series of mutations radiating from the active site, which optimized enzyme-substrate complementarity and transition-state stabilization and minimized sampling of noncatalytic conformations. Our work reveals the molecular evolutionary processes that underlie the emergence of enzymes de novo, which are notably mirrored by recent examples of computational enzyme design and directed evolution.
